Science.gov

Sample records for high-level template matching

  1. Template Matching Using a Fluid Flow Model

    NASA Astrophysics Data System (ADS)

    Newman, William Curtis

    Template matching is successfully used in machine recognition of isolated spoken words. In these systems a word is broken into frames (20 millisecond time slices) and the spectral characteristics of each frame are found. Thus, each word is represented as a 2-dimensional (2-D) function of spectral characteristic and frame number. An unknown word is recognized by matching its 2-D representation to previously stored example words, or templates, also in this 2-D form. A new model for this matching step will be introduced. The 2-D representations of the template and unknown are used to determine the shape of a volume of viscous fluid. This volume is broken up into many small elements. The unknown is changed into the template by allowing flows between the element boundaries. Finally the match between the template and unknown is determined by calculating a weighted squared sum of the flow values. The model also allows the relative flow resistance between the element boundaries to be changed. This is useful for characterizing the important features of a given template. The flow resistances are changed according to the gradient of a simple performance function. This performance function is evaluated using a set of training samples provided by the user. The model is applied to isolated word and single character recognition tasks. Results indicate the applications where this model works best.

  2. Photon signature analysis using template matching

    NASA Astrophysics Data System (ADS)

    Bradley, D. A.; Hashim, S.; Saripan, M. I.; Wells, K.; Dunn, W. L.

    2011-10-01

    We describe an approach to detect improvised explosive devices (IEDs) by using a template matching procedure. This approach relies on the signature due to backstreaming γ photons from various targets. In this work we have simulated cylindrical targets of aluminum, iron, copper, water and ammonium nitrate (nitrogen-rich fertilizer). We simulate 3.5 MeV source photons distributed on a plane inside a shielded area using Monte Carlo N-Particle (MCNP™) code version 5 (V5). The 3.5 MeV source gamma rays yield 511 keV peaks due to pair production and scattered gamma rays. In this work, we simulate capture of those photons that backstream, after impinging on the target element, toward a NaI detector. The captured backstreamed photons are expected to produce a unique spectrum that will become part of a simple signal processing recognition system based on the template matching method. Different elements were simulated using different sets of random numbers in the Monte Carlo simulation. To date, the sum of absolute differences (SAD) method has been used to match the template. In the examples investigated, template matching was found to detect all elements correctly.
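    The matching step described above reduces, at its core, to comparing a measured backstreaming spectrum against a library of element templates and keeping the smallest sum of absolute differences. A minimal sketch of that SAD comparison follows; the 5-bin spectra, labels, and function names are illustrative placeholders, not data or code from the paper.

```python
import numpy as np

def sad(spectrum, template):
    """Sum of absolute differences between two equal-length 1-D spectra."""
    return float(np.abs(np.asarray(spectrum, float) - np.asarray(template, float)).sum())

def classify_by_sad(measured, templates):
    """Return the template label with the smallest SAD to the measured spectrum."""
    scores = {label: sad(measured, tpl) for label, tpl in templates.items()}
    return min(scores, key=scores.get), scores

# Toy 5-bin "spectra" (placeholders, not MCNP output).
templates = {
    "aluminum":         [0.10, 0.40, 0.30, 0.15, 0.05],
    "iron":             [0.20, 0.30, 0.30, 0.15, 0.05],
    "ammonium_nitrate": [0.05, 0.25, 0.40, 0.20, 0.10],
}
measured = [0.06, 0.26, 0.39, 0.19, 0.10]
print(classify_by_sad(measured, templates))  # -> ('ammonium_nitrate', {...})
```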

  3. Automatic target detection using binary template matching

    NASA Astrophysics Data System (ADS)

    Jun, Dong-San; Sun, Sun-Gu; Park, HyunWook

    2005-03-01

    This paper presents a new automatic target detection (ATD) algorithm to detect targets such as battle tanks and armored personnel carriers in ground-to-ground scenarios. Whereas most ATD algorithms were developed for forward-looking infrared (FLIR) images, we have developed an ATD algorithm for charge-coupled device (CCD) images, which have superior quality to FLIR images in daylight. The proposed algorithm uses fast binary template matching with an adaptive binarization, which is robust to various light conditions in CCD images and saves computation time. Experimental results show that the proposed method has good detection performance.
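    As a rough illustration of the binary matching idea (adaptive binarization followed by matching on the binarized images), here is a simplified sketch. The block-mean binarization rule, the agreement-ratio score, and all names are assumptions made for illustration; the paper's actual binarization and speed-ups are not reproduced.

```python
import numpy as np

def adaptive_binarize(img, block=16, offset=0.0):
    """Binarize by comparing each pixel with the mean of its local block (a simple adaptive rule)."""
    out = np.zeros(img.shape, dtype=np.uint8)
    h, w = img.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = img[y:y + block, x:x + block]
            out[y:y + block, x:x + block] = (patch > patch.mean() + offset).astype(np.uint8)
    return out

def binary_match(scene_bin, tpl_bin):
    """Slide the binary template over the binary scene; score = fraction of agreeing pixels."""
    th, tw = tpl_bin.shape
    best_score, best_pos = -1.0, None
    for y in range(scene_bin.shape[0] - th + 1):
        for x in range(scene_bin.shape[1] - tw + 1):
            score = np.mean(scene_bin[y:y + th, x:x + tw] == tpl_bin)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

# Demo: cut the binary template out of the binarized scene itself.
rng = np.random.default_rng(1)
scene_bin = adaptive_binarize(rng.random((64, 64)))
tpl_bin = scene_bin[20:36, 30:46].copy()
print(binary_match(scene_bin, tpl_bin))      # expected: ((20, 30), 1.0)
```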

  4. A Novel BA Complex Network Model on Color Template Matching

    PubMed Central

    Han, Risheng; Yue, Guangxue; Ding, Hui

    2014-01-01

    A novel BA complex network model of color space is proposed based on the two fundamental rules of the BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing the evolving process of the template's color distribution. The template's BA complex network model can then be used to select important color pixels, which have much larger effects than other color pixels in the matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD-based matching and SAD-based matching. Experiments show that the performance of color template matching can be improved with the proposed algorithm. To the best of our knowledge, this is the first study of how to model the color space of images using a proper complex network model and apply that model to template matching. PMID:25243235

  5. Robust structural identification via polyhedral template matching

    NASA Astrophysics Data System (ADS)

    Mahler Larsen, Peter; Schmidt, Søren; Schiøtz, Jakob

    2016-06-01

    Successful scientific applications of large-scale molecular dynamics often rely on automated methods for identifying the local crystalline structure of condensed phases. Many existing methods for structural identification, such as common neighbour analysis, rely on interatomic distances (or thresholds thereof) to classify atomic structure. As a consequence they are sensitive to strain and thermal displacements, and preprocessing such as quenching or temporal averaging of the atomic positions is necessary to provide reliable identifications. We propose a new method, polyhedral template matching (PTM), which classifies structures according to the topology of the local atomic environment, without any ambiguity in the classification, and with greater reliability than e.g. common neighbour analysis in the presence of thermal fluctuations. We demonstrate that the method can reliably be used to identify structures even in simulations near the melting point, and that it can identify the most common ordered alloy structures as well. In addition, the method makes it easy to identify the local lattice orientation in polycrystalline samples, and to calculate the local strain tensor. An implementation is made available under a Free and Open Source Software license.

  6. Template Matching Approach to Signal Prediction

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; Kulikov, Igor

    2010-01-01

    A new approach to signal prediction and prognostic assessment of spacecraft health resolves an inherent difficulty in fusing sensor data with simulated data. This technique builds upon previous work that demonstrated the importance of physics-based transient models to accurate prediction of signal dynamics and system performance. While models can greatly improve predictive accuracy, they are difficult to apply in general because of variations in model type, accuracy, or intended purpose. However, virtually any flight project will have at least some modeling capability at its disposal, whether a full-blown simulation, partial physics models, dynamic look-up tables, a brassboard analogue system, or simple hand-driven calculation by a team of experts. Many models can be used to develop a predict, or an estimate of the next day's or next cycle's behavior, which is typically used for planning purposes. The fidelity of a predict varies from one project to another, depending on the complexity of the simulation (i.e. linearized or full differential equations) and the level of detail in anticipated system operation, but typically any predict cannot be adapted to changing conditions or adjusted spacecraft command execution. Applying a predict blindly, without adapting the predict to current conditions, produces mixed results at best, primarily due to mismatches between assumed execution of spacecraft activities and actual times of execution. This results in the predict becoming useless during periods of complicated behavior, exactly when the predict would be most valuable. Each spacecraft operation tends to show up as a transient in the data, and if the transients are misaligned, using the predict can actually harm forecasting performance. To address this problem, the approach here expresses the predict in terms of a baseline function superposed with one or more transient functions. These transients serve as signal templates, which can be relocated in time and space against

  7. Particle recognition in microfluidic applications using a template matching algorithm

    NASA Astrophysics Data System (ADS)

    Girault, Mathias; Odaka, Masao; Kim, Hyonchol; Matsuura, Kenji; Terazono, Hideyuki; Yasuda, Kenji

    2016-06-01

    We herein examined the ability of a template matching algorithm to recognize particles with diameters ranging from 1 to 20 µm in a microfluidic channel. The algorithm consisted of measurements of the distance between the templates and the images captured with a high-speed camera in order to search for the presence of the desired particle. The results obtained indicated that the effects of blur and diffraction rings observed around the particle are important phenomena that limit the recognition of a target. Owing to the effects of diffraction rings, the distance between a template and an image is not exclusively linked to the position of the focus plane; it is also linked to the size of the particle being searched for. By using a set of three templates captured at different Z focuses and an 800× magnification, the template matching algorithm has the ability to recognize beads ranging in diameter from 1.7 to 20 µm with a resolution between 0.3 and 1 µm.

  8. Dynamic template-matching-based processing for handheld landmine detector

    NASA Astrophysics Data System (ADS)

    Ho, K. C.; Gader, Paul D.

    2003-09-01

    This paper investigates the use of landmine templates in the GPR data to improve the detection accuracy for a hand-held mine detection unit. The proposed algorithm applies to the discrimination operating mode after the initial detection from the search mode. The proposed template matching-based algorithm extracts the mine templates from the data acquired during the first few sweeps, and correlates the templates with the data at subsequent sweeps to enhance the detection of landmines. The proposed technique does not have a time lag in producing detection values and a detection value is generated at each sample location. Experimental results show that the proposed template matching-based detector is able to improve detection, especially for low-metal anti-personnel mines. Based on the experiment performed over the data set collected at a test site, at 95% Pd, the proposed algorithm reduces the probability of false alarms by 66% for the low-metal anti-personnel mines and 30% for the low-metal anti-tank mines.

  9. Active fibers: matching deformable tract templates to diffusion tensor images.

    PubMed

    Eckstein, Ilya; Shattuck, David W; Stein, Jason L; McMahon, Katie L; de Zubicaray, Greig; Wright, Margaret J; Thompson, Paul M; Toga, Arthur W

    2009-08-01

    Reliable quantitative analysis of white matter connectivity in the brain is an open problem in neuroimaging, with common solutions requiring tools for fiber tracking, tractography segmentation and estimation of intersubject correspondence. This paper proposes a novel, template matching approach to the problem. In the proposed method, a deformable fiber-bundle model is aligned directly with the subject tensor field, skipping the fiber tracking step. Furthermore, the use of a common template eliminates the need for tractography segmentation and defines intersubject shape correspondence. The method is validated using phantom DTI data and applications are presented, including automatic fiber-bundle reconstruction and tract-based morphometry.

  10. Uniform smooth filtering approach for fast template matching

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2016-05-01

    Sum of squared differences (SSD) and normalized cross correlation (NCC) are two different template matching techniques, and their fast implementations have been investigated independently. The SSD approach is known to be simple and fast; however, it is sensitive to image intensity changes, which leads to low performance. On the other hand, the NCC method is invariant to intensity changes and has high performance, but its computational cost is high. In this paper, we derive an equation that connects NCC and SSD. From this equation, we propose SSD-based partial elimination for the fast implementation of NCC template matching. This new technique combines NCC's high performance with SSD's low computational cost. Then we propose a uniform smoothing approach that further reduces the computational cost of NCC. Experiments show that the proposed method is significantly faster than the techniques reported in the literature.
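    One standard identity connecting the two measures: for zero-mean, unit-norm patch vectors, SSD = 2(1 - NCC), so a running SSD can be abandoned as soon as it exceeds the bound implied by the best NCC found so far. The sketch below illustrates that kind of SSD-based partial elimination; it relies on this textbook identity and illustrative names, and is not the paper's own derivation or its uniform smoothing step.

```python
import numpy as np

def normalize(patch):
    """Zero-mean, unit-norm version of a patch (flattened)."""
    v = patch.astype(float).ravel()
    v -= v.mean()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def ncc_with_partial_elimination(scene, tpl):
    """Exhaustive NCC search; a candidate is abandoned early once its partial SSD
    (on normalized vectors) exceeds the bound implied by the best NCC so far."""
    th, tw = tpl.shape
    t = normalize(tpl)
    best_ncc, best_pos = -1.0, None
    for y in range(scene.shape[0] - th + 1):
        for x in range(scene.shape[1] - tw + 1):
            f = normalize(scene[y:y + th, x:x + tw])
            bound = 2.0 * (1.0 - best_ncc)        # SSD value equivalent to best_ncc
            ssd, rejected = 0.0, False
            for k in range(0, f.size, 64):        # accumulate SSD in chunks
                d = f[k:k + 64] - t[k:k + 64]
                ssd += float(d @ d)
                if ssd > bound:                   # cannot beat the current best; skip candidate
                    rejected = True
                    break
            if not rejected:
                ncc = 1.0 - ssd / 2.0             # identity: NCC = 1 - SSD/2 on normalized vectors
                if ncc > best_ncc:
                    best_ncc, best_pos = ncc, (y, x)
    return best_pos, best_ncc
```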

  11. Template match using local feature with view invariance

    NASA Astrophysics Data System (ADS)

    Lu, Cen; Zhou, Gang

    2013-10-01

    Matching a template image within a target image is a fundamental task in the field of computer vision. To address the deficiencies of traditional image matching methods and their inaccuracy in scenes with rotation, illumination, and viewpoint changes, a novel matching algorithm using local features is proposed in this paper. Local histograms of the edge pixels (LHoE) are extracted as an invariant feature to resist viewpoint and brightness changes. The merits of the LHoE are that edge points are little affected by viewpoint changes and that the LHoE resists not only illumination variation but also noise contamination. Because matching is executed only on the edge points, the computational burden is greatly reduced. Additionally, our approach is conceptually simple, easy to implement, and does not need a training phase. Viewpoint change can be considered as a combination of rotation, illumination, and shear transformations. Experimental results on simulated and real data demonstrate that the proposed approach is superior to NCC (normalized cross-correlation) and histogram-based methods under viewpoint changes.

  12. A template-matching pandemonium recognizes unconstrained handwritten characters with high accuracy.

    PubMed

    Larsen, A; Bundesen, C

    1996-03-01

    Psychological data suggest that internal representations such as mental images can be used as templates in visual pattern recognition. But computational studies suggest that traditional template matching is insufficient for high-accuracy recognition of real-life patterns such as handwritten characters. Here we explore a model for visual pattern recognition that combines a template-matching and a feature-analysis approach: Character classification is based on weighted evidence from a number of analyzers (demons), each of which computes the degree of match between the input character and a stored template (a copy of a previously presented character). The template-matching pandemonium was trained to recognize totally unconstrained handwritten digits. With a mean of 37 templates per type of digit, the system has attained a recognition rate of 95.3%, which falls short of human performance by only 2%-3%.
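    A toy version of the weighted-evidence scheme can be written in a few lines: each "demon" holds a stored template, its class label, and a weight, and the class with the largest weighted sum of match values wins. The sketch below uses NCC as the degree-of-match measure and hypothetical names; the published model's exact match function and weight training are not reproduced.

```python
import numpy as np

def match(img, tpl):
    """Degree of match between an input character and a stored template (NCC here)."""
    a = img.astype(float) - img.mean()
    b = tpl.astype(float) - tpl.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def classify(img, demons):
    """demons: list of (template, class_label, weight) tuples. The class with the
    largest weighted sum of match values is returned."""
    scores = {}
    for tpl, label, weight in demons:
        scores[label] = scores.get(label, 0.0) + weight * match(img, tpl)
    return max(scores, key=scores.get), scores
```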

  13. Template matching method for the analysis of interstellar cloud structure

    NASA Astrophysics Data System (ADS)

    Juvela, M.

    2016-09-01

    Context. The structure of interstellar medium can be characterised at large scales in terms of its global statistics (e.g. power spectra) and at small scales by the properties of individual cores. Interest has been increasing in structures at intermediate scales, resulting in a number of methods being developed for the analysis of filamentary structures. Aims: We describe the application of the generic template-matching (TM) method to the analysis of maps. Our aim is to show that it provides a fast and still relatively robust way to identify elongated structures or other image features. Methods: We present the implementation of a TM algorithm for map analysis. The results are compared against rolling Hough transform (RHT), one of the methods previously used to identify filamentary structures. We illustrate the method by applying it to Herschel surface brightness data. Results: The performance of the TM method is found to be comparable to that of RHT but TM appears to be more robust regarding the input parameters, for example, those related to the selected spatial scales. Small modifications of TM enable one to target structures at different size and intensity levels. In addition to elongated features, we demonstrate the possibility of using TM to also identify other types of structures. Conclusions: The TM method is a viable tool for data quality control, exploratory data analysis, and even quantitative analysis of structures in image data.

  14. Robust template matching for affine resistant image watermarks.

    PubMed

    Pereira, S; Pun, T

    2000-01-01

    Digital watermarks have been proposed as a method for discouraging illicit copying and distribution of copyrighted material. This paper describes a method for the secure and robust copyright protection of digital images. We present an approach for embedding a digital watermark into an image using the Fourier transform. To this watermark is added a template in the Fourier transform domain to render the method robust against general linear transformations. We detail a new algorithm based on polar maps for the accurate and efficient recovery of the template in an image which has undergone a general affine transformation. We also present results which demonstrate the robustness of the method against some common image processing operations such as compression, rotation, scaling, and aspect ratio changes. PMID:18255481

  15. Part-template matching-based target detection and identification in UAV videos

    NASA Astrophysics Data System (ADS)

    Kim, Hyuncheol; Im, Jaehyun; Kim, Taekyung; Bae, Jongsue; Lee, Sanghoon; Paik, Joonki

    2012-06-01

    Detecting and identifying targets in aerial images has been a challenging problem due to various types of image distortion factors, such as motion of a sensing device, weather variation, scale changes, and dynamic viewpoint. For accurate, robust recognition of objects in unmanned aerial vehicle (UAV) videos, we present a novel target detection and identification algorithm using part-template matching. The proposed method for target detection partitions the target into part-templates by an efficient extraction method based on target part regions. We also propose distribution distance measurement-based target identification using the target part-templates.

  16. A novel artificial bee colony algorithm based on internal-feedback strategy for image template matching.

    PubMed

    Li, Bai; Gong, Li-Gang; Li, Ya

    2014-01-01

    Image template matching refers to the technique of locating a given reference image over a source image such that they are the most similar. It is a fundamental mission in the field of visual target recognition. In general, there are two critical aspects of a template matching scheme. One is similarity measurement and the other is best-match location search. In this work, we choose the well-known normalized cross correlation model as a similarity criterion. The searching procedure for the best-match location is carried out through an internal-feedback artificial bee colony (IF-ABC) algorithm. The IF-ABC algorithm is highlighted by its effort to fight against premature convergence. This purpose is achieved through discarding the conventional roulette selection procedure in the ABC algorithm so as to provide each employed bee an equal chance to be followed by the onlooker bees in the local search phase. Besides that, we also suggest efficiently utilizing the internal convergence states as feedback guidance for searching intensity in the subsequent cycles of iteration. We have investigated four ideal template matching cases as well as four actual cases using different searching algorithms. Our simulation results show that the IF-ABC algorithm is more effective and robust for this template matching mission than the conventional ABC and two state-of-the-art modified ABC algorithms.
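    The similarity criterion itself is just NCC evaluated at candidate template positions; the paper's contribution lies in how IF-ABC searches that space. The sketch below shows the NCC objective with a plain random-candidate search standing in for the bee colony; the search routine and all names are illustrative assumptions and are not the IF-ABC algorithm.

```python
import numpy as np

def ncc(window, tpl):
    """Normalized cross correlation between a scene window and the reference template."""
    w = window.astype(float) - window.mean()
    t = tpl.astype(float) - tpl.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

def random_search_match(scene, tpl, n_candidates=2000, rng=None):
    """Stand-in for the IF-ABC search: sample candidate (y, x) offsets at random and
    keep the one maximizing NCC. Illustrates the objective only, not the colony logic."""
    if rng is None:
        rng = np.random.default_rng(0)
    th, tw = tpl.shape
    best_score, best_pos = -1.0, None
    ys = rng.integers(0, scene.shape[0] - th + 1, n_candidates)
    xs = rng.integers(0, scene.shape[1] - tw + 1, n_candidates)
    for y, x in zip(ys, xs):
        s = ncc(scene[y:y + th, x:x + tw], tpl)
        if s > best_score:
            best_score, best_pos = s, (int(y), int(x))
    return best_pos, best_score
```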

  17. A TSR Visual Servoing System Based on a Novel Dynamic Template Matching Method.

    PubMed

    Cai, Jia; Huang, Panfeng; Zhang, Bin; Wang, Dongke

    2015-01-01

    The so-called Tethered Space Robot (TSR) is a novel active space debris removal system. To solve its problem of non-cooperative target recognition during short-distance rendezvous events, this paper presents a framework for a real-time visual servoing system using non-calibrated monocular CMOS (Complementary Metal Oxide Semiconductor). When a small template is used for matching with a large scene, it always leads to mismatches, so a novel template matching algorithm to solve the problem is presented. Firstly, the novel matching algorithm uses a hollow annulus structure according to a FAST (Features from Accelerated Segment) algorithm and makes the method rotation-invariant. Furthermore, the accumulative deviation can be decreased by the hollow structure. The matching function is composed of grey and gradient differences between template and object image, which helps reduce the effects of illumination and noise. Then, a dynamic template update strategy is designed to avoid tracking failures brought about by wrong matching or occlusion. Finally, the system synthesizes the least square integrated predictor, realizing tracking online in complex circumstances. The results of ground experiments show that the proposed algorithm decreases the need for sophisticated computation and improves matching accuracy. PMID:26703609

  20. Optimizing Multi-Station Template Matching to Identify and Characterize Induced Seismicity in Ohio

    NASA Astrophysics Data System (ADS)

    Brudzinski, M. R.; Skoumal, R.; Currie, B. S.

    2014-12-01

    As oil and gas well completions utilizing multi-stage hydraulic fracturing have become more commonplace, the potential for seismicity induced by the deep disposal of frac-related flowback waters and the hydraulic fracturing process itself has become increasingly important. While it is rare for these processes to induce felt seismicity, the recent increase in the number of deep injection wells and volumes injected has been suspected to have contributed to a substantial increase of events ≥ M 3 in the continental U.S. over the past decade. Earthquake template matching using multi-station waveform cross-correlation is an adept tool for investigating potentially induced sequences due to its proficiency at identifying similar/repeating seismic events. We have sought to refine this approach by investigating a variety of seismic sequences and determining the optimal parameters (station combinations, template lengths and offsets, filter frequencies, data access method, etc.) for identifying induced seismicity. When applied to a sequence near a wastewater injection well in Youngstown, Ohio, our optimized template matching routine yielded 566 events while other template matching studies found ~100-200 events. We also identified 77 events on 4-12 March 2014 that are temporally and spatially correlated with active hydraulic fracturing in Poland Township, Ohio. We find similar improvement in characterizing sequences in Washington and Harrison Counties, which appear to be related to wastewater injection and hydraulic fracturing, respectively. In the Youngstown and Poland Township cases, focal mechanisms and double difference relocation using the cross-correlation matrix finds left-lateral faults striking roughly east-west near the top of the basement. We have also used template matching to determine that isolated earthquakes near several other wastewater injection wells are unlikely to be induced based on a lack of similar/repeating sequences. Optimized template matching utilizes
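    In outline, this kind of detector correlates a known event waveform against continuous data at several stations, stacks the per-station correlation functions, and declares a detection where the stack exceeds a noise-based threshold. The sketch below follows that outline with a simple MAD-based threshold; the parameter values and function names are illustrative and do not reproduce the authors' optimized settings.

```python
import numpy as np

def sliding_ncc(trace, template):
    """Normalized cross-correlation of a short template against a longer continuous trace."""
    n = len(template)
    t = template - template.mean()
    tnorm = np.linalg.norm(t)
    out = np.zeros(len(trace) - n + 1)
    for i in range(len(out)):
        w = trace[i:i + n] - trace[i:i + n].mean()
        wnorm = np.linalg.norm(w)
        out[i] = (w @ t) / (wnorm * tnorm) if wnorm > 0 and tnorm > 0 else 0.0
    return out

def detect(traces, templates, nthresh=8.0):
    """Stack per-station correlation functions and flag samples where the stack
    exceeds nthresh times its median absolute deviation (MAD)."""
    ccs = [sliding_ncc(np.asarray(tr, float), np.asarray(tp, float))
           for tr, tp in zip(traces, templates)]
    m = min(len(c) for c in ccs)
    stack = np.mean([c[:m] for c in ccs], axis=0)
    mad = np.median(np.abs(stack - np.median(stack))) + 1e-12
    return np.where(stack > nthresh * mad)[0], stack
```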

  1. Template Matching Method Based on Visual Feature Constraint and Structure Constraint

    NASA Astrophysics Data System (ADS)

    Li, Zhu; Tomotsune, Kojiro; Tomioka, Yoichi; Kitazawa, Hitoshi

    Template matching for image sequences captured with a moving camera is very important for several applications such as Robot Vision, SLAM, ITS, and video surveillance systems. However, it is difficult to realize accurate template matching using only visual feature information such as HSV histograms, edge histograms, HOG histograms, and SIFT features, because it is affected by several phenomena such as illumination change, viewpoint change, size change, and noise. In order to realize robust tracking, structure information such as the relative position of each part of the object should be considered. In this paper, we propose a method that considers both visual feature information and structure information. Experiments show that the proposed method realizes robust tracking and determines the relationships between object parts in the scene and those in the template.

  2. Performance of peaky template matching under additive white Gaussian noise and uniform quantization

    NASA Astrophysics Data System (ADS)

    Horvath, Matthew S.; Rigling, Brian D.

    2015-05-01

    Peaky template matching (PTM) is a special case of a general algorithm known as multinomial pattern matching originally developed for automatic target recognition of synthetic aperture radar data. The algorithm is a model-based approach that first quantizes pixel values into Nq = 2 discrete values yielding generative Beta-Bernoulli models as class-conditional templates. Here, we consider the case of classification of target chips in AWGN and develop approximations to image-to-template classification performance as a function of the noise power. We focus specifically on the case of a "uniform quantization" scheme, where a fixed number of the largest pixels are quantized high as opposed to using a fixed threshold. This quantization method reduces sensitivity to the scaling of pixel intensities and quantization in general reduces sensitivity to various nuisance parameters difficult to account for a priori. Our performance expressions are verified using forward-looking infrared imagery from the Army Research Laboratory Comanche dataset.
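    The "uniform quantization" rule itself is simple: rather than thresholding at a fixed level, the k largest pixels of a chip are set high and the rest low. A minimal sketch, with illustrative names:

```python
import numpy as np

def uniform_quantize(chip, k):
    """Quantize an image chip to Nq = 2 levels by setting its k largest pixels to 1
    and all remaining pixels to 0 (a fixed count rather than a fixed threshold)."""
    flat = chip.ravel()
    top = np.argpartition(flat, -k)[-k:]     # indices of the k largest pixel values
    out = np.zeros(flat.shape, dtype=np.uint8)
    out[top] = 1
    return out.reshape(chip.shape)

# Example: keep the 50 brightest pixels of a random 16x16 chip.
chip = np.random.default_rng(0).random((16, 16))
print(uniform_quantize(chip, k=50).sum())    # -> 50
```

    Because the count is fixed, rescaling all pixel intensities by a constant leaves the quantized chip unchanged, which is the insensitivity to intensity scaling the abstract refers to.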

  3. High-speed template matching with point correlation in image pyramids

    NASA Astrophysics Data System (ADS)

    Penz, Harald; Bajla, Ivan; Mayer, Konrad; Krattenthaler, Werner

    1999-09-01

    Matching of a reference template with an image is a computationally expensive task. Particularly in fast real-time applications, large images and search ranges lead to serious implementation problems. Therefore, a reduction of the template size, achieved by the selection of an appropriate subtemplate used for point correlation (subtemplate matching), may significantly decrease computational cost. In this paper a modified algorithm for subtemplate point selection is proposed and explored. With the additional use of image pyramids, we can reduce the computational costs even further. The algorithm starts with a coarse search grid in the top level of the image pyramid generated for the full intended resolution. The procedure continues until the lowest level of the pyramid, the original image, is reached. The computational costs of this part of the algorithm satisfy the requirement for on-line processing. The preparation of the subtemplate for the point correlation is carried out in off-line mode, i.e., there is no rigorous limit on computational costs. The technique that applies point correlation to image template matching within the image pyramid concept is proposed and the results obtained are discussed. It is especially useful for fast real-time system implementation when a large number of template matchings are needed in the same image.
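    The coarse-to-fine idea can be sketched compactly: build a pyramid by repeated downsampling, search the whole coarsest level, then refine the best position within a small window at each finer level. The sketch below uses full-template SSD at every level as a simplified stand-in for the paper's subtemplate point correlation; the block-averaging pyramid and all names are assumptions, and the scene is assumed to be comfortably larger than the template.

```python
import numpy as np

def downsample(img):
    """Halve resolution by 2x2 block averaging (one pyramid level)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def ssd_search(scene, tpl, y0, y1, x0, x1):
    """Exhaustive SSD search restricted to a rectangular candidate region."""
    th, tw = tpl.shape
    best, pos = np.inf, None
    for y in range(max(y0, 0), min(y1, scene.shape[0] - th) + 1):
        for x in range(max(x0, 0), min(x1, scene.shape[1] - tw) + 1):
            d = scene[y:y + th, x:x + tw] - tpl
            s = float((d * d).sum())
            if s < best:
                best, pos = s, (y, x)
    return pos

def pyramid_match(scene, tpl, levels=3, radius=2):
    """Coarse-to-fine matching: search the whole top level, then refine the position
    within a small window at each finer level."""
    scenes, tpls = [scene.astype(float)], [tpl.astype(float)]
    for _ in range(levels - 1):
        scenes.append(downsample(scenes[-1]))
        tpls.append(downsample(tpls[-1]))
    y, x = ssd_search(scenes[-1], tpls[-1], 0, scenes[-1].shape[0], 0, scenes[-1].shape[1])
    for lvl in range(levels - 2, -1, -1):
        y, x = 2 * y, 2 * x
        y, x = ssd_search(scenes[lvl], tpls[lvl], y - radius, y + radius, x - radius, x + radius)
    return y, x
```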

  4. Sparse representation based on local time-frequency template matching for bearing transient fault feature extraction

    NASA Astrophysics Data System (ADS)

    He, Qingbo; Ding, Xiaoxi

    2016-05-01

    The transients caused by the localized fault are important measurement information for bearing fault diagnosis. Thus it is crucial to extract the transients from the bearing vibration or acoustic signals that are always corrupted by a large amount of background noise. In this paper, an iterative transient feature extraction approach is proposed based on time-frequency (TF) domain sparse representation. The approach is realized by presenting a new method, called local TF template matching. In this method, the TF atoms are constructed based on the TF distribution (TFD) of the Morlet wavelet bases and local TF templates are formulated from the TF atoms for the matching process. The instantaneous frequency (IF) ridge calculated from the TFD of an analyzed signal provides the frequency parameter values for the TF atoms as well as an effective template matching path on the TF plane. In each iteration, local TF templates are employed to do correlation with the TFD of the analyzed signal along the IF ridge tube for identifying the optimum parameters of transient wavelet model. With this iterative procedure, transients can be extracted in the TF domain from measured signals one by one. The final signal can be synthesized by combining the extracted TF atoms and the phase of the raw signal. The local TF template matching builds an effective TF matching-based sparse representation approach with the merit of satisfying the native pulse waveform structure of transients. The effectiveness of the proposed method is verified by practical defective bearing signals. Comparison results also show that the proposed method is superior to traditional methods in transient feature extraction.

  5. Vehicle extraction from high-resolution satellite image using template matching

    NASA Astrophysics Data System (ADS)

    Natt, Dehchaiwong; Cao, Xiaoguang

    2015-12-01

    Vehicle detection from satellite images is a complicated and cumbersome process. Although high-resolution satellite images are now available, a vehicle appears as only a small point that is difficult to separate from the background, and the image detail is often insufficient to identify such small objects. In this research, vehicle extraction techniques were applied to image data from Pléiades, a satellite with a high spatial resolution of 0.40 m. The objective of this research is to study and develop a tool for extracting data from satellite images and organizing the results as geospatial information. The approach is based on template matching, developed in Matlab using the sum of absolute differences method combined with a neural network technique, to evaluate the match between car template images and candidate car images extracted from the satellite scenes. The result obtained from the comparison with template data shows that the data extraction accuracy is greater than 90%, and the extracted data can be imported into a geospatial information database. Moreover, the data can be displayed in geospatial information software and can be searched by quantity condition and by satellite image position.

  6. Stable and Transportable Seismic Yield Estimation from Full Envelope Template Matching

    NASA Astrophysics Data System (ADS)

    Yoo, S. H.; Mayeda, K. M.

    2015-12-01

    We have developed a transportable, empirically based, multi-frequency local template approach to estimate surface explosion yield from seismic stations at distances of ~5 to 50 km, with yield estimates within 50% of the design yield at 95% confidence using as few as 4 stations for events ranging from ~50 kg to 50 tons. Unlike the regional coda methodology developed by Mayeda et al. (2003) and the more recent synthetic template approach of Pasyanos et al. (2012), near-local coda is short in duration and not well modeled by the simple Aki-style (e.g., Aki 1969) exponential single-scattering formulation in a full-space. With only short durations and high-frequency waveforms at our disposal, our empirical full envelope template matching approach provides nearly an exact match to these complicated waveforms in the near-field. Moreover, as these templates capture the nuances of the initial P-wave, the S-wave, and their scattered counterparts, our method is very robust and provides roughly a factor of 3 times less yield measurement scatter than peak P-wave and integrated first P-wave estimates (e.g., Ford et al., 2014) and also utilizes a much larger amplitude signal. Time permitting, we will also consider near-regional recordings of surface explosions, such as Sayarim in Israel.

  7. Distinguishing induced seismicity from natural seismicity in Ohio: Demonstrating the utility of waveform template matching

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2015-09-01

    This study investigated the utility of multistation waveform cross correlation to help discern induced seismicity. Template matching was applied to all Ohio earthquakes cataloged since the arrival of nearby EarthScope TA stations in late 2010. Earthquakes that were within 5 km of fluid injection activities in regions that lacked previously documented seismicity were found to be swarmy. Moreover, the larger number of events produced by template matching for these swarmy sequences made it easier to establish more detailed temporal and spatial relationships between the seismicity and fluid injection activities, which is typically required for an earthquake to be considered induced. Study results detected three previously documented induced sequences (Youngstown, Poland Township, and Harrison County) and provided evidence that suggests two additional cases of induced seismicity (Belmont/Guernsey County and Washington County). Evidence for these cases suggested that unusual swarm-like behaviors in regions that lack previously documented seismicity can be used to help distinguish induced seismicity, complementing the traditional identification of an anthropogenic source spatially and temporally correlated with the seismicity. In support of this finding, we identified 17 additional cataloged earthquakes in regions of previously documented seismicity and away from disposal wells or hydraulic fracturing that returned very few template matches. The lack of swarminess helps to indicate that these events are most likely naturally occurring.

  8. Volume change determination of metastatic lung tumors in CT images using 3-D template matching

    NASA Astrophysics Data System (ADS)

    Ambrosini, Robert D.; Wang, Peng; O'Dell, Walter G.

    2009-02-01

    The ability of a clinician to properly detect changes in the size of lung nodules over time is a vital element to both the diagnosis of malignant growths and the monitoring of the response of cancerous lesions to therapy. We have developed a novel metastasis sizing algorithm based on 3-D template matching with spherical tumor appearance models that were created to match the expected geometry of the tumors of interest while accounting for potential spatial offsets of nodules in the slice thickness direction. The spherical template that best fits the overall volume of each lung metastasis was determined through the optimization of the 3-D normalized cross-correlation coefficients (NCCC) calculated between the templates and the nodules. A total of 17 different lung metastases were extracted manually from real patient CT datasets and reconstructed in 3-D using spherical harmonics equations to generate simulated nodules for testing our algorithm. Each metastasis 3-D shape was then subjected to 10%, 25%, 50%, 75% and 90% scaling of its volume to allow for 5 possible volume change combinations relative to the original size per reconstructed nodule and inserted back into CT datasets with appropriate blurring and noise addition. When plotted against the true volume change, the nodule volume changes calculated by our algorithm for these 85 data points exhibited a high degree of accuracy (slope = 0.9817, R² = 0.9957). Our results demonstrate that the 3-D template matching method can be an effective, fast, and accurate tool for automated sizing of metastatic tumors.

  9. Matched filtering of gravitational waves from inspiraling compact binaries: Computational cost and template placement

    NASA Astrophysics Data System (ADS)

    Owen, Benjamin J.; Sathyaprakash, B. S.

    1999-07-01

    We estimate the number of templates, computational power, and storage required for a one-step matched filtering search for gravitational waves from inspiraling compact binaries. Our estimates for the one-step search strategy should serve as benchmarks for the evaluation of more sophisticated strategies such as hierarchical searches. We use a discrete family of two-parameter wave form templates based on the second post-Newtonian approximation for binaries composed of nonspinning compact bodies in circular orbits. We present estimates for all of the large- and mid-scale interferometers now under construction: LIGO (three configurations), VIRGO, GEO600, and TAMA. To search for binaries with components more massive than m_min = 0.2 M_solar while losing no more than 10% of events due to coarseness of template spacing, the initial LIGO interferometers will require about 1.0×10^11 flops (floating point operations per second) for data analysis to keep up with data acquisition. This is several times higher than estimated in previous work by Owen, in part because of the improved family of templates and in part because we use more realistic (higher) sampling rates. Enhanced LIGO, GEO600, and TAMA will require computational power similar to initial LIGO. Advanced LIGO will require 7.8×10^11 flops, and VIRGO will require 4.8×10^12 flops to take full advantage of its broad target noise spectrum. If the templates are stored rather than generated as needed, storage requirements range from 1.5×10^11 real numbers for TAMA to 6.2×10^14 for VIRGO. The computational power required scales roughly as m_min^(-8/3) and the storage as m_min^(-13/3). Since these scalings are perturbed by the curvature of the parameter space at second post-Newtonian order, we also provide estimates for a search with m_min = 1 M_solar. Finally, we sketch and discuss an algorithm for placing the templates in the parameter space.
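    The quoted power laws make it easy to see why lowering the minimum mass is expensive; a back-of-the-envelope check of the scaling (first order only, ignoring the parameter-space curvature the authors mention):

```python
# Back-of-the-envelope check of the quoted scalings (illustrative only):
# computational cost ~ m_min**(-8/3), storage ~ m_min**(-13/3).
m_ref, m_low = 1.0, 0.2                      # minimum component masses in solar masses
cost_ratio = (m_low / m_ref) ** (-8.0 / 3.0)
storage_ratio = (m_low / m_ref) ** (-13.0 / 3.0)
print(f"cost grows by ~{cost_ratio:.0f}x and storage by ~{storage_ratio:.0f}x "
      f"when m_min drops from {m_ref} to {m_low} solar masses")
```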

  10. [Using neural networks based template matching method to obtain redshifts of normal galaxies].

    PubMed

    Xu, Xin; Luo, A-li; Wu, Fu-chao; Zhao, Yong-heng

    2005-06-01

    Galaxies can be divided into two classes: normal galaxies (NG) and active galaxies (AG). In order to determine NG redshifts, an effective automatic method is proposed in this paper, which consists of the following three main steps: (1) From the normal galaxy template, two sets of samples are simulated, one with redshifts of 0.0-0.3 and the other of 0.3-0.5; PCA is then used to extract the principal components, and the training samples are projected onto the principal component subspace to obtain characteristic spectra. (2) The characteristic spectra are used to train a Probabilistic Neural Network to obtain a Bayes classifier. (3) An unknown real NG spectrum is first input to this Bayes classifier to determine the possible redshift range, and template matching is then invoked to locate the redshift value within the estimated range. Compared with the traditional template matching technique with an unconstrained range, our proposed method not only halves the computational load but also increases the estimation accuracy. As a result, the proposed method is particularly useful for automatic processing of spectra produced by a large-scale sky survey project.

  11. Template-matching based detection of hyperbolas in ground-penetrating radargrams for buried utilities

    NASA Astrophysics Data System (ADS)

    Sagnard, Florence; Tarel, Jean-Philippe

    2016-08-01

    Ground-penetrating radar (GPR) is a mature geophysical technique that is used to map utility pipelines buried within 1.5 m of the ground surface in the urban landscape. In this work, the template-matching algorithm is applied, for the first time, to the detection and localization of pipe signatures in two perpendicular antenna polarizations. The processing of a GPR radargram is based on four main steps. The first step consists in defining a template, usually from finite-difference time-domain simulations, made of the area around the hyperbola apex associated with the mean-size object to be detected in the soil, whose mean permittivity has been previously estimated experimentally. In the second step, the raw radargram is pre-processed to correct variations due to antenna coupling, then the template matching algorithm is used to detect and localize individual hyperbola signatures in an environment containing unwanted reflections, noise and overlapping signatures. The distance between the shifted template and a local zone in the radargram, based on the L1 norm, allows us to obtain a map of distances. A user-defined threshold allows us to select a reduced number of zones having a high similarity measure. In the third step, minimum or maximum discrete amplitudes belonging to a selected hyperbola curve are semi-automatically extracted in each zone. In the fourth step, the discrete hyperbola data (i, j) are fitted by a parametric hyperbola model using a non-linear least squares criterion. The algorithm was implemented and evaluated on numerical radargrams, and afterwards on experimental radargrams.
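    The second step, the L1 distance map followed by a user-defined threshold, can be sketched directly. The code below is an illustrative reduction of that step only (no pre-processing, zone grouping, or hyperbola fitting), and the names are not from the paper.

```python
import numpy as np

def l1_distance_map(radargram, template):
    """Slide the apex template over the radargram and record the L1 distance at each offset."""
    th, tw = template.shape
    H, W = radargram.shape
    dist = np.empty((H - th + 1, W - tw + 1))
    for y in range(dist.shape[0]):
        for x in range(dist.shape[1]):
            dist[y, x] = np.abs(radargram[y:y + th, x:x + tw] - template).sum()
    return dist

def candidate_zones(dist, threshold):
    """Keep offsets whose distance falls below a user-defined threshold (high similarity)."""
    ys, xs = np.where(dist < threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```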

  12. Multiple template-based image matching using alpha-rooted quaternion phase correlation

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2010-04-01

    In computer vision applications, image matching performed on quality-degraded imagery is difficult due to image content distortion and noise effects. State-of-the-art keypoint-based matchers, such as SURF and SIFT, work very well on clean imagery. However, performance can degrade significantly in the presence of high noise and clutter levels. Noise and clutter cause the formation of false features which can degrade recognition performance. To address this problem, previously we developed an extension to the classical amplitude and phase correlation forms, which provides improved robustness and tolerance to image geometric misalignments and noise. This extension, called Alpha-Rooted Phase Correlation (ARPC), combines Fourier domain-based alpha-rooting enhancement with classical phase correlation. ARPC provides tunable parameters to control the alpha-rooting enhancement. These parameter values can be optimized to trade off between high, narrow correlation peaks and wider, smaller, but more robust peaks. Previously, we applied ARPC in the radon transform domain for logo image recognition in the presence of rotational image misalignments. In this paper, we extend ARPC to incorporate quaternion Fourier transforms, thereby creating Alpha-Rooted Quaternion Phase Correlation (ARQPC). We apply ARQPC to the logo image recognition problem. We use ARQPC to perform multiple-reference logo template matching by representing multiple same-class reference templates as quaternion-valued images. We generate recognition performance results on publicly-available logo imagery, and compare recognition results to results generated from standard approaches. We show that small deviations in reference templates of same-class logos can lead to improved recognition performance using the joint matching inherent in ARQPC.

  13. SU-E-J-108: Template Matching Based On Multiple Templates Can Improve the Tumor Tracking Performance When There Is Large Tumor Deformation

    SciTech Connect

    Shi, X; Lin, J; Diwanji, T; Mooney, K; D'Souza, W; Mistry, N

    2014-06-01

    Purpose: Recently, template matching has been shown to be able to track tumor motion on cine-MRI images. However, artifacts such as deformation, rotation, and/or out-of-plane movement could seriously degrade the performance of this technique. In this work, we demonstrate the utility of multiple templates derived from different phases of tumor motion in reducing the negative effects of artifacts and improving the accuracy of template matching methods. Methods: Data from 2 patients with large tumors and significant tumor deformation were analyzed from a group of 12 patients from an earlier study. Cine-MRI (200 frames) imaging was performed while the patients were instructed to breathe normally. Ground truth tumor position was established on each frame manually by a radiation oncologist. Tumor positions were also automatically determined using template matching with either single or multiple (5) templates. The tracking errors, defined as the absolute differences in tumor positions determined by the manual and automated methods, when using either single or multiple templates were compared in both the AP and SI directions, respectively. Results: Using multiple templates reduced the tracking error of template matching. In the SI direction, where the tumor movement and deformation were significant, the mean tracking error decreased from 1.94 mm to 0.91 mm (Patient 1) and from 6.61 mm to 2.06 mm (Patient 2). In the AP direction, where the tumor movement was small, the reduction of the mean tracking error was significant in Patient 1 (from 3.36 mm to 1.04 mm), but not in Patient 2 (from 3.86 mm to 3.80 mm). Conclusion: This study shows the effectiveness of using multiple templates in improving the performance of template matching when artifacts like large tumor deformation or out-of-plane motion exist. Accurate tumor tracking capabilities can be integrated with MRI guided radiation therapy systems. This work was supported in part by grants from NIH/NCI CA 124766 and Varian

  14. Continuous detection of cerebral vasodilatation and vasoconstriction using intracranial pulse morphological template matching.

    PubMed

    Asgari, Shadnaz; Gonzalez, Nestor; Subudhi, Andrew W; Hamilton, Robert; Vespa, Paul; Bergsneider, Marvin; Roach, Robert C; Hu, Xiao

    2012-01-01

    Although accurate and continuous assessment of cerebral vasculature status is highly desirable for managing cerebral vascular diseases, no such method exists in current clinical practice. The present work introduces a novel method for real-time detection of cerebral vasodilatation and vasoconstriction using pulse morphological template matching. Templates consisting of morphological metrics of the cerebral blood flow velocity (CBFV) pulse, measured at the middle cerebral artery using Transcranial Doppler, are obtained by applying a morphological clustering and analysis of intracranial pulse algorithm to the data collected during induced vasodilatation and vasoconstriction in a controlled setting. These templates were then employed to define a vasodilatation index (VDI) and a vasoconstriction index (VCI) for any inquiry data segment as the percentage of the metrics demonstrating a trend consistent with those obtained from the training dataset. The validation of the proposed method on a dataset of CBFV signals of 27 healthy subjects, collected with a protocol similar to that of the training dataset, during hyperventilation (and CO₂ rebreathing tests) shows a sensitivity of 92% (and 82%) for detection of vasodilatation (and vasoconstriction) and a specificity of 90% (and 92%), respectively. Moreover, the proposed method of detection of vasodilatation (vasoconstriction) is capable of rejecting all the cases associated with vasoconstriction (vasodilatation) and outperforms two other conventional techniques by at least 7% for vasodilatation and 19% for vasoconstriction.

  15. Lattice and strain analysis of atomic resolution Z-contrast images based on template matching.

    PubMed

    Zuo, Jian-Min; Shah, Amish B; Kim, Honggyu; Meng, Yifei; Gao, Wenpei; Rouviére, Jean-Luc

    2014-01-01

    A real space approach is developed based on template matching for quantitative lattice analysis using atomic resolution Z-contrast images. The method, called TeMA, uses the template of an atomic column, or a group of atomic columns, to transform the image into a lattice of correlation peaks. This is helped by using a local intensity adjusted correlation and by the design of templates. Lattice analysis is performed on the correlation peaks. A reference lattice is used to correct for scan noise and scan distortions in the recorded images. Using these methods, we demonstrate that a precision of a few picometers is achievable in lattice measurement using aberration corrected Z-contrast images. As an application, we apply the methods to strain analysis of a molecular beam epitaxy (MBE)-grown LaMnO₃ and SrMnO₃ superlattice. The results show alternating epitaxial strain inside the superlattice and its variations across interfaces at the spatial resolution of a single perovskite unit cell. Our methods are general, model-free and provide high spatial resolution for lattice analysis.

  16. Semi-automatic template matching based extraction of hyperbolic signatures in ground-penetrating radar images

    NASA Astrophysics Data System (ADS)

    Sagnard, Florence; Tarel, Jean-Philippe

    2015-04-01

    In civil engineering applications, ground-penetrating radar (GPR) is one of the main non-destructive techniques, based on the refraction and reflection of electromagnetic waves, for probing the underground and, in particular, detecting damage (cracks, delaminations, texture changes…) and buried objects (utilities, rebars…). A UWB ground-coupled radar operating in the frequency band [0.46;4] GHz and made of bowtie slot antennas has been used because, compared to an air-launched radar, it increases the energy transfer of electromagnetic radiation into the sub-surface and the penetration depth. This paper proposes an original adaptation of the generic template matching algorithm to GPR images to recognize, localize and parametrically characterize a specific pattern associated with a hyperbola signature in the two main polarizations. The processing of a radargram (B-scan) is based on four main steps. The first step consists of pre-processing and scaling. The second step uses template matching to isolate and localize individual hyperbola signatures in an environment containing unwanted reflections, noise and overlapping signatures. The algorithm requires generating and collecting a set of reference hyperbola templates, each made of a small reflection pattern in the vicinity of the apex, in order to further analyze multiple time signals of embedded targets in an image. The standard Euclidean distance between the shifted template and a local zone in the radargram yields a map of distances. A user-defined threshold allows the user to select a reduced number of zones having a high similarity measure. In the third step, each zone is analyzed to detect minimum or maximum discrete amplitudes belonging to the first arrival times of a hyperbola signature. In the fourth step, the extracted discrete data (i,j) are fitted by a parametric hyperbola model based on the straight ray path hypothesis and using a constrained least-squares criterion associated with parameter ranges, that are the position, the

  17. Automatic identification of resting state networks: an extended version of multiple template-matching

    NASA Astrophysics Data System (ADS)

    Guaje, Javier; Molina, Juan; Rudas, Jorge; Demertzi, Athena; Heine, Lizette; Tshibanda, Luaba; Soddu, Andrea; Laureys, Steven; Gómez, Francisco

    2015-12-01

    Functional magnetic resonance imaging in the resting state (fMRI-RS) constitutes an informative protocol to investigate several pathological and pharmacological conditions. A common approach to studying this data source is through the analysis of changes in the so-called resting state networks (RSNs). These networks correspond to well-defined functional entities that have been associated with different low- and high-order brain functions. RSNs may be characterized using Independent Component Analysis (ICA). ICA provides a decomposition of the fMRI-RS signal into sources of brain activity, but it lacks information about the nature of the signal, i.e., whether the source is artifactual or not. Recently, a multiple template-matching (MTM) approach was proposed to automatically recognize RSNs in a set of Independent Components (ICs). This method provides valuable information to assess subjects at the individual level. Nevertheless, it lacks a mechanism to quantify how much certainty there is about the existence/absence of each network. This information may be important for the assessment of patients with severely damaged brains, in which RSNs may be greatly affected as a result of the pathological condition. In this work we propose a set of changes to the original MTM that improves the RSN recognition task and also extends the functionality of the method. The key points of this improvement are a standardization strategy and a modification of the method's constraints that adds flexibility to the approach. Additionally, we introduce an analysis of the trustworthiness of each RSN obtained using the template-matching approach. This analysis consists of a thresholding strategy applied over the computed Goodness-of-Fit (GOF) between the set of templates and the ICs. The proposed method was validated on two independent studies (Baltimore, 23 healthy subjects, and Liege, 27 healthy subjects) with different configurations of MTM. Results suggest that the method will provide

  18. Automated side-chain model building and sequence assignment by template matching

    SciTech Connect

    Terwilliger, Thomas C.

    2003-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.

  19. Automatic spike detection based on adaptive template matching for extracellular neural recordings.

    PubMed

    Kim, Sunghan; McNames, James

    2007-09-30

    Recordings of extracellular neural activity are used in many clinical applications and scientific studies. In most cases, these signals are analyzed as a point process, and a spike detection algorithm is required to estimate the times at which action potentials occurred. Recordings from high-density microelectrode arrays (MEAs) and low-impedance microelectrodes often have a low signal-to-noise ratio (SNR<10) and contain action potentials from more than one neuron. We describe a new detection algorithm based on template matching that only requires the user to specify the minimum and maximum firing rates of the neurons. The algorithm iteratively estimates the morphology of the most prominent action potentials. It is able to achieve a sensitivity of >90% with a false positive rate of <5Hz in recordings with an estimated SNR=3, and it performs better than an optimal threshold detector in recordings with an estimated SNR>2.5.
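
    For intuition, the following is an illustrative single-channel sketch of template-matching spike detection (normalized correlation against a spike template, plus a threshold and a refractory period); it is not the adaptive, iterative algorithm of the paper, whose template would come from the morphology-estimation step described above.

```python
# Sketch: slide a spike template over the recording, threshold the normalized
# correlation, and enforce a simple refractory period between detections.
import numpy as np

def template_correlation(signal, template):
    """Normalized correlation of the template with every window of the signal."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    out = np.zeros(len(signal) - n + 1)
    for i in range(len(out)):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[i] = np.dot(w, t) / n
    return out

def detect_spikes(signal, template, corr_threshold=0.8, refractory=30):
    corr = template_correlation(signal, template)
    spikes, last = [], -refractory
    for i in np.flatnonzero(corr > corr_threshold):
        if i - last >= refractory:      # skip detections inside the refractory window
            spikes.append(i)
            last = i
    return np.array(spikes)
```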

  20. Accurate three-dimensional pose recognition from monocular images using template matched filtering

    NASA Astrophysics Data System (ADS)

    Picos, Kenia; Diaz-Ramirez, Victor H.; Kober, Vitaly; Montemayor, Antonio S.; Pantrigo, Juan J.

    2016-06-01

    An accurate algorithm for three-dimensional (3-D) pose recognition of a rigid object is presented. The algorithm is based on adaptive template matched filtering and local search optimization. When a scene image is captured, a bank of correlation filters is constructed to find the best correspondence between the current view of the target in the scene and a target image synthesized by means of computer graphics. The synthetic image is created using a known 3-D model of the target and an iterative procedure based on local search. Computer simulation results obtained with the proposed algorithm in synthetic and real-life scenes are presented and discussed in terms of accuracy of pose recognition in the presence of noise, cluttered background, and occlusion. Experimental results show that our proposal presents high accuracy for 3-D pose estimation using monocular images.

  1. Three dimensional template matching segmentation method for motile cells in 3D+t video sequences.

    PubMed

    Pimentel, J A; Corkidi, G

    2010-01-01

    In this work, we describe a cell segmentation method oriented to deal with experimental data obtained from 3D+t microscopical volumes. The proposed segmentation technique takes advantage of the pattern of appearances exhibited by the objects (cells) at different focal planes, as a result of the objects' translucent properties and their interaction with light. This information allows us to discriminate between cells and artifacts (dust and others) of equivalent size and shape that are present in the biological preparation. Using a simple correlation criterion, the method matches a 3D video template (extracted from a sample of cells) with the motile cells contained in the biological sample, obtaining a high rate of true positives while discarding artifacts. In this work, our analysis is focused on sea urchin spermatozoa cells but is applicable to many other microscopical structures having the same optical properties. PMID:21096252

  2. Two dimensional template matching method for buried object discrimination in GPR data

    NASA Astrophysics Data System (ADS)

    Sezgin, Mehmet

    2009-05-01

    In this study, the discrimination of two different metallic object classes was studied utilizing Ground Penetrating Radar (GPR). The feature sets of both classes carry almost the same information in both the Metal Detector (MD) and GPR data, and there were no evident features that easily discriminate the classes. Background removal was applied to the original B-scan data and then a normalization process was performed. Image thresholding was applied to segment the B-scan GPR images, so that the main hyperbolic shape of the buried-object reflection was extracted; a morphological process was then optionally performed. Templates representative of each class were obtained and searched to determine whether they matched the true class. Two data sets were examined experimentally; they were obtained at different times and under different burial conditions for the same objects. Considerably high discrimination performance was obtained, which was not possible using Metal Detector data alone.
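
    A minimal sketch of the pre-processing chain described above (background removal, normalization, thresholding), assuming the B-scan is a 2-D array with time samples along rows and traces along columns; the threshold value and the array layout are assumptions.

```python
# Sketch: background removal by subtracting the mean trace, rescaling to [0, 1],
# and global thresholding to segment the hyperbolic reflection.
import numpy as np

def preprocess_bscan(bscan, threshold=0.5):
    """bscan: 2-D array, rows = time samples, columns = A-scan traces (assumed layout)."""
    background = bscan.mean(axis=1, keepdims=True)                # mean trace, one value per time sample
    cleaned = bscan - background                                   # background removal
    norm = (cleaned - cleaned.min()) / (np.ptp(cleaned) + 1e-12)   # rescale to [0, 1]
    return (norm > threshold).astype(np.uint8)                     # binary segmented image
```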

  3. The Elementary Operations of Human Vision Are Not Reducible to Template-Matching

    PubMed Central

    Neri, Peter

    2015-01-01

    It is generally acknowledged that biological vision presents nonlinear characteristics, yet linear filtering accounts of visual processing are ubiquitous. The template-matching operation implemented by the linear-nonlinear cascade (linear filter followed by static nonlinearity) is the most widely adopted computational tool in systems neuroscience. This simple model achieves remarkable explanatory power while retaining analytical tractability, potentially extending its reach to a wide range of systems and levels in sensory processing. The extent of its applicability to human behaviour, however, remains unclear. Because sensory stimuli possess multiple attributes (e.g. position, orientation, size), the issue of applicability may be asked by considering each attribute one at a time in relation to a family of linear-nonlinear models, or by considering all attributes collectively in relation to a specified implementation of the linear-nonlinear cascade. We demonstrate that human visual processing can operate under conditions that are indistinguishable from linear-nonlinear transduction with respect to substantially different stimulus attributes of a uniquely specified target signal with associated behavioural task. However, no specific implementation of a linear-nonlinear cascade is able to account for the entire collection of results across attributes; a satisfactory account at this level requires the introduction of a small gain-control circuit, resulting in a model that no longer belongs to the linear-nonlinear family. Our results inform and constrain efforts at obtaining and interpreting comprehensive characterizations of the human sensory process by demonstrating its inescapably nonlinear nature, even under conditions that have been painstakingly fine-tuned to facilitate template-matching behaviour and to produce results that, at some level of inspection, do conform to linear filtering predictions. They also suggest that compliance with linear transduction may be

  4. The Elementary Operations of Human Vision Are Not Reducible to Template-Matching.

    PubMed

    Neri, Peter

    2015-11-01

    It is generally acknowledged that biological vision presents nonlinear characteristics, yet linear filtering accounts of visual processing are ubiquitous. The template-matching operation implemented by the linear-nonlinear cascade (linear filter followed by static nonlinearity) is the most widely adopted computational tool in systems neuroscience. This simple model achieves remarkable explanatory power while retaining analytical tractability, potentially extending its reach to a wide range of systems and levels in sensory processing. The extent of its applicability to human behaviour, however, remains unclear. Because sensory stimuli possess multiple attributes (e.g. position, orientation, size), the issue of applicability may be asked by considering each attribute one at a time in relation to a family of linear-nonlinear models, or by considering all attributes collectively in relation to a specified implementation of the linear-nonlinear cascade. We demonstrate that human visual processing can operate under conditions that are indistinguishable from linear-nonlinear transduction with respect to substantially different stimulus attributes of a uniquely specified target signal with associated behavioural task. However, no specific implementation of a linear-nonlinear cascade is able to account for the entire collection of results across attributes; a satisfactory account at this level requires the introduction of a small gain-control circuit, resulting in a model that no longer belongs to the linear-nonlinear family. Our results inform and constrain efforts at obtaining and interpreting comprehensive characterizations of the human sensory process by demonstrating its inescapably nonlinear nature, even under conditions that have been painstakingly fine-tuned to facilitate template-matching behaviour and to produce results that, at some level of inspection, do conform to linear filtering predictions. They also suggest that compliance with linear transduction may be
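
    The linear-nonlinear cascade discussed in these two records can be written in a few lines: the response is a static nonlinearity applied to the inner product of the stimulus with a linear template. The filter, gain and sigmoid below are placeholders, not the models fitted in the paper.

```python
# Sketch: linear-nonlinear (LN) cascade, i.e. template matching followed by a
# static nonlinearity.
import numpy as np

def ln_response(stimulus, template, gain=5.0, threshold=0.0):
    drive = np.dot(stimulus.ravel(), template.ravel())            # linear (template-matching) stage
    return 1.0 / (1.0 + np.exp(-gain * (drive - threshold)))      # static sigmoidal nonlinearity
```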

  5. Impact of maximum speed on sprint performance during high-level youth female field hockey matches: female athletes in motion (FAiM) study.

    PubMed

    Vescovi, Jason D

    2014-07-01

    The aim of this study was to examine the impact of maximum sprint speed on peak and mean sprint speed during youth female field hockey matches. Two high-level female field hockey teams (U-17, n = 24, and U-21, n = 20) were monitored during a 4-game international test series using global positioning system (GPS) technology and tested for maximum sprint speed. Dependent variables were compared using a 3-factor ANOVA (age group, position, and speed classification); effect sizes (Cohen d) and confidence limits were also calculated. Maximum sprint speed was similar between age groups and positions, with faster players having greater speed than slower players (29.3 ± 0.4 vs 27.2 ± 1.1 km/h). Overall, peak match speed in youth female field hockey players reaches approximately 90% of maximum sprint speed. Absolute peak match speed and mean sprint speed during matches were similar among the age groups (except match 1) and positions (except match 2); however, peak match speed was greater for faster players in matches 3 and 4. No differences were observed in the relative proportion of mean sprint speeds for age groups or positions, but slower players consistently attained similar mean sprint speeds by using a greater proportion of their maximum sprint speed.

  6. Detection of cancerous masses in mammograms by template matching: optimization of template brightness distribution by means of evolutionary algorithm.

    PubMed

    Bator, Marcin; Nieniewski, Mariusz

    2012-02-01

    Optimization of brightness distribution in the template used for detection of cancerous masses in mammograms by means of correlation coefficient is presented. This optimization is performed by the evolutionary algorithm using an auxiliary mass classifier. Brightness along the radius of the circularly symmetric template is coded indirectly by its second derivative. The fitness function is defined as the area under curve (AUC) of the receiver operating characteristic (ROC) for the mass classifier. The ROC and AUC are obtained for a teaching set of regions of interest (ROIs), for which it is known whether a ROI is true-positive (TP) or false-positive (FP). The teaching set is obtained by running the mass detector using a template with a predetermined brightness. Subsequently, the evolutionary algorithm optimizes the template by classifying masses in the teaching set. The optimal template (OT) can be used for detection of masses in mammograms with unknown ROIs. The approach was tested on the training and testing sets of the Digital Database for Screening Mammography (DDSM). The free-response receiver operating characteristic (FROC) obtained with the new mass detector seems superior to the FROC for the hemispherical template (HT). Exemplary results are the following: in the case of the training set in the DDSM, the true-positive fraction (TPF) = 0.82 for the OT and 0.79 for the HT; in the case of the testing set, TPF = 0.79 for the OT and 0.72 for the HT. These values were obtained for disease cases, and the false-positive per image (FPI) = 2.

  7. CAD System for Pulmonary Nodule Detection Using Gabor Filtering and Template Matching

    NASA Astrophysics Data System (ADS)

    Bastawrous, Hany Ayad; Nitta, Norihisa; Tsudagawa, Masaru

    This paper aims at developing a Computer Aided Diagnosis (CAD) system for the detection of pulmonary nodules in chest Computed Tomography (CT) images. These lung nodules include both solid nodules and Ground Glass Opacity (GGO) nodules. In our scheme, we apply a Gabor filter to the CT image in order to enhance the detection process. After this we perform some morphological operations, including thresholding and labeling, to extract all the objects inside the lung area. Then, feature analysis is used to examine these objects and decide which of them are likely to be potential cancer candidates. Following the feature examination, a template matching between the potential cancer candidates and a set of Gaussian reference models is performed to determine the similarity between them. The algorithm was applied on 715 slices containing 25 GGO nodules and 82 solid nodules and achieved a detection sensitivity of 92% for GGO nodules and 95% for solid nodules, with a False Positive (FP) rate of 0.75 FP/slice for GGO nodules and 2.32 FP/slice for solid nodules. Finally, we used an Artificial Neural Network (ANN) to reduce the number of FP findings. After using the ANN, we were able to reduce the FP rate to 0.25 FP/slice for GGO nodules and 1.62 FP/slice for solid nodules, but at the expense of detection sensitivity, which became 84% for GGO nodules and 91% for solid nodules.
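
    The Gabor-filtering stage can be sketched as follows: build a 2-D Gabor kernel (a Gaussian envelope multiplied by a cosine carrier) and convolve it with a CT slice. The kernel size, wavelength and orientation values are illustrative and are not taken from the paper.

```python
# Sketch: 2-D Gabor kernel and its convolution with an image slice.
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(size=21, sigma=4.0, wavelength=10.0, theta=0.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)        # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier

def gabor_enhance(ct_slice, **kwargs):
    return convolve(ct_slice.astype(float), gabor_kernel(**kwargs), mode="nearest")
```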

  8. An improved earthquake catalogue in the Marmara Sea region, Turkey, using massive template matching

    NASA Astrophysics Data System (ADS)

    Matrullo, Emanuela; Lengliné, Olivier; Schmittbuhl, Jean; Karabulut, Hayrullah; Bouchon, Michel

    2016-04-01

    After the 1999 Izmit earthquake, the Main Marmara Fault (MMF) represents a 150 km unruptured segment of the North Anatolian Fault located below the Marmara Sea. One of the principal issues for seismic hazard assessment in the region is to know whether the MMF is totally or partially locked and where the nucleation of the major forthcoming event is going to take place. The area is currently one of the best-instrumented fault systems in Europe. Since 2007, various seismic networks comprising broadband, short-period and OBS stations have been deployed in order to continuously monitor the seismicity along the MMF and the related fault systems. A recent analysis of the seismicity recorded during the 2007-2012 period has provided new insights into the recent evolution of this important regional seismic gap. This analysis was based on events detected with an STA/LTA procedure and manually picked P and S wave arrival times (Schmittbuhl et al., 2015). In order to extend the level of detail and to fully take advantage of the dense seismic network, we improved the seismic catalog using an automatic earthquake detection technique based on a template matching approach. This approach uses the seismic signals of known earthquakes to detect new, similar events through waveform cross-correlation. To set up the methodology and verify the accuracy and robustness of the results, we initially focused on the eastern part of the Marmara Sea (Cinarcik basin) and compared the new detections with those manually identified. Through the massive analysis of cross-correlations based on template scanning of the continuous recordings, we construct a refined catalog of earthquakes for the Marmara Sea for the 2007-2014 period. Our improved earthquake catalog will provide an effective tool to improve catalog completeness, to monitor and study the fine details of the time-space distribution of events, to characterize repeating earthquake source processes and to understand the mechanical state of
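
    A conceptual sketch of the detection step: compute the normalized cross-correlation of a template against the continuous data at each station, average the correlation traces over the network, and declare detections above a threshold expressed as a multiple of the median absolute deviation. The threshold, the data structures and the brute-force loop are assumptions, not the authors' pipeline.

```python
# Sketch: network-stacked waveform cross-correlation for template matching.
import numpy as np

def normalized_cc(data, template):
    """Normalized cross-correlation of one template against one continuous trace."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        cc[i] = np.dot(w, t) / n
    return cc

def network_detections(streams, templates, mad_factor=8.0):
    """streams/templates: dicts keyed by station; returns detection indices and the stacked CC."""
    ccs = [normalized_cc(streams[sta], templates[sta]) for sta in templates]
    length = min(len(c) for c in ccs)
    stack = np.mean([c[:length] for c in ccs], axis=0)      # network-averaged correlation
    mad = np.median(np.abs(stack - np.median(stack)))
    return np.flatnonzero(stack > mad_factor * mad), stack
```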

  9. A self-calibrating approach for the segmentation of retinal vessels by template matching and contour reconstruction.

    PubMed

    Kovács, György; Hajdu, András

    2016-04-01

    The automated processing of retinal images is a widely researched area in medical image analysis. Screening systems based on the automated and accurate recognition of retinopathies enable the earlier diagnosis of diseases like diabetic retinopathy, hypertension and their complications. The segmentation of the vascular system is a crucial task in the field: on the one hand, the accurate extraction of the vessel pixels aids the detection of other anatomical parts (like the optic disc; Hoover and Goldbaum, 2003) and lesions (like microaneurysms; Sopharak et al., 2013); on the other hand, the geometrical features of the vascular system and their temporal changes have been shown to be related to diseases, like the vessel tortuosity to Fabry disease (Sodi et al., 2013) and the arteriolar-to-venular (A/V) ratio to hypertension (Pakter et al., 2005). In this study, a novel technique based on template matching and contour reconstruction is proposed for the segmentation of the vasculature. In the template matching step, generalized Gabor function based templates are used to extract the center lines of vessels. Then, the intensity characteristics of vessel contours measured in training databases are reconstructed. The method was trained and tested on two publicly available databases, DRIVE and STARE, and reached an average accuracy of 0.9494 and 0.9610, respectively. We have also carried out cross-database tests and found that the accuracy scores are higher than that of any previous technique trained and tested on the same database.

  10. A model-based 3D template matching technique for pose acquisition of an uncooperative space object.

    PubMed

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-01-01

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is presented.

  11. A Model-Based 3D Template Matching Technique for Pose Acquisition of an Uncooperative Space Object

    PubMed Central

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-01-01

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is presented. PMID:25785309

  12. A model-based 3D template matching technique for pose acquisition of an uncooperative space object.

    PubMed

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-01-01

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is presented. PMID:25785309

  13. Multistation template matching to characterize frequency-magnitude distributions of induced seismicity in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Brudzinski, M. R.; Skoumal, R.; Currie, B.

    2015-12-01

    We analyze the frequency-magnitude distribution (FMD) of recent seismic sequences thought to be induced by wastewater injection and hydraulic fracturing in the Central and Eastern U.S. to investigate their physical origin and improve hazard estimates. Multistation template matching is utilized to increase the number of events analyzed by lowering the magnitude of detection. In cases where local deployments are available, we demonstrate that the FMD obtained through template matching using regional data are comparable to those obtained from traditional detection using the local deployment. Since deployments usually occur after seismicity has already been identified, catalogs constructed with regional data offer the advantage of providing a more complete history of the seismicity. We find two primary groups of FMDs for induced sequences: those that generally follow the Gutenberg-Richter power-law and those that generally do not. All of the induced sequences are typically characterized by swarm-like behavior, but the non-power-law FMDs are also characterized by a clustering of events at low magnitudes and particularly low aftershock productivity for a continental interior. Each of the observations in the non-power law FMD cases is predicted by numerical simulations of a seismogenic zone governed by a viscoelastic damage rheology with low effective viscosity in the fault zone. Such a reduction in effective viscosity is expected if fluid injection increases fluid pressures in the fault zone to the point that the fault zone begins to dilate.
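
    To quantify how well an FMD follows the Gutenberg-Richter power law, a standard first step is the maximum-likelihood b-value estimate of Aki (1965), b = log10(e) / (mean(M) - Mc), with an optional half-bin correction to the completeness magnitude Mc. The sketch below is generic and is not the authors' exact workflow.

```python
# Sketch: maximum-likelihood b-value for a Gutenberg-Richter frequency-magnitude
# distribution, with a rough standard error b / sqrt(N).
import numpy as np

def b_value(magnitudes, completeness_mag, bin_width=0.1):
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]                       # keep events above Mc
    mc_corr = completeness_mag - bin_width / 2.0       # half-bin correction for binned magnitudes
    b = np.log10(np.e) / (m.mean() - mc_corr)
    b_err = b / np.sqrt(len(m))                        # rough standard error
    return b, b_err
```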

  14. Toward detecting and identifying macromolecules in a cellular context: Template matching applied to electron tomograms

    PubMed Central

    Böhm, Jochen; Frangakis, Achilleas S.; Hegerl, Reiner; Nickell, Stephan; Typke, Dieter; Baumeister, Wolfgang

    2000-01-01

    Electron tomography is the only technique available that allows us to visualize the three-dimensional structure of unfixed and unstained cells currently with a resolution of 6–8 nm, but with the prospect to reach 2–4 nm. This raises the possibility of detecting and identifying specific macromolecular complexes within their cellular context by virtue of their structural signature. Templates derived from the high-resolution structure of the molecule under scrutiny are used to search the reconstructed volume. Here we outline and test a computationally feasible two-step procedure: In a first step, mean-curvature motion is used for segmentation, yielding subvolumes that contain with a high probability macromolecules in the expected size range. Subsequently, the particles contained in the subvolumes are identified by cross-correlation, using a set of three-dimensional templates. With simulated and real tomographic data we demonstrate that such an approach is feasible and we explore the detection limits. Even structurally similar particles, such as the thermosome, GroEL, and the 20S proteasome can be identified with high fidelity. This opens up exciting prospects for mapping the territorial distribution of macromolecules and for analyzing molecular interactions in situ. PMID:11087814

  15. Detection and Counting of Orchard Trees from VHR Images Using a Geometrical-Optical Model and Marked Template Matching

    NASA Astrophysics Data System (ADS)

    Maillard, Philippe; Gomes, Marília F.

    2016-06-01

    This article presents an original algorithm created to detect and count trees in orchards using very high resolution images. The algorithm is based on an adaptation of the "template matching" image processing approach, in which the template is based on a "geometrical-optical" model created from a series of parameters, such as illumination angles, maximum and ambient radiance, and tree size specifications. The algorithm is tested on four images from different regions of the world and different crop types. These images all have < 1 meter spatial resolution and were downloaded from the Google Earth application. Results show that the algorithm is very efficient at detecting and counting trees as long as their spectral and spatial characteristics are relatively constant. For walnut, mango and orange trees, the overall accuracy was clearly above 90%. However, the overall success rate for apple trees fell under 75%. It appears that the openness of the apple tree crown is most probably responsible for this poorer result. The algorithm is fully explained with a step-by-step description. At this stage, the algorithm still requires quite a bit of user interaction. The automatic determination of most of the required parameters is under development.

  16. Relevance-Based Template Matching for Tracking Targets in FLIR Imagery

    PubMed Central

    Paravati, Gianluca; Esposito, Stefano

    2014-01-01

    One of the main challenges in automatic target tracking applications is represented by the need to maintain a low computational footprint, especially when dealing with real-time scenarios and the limited resources of embedded environments. In this context, significant results can be obtained by using forward-looking infrared sensors capable of providing distinctive features for targets of interest. In fact, due to their nature, forward-looking infrared (FLIR) images lend themselves to being used with extremely small footprint techniques based on the extraction of target intensity profiles. This work proposes a method for increasing the computational efficiency of template-based target tracking algorithms. In particular, the speed of the algorithm is improved by using a dynamic threshold that narrows the number of computations, thus reducing both execution time and resources usage. The proposed approach has been tested on several datasets, and it has been compared to several target tracking techniques. Gathered results, both in terms of theoretical analysis and experimental data, showed that the proposed approach is able to achieve the same robustness of reference algorithms by reducing the number of operations needed and the processing time. PMID:25093344

  17. Prediction of Signal Peptide Cleavage Sites with Subsite-Coupled and Template Matching Fusion Algorithm.

    PubMed

    Zhang, Shao-Wu; Zhang, Ting-He; Zhang, Jun-Nan; Huang, Yufei

    2014-03-01

    Fast and effective prediction of signal peptides (SP) and their cleavage sites is of great importance in computational biology. The approaches developed to predict signal peptides can be roughly divided into machine-learning based and sliding-window based. In order to further increase the prediction accuracy and the coverage of organisms for SP cleavage sites, we propose a novel method for predicting SP cleavage sites called Signal-CTF that utilizes machine learning and sliding windows, and is designed for N-terminal secretory proteins in a large variety of organisms including human, animal, plant, virus, bacteria, fungi and archaea. Signal-CTF consists of three distinct elements: (1) a subsite-coupled and regularization function with a scaled window of fixed width that selects a set of candidates of possible secretion-cleavable segments for a query secretory protein; (2) a sum fusion system that integrates the outcomes from aligning the cleavage site template sequence with each of the aforementioned candidates in a scaled window of fixed width to determine the best candidate cleavage sites for the query secretory protein; (3) a voting system that identifies the ultimate signal peptide cleavage site among all possible results derived from using scaled windows of different widths. When compared with the Signal-3L and SignalP 4.0 predictors, the prediction accuracy of Signal-CTF is 4-12 % higher than that of Signal-3L for human, animal and eukaryote data, and 10-25 % higher than that of SignalP 4.0 for eukaryota, Gram-positive bacteria and Gram-negative bacteria. Comparing with the PRED-SIGNAL and SignalP 4.0 predictors on the 32 archaea secretory proteins used in Bagos's paper, the prediction accuracy of Signal-CTF is 12.5 % and 25 % higher than that of PRED-SIGNAL and SignalP 4.0, respectively. The prediction results for several long signal peptides show that Signal-CTF can better predict cleavage sites for long signal peptides than SignalP, Phobius, Philius, SPOCTOPUS, Signal

  18. Spinal Cord Segmentation by One Dimensional Normalized Template Matching: A Novel, Quantitative Technique to Analyze Advanced Magnetic Resonance Imaging Data

    PubMed Central

    Cadotte, Adam; Cadotte, David W.; Livne, Micha; Cohen-Adad, Julien; Fleet, David; Mikulis, David; Fehlings, Michael G.

    2015-01-01

    Spinal cord segmentation is a developing area of research intended to aid the processing and interpretation of advanced magnetic resonance imaging (MRI). For example, high resolution three-dimensional volumes can be segmented to provide a measurement of spinal cord atrophy. Spinal cord segmentation is difficult due to the variety of MRI contrasts and the variation in human anatomy. In this study we propose a new method of spinal cord segmentation based on one-dimensional template matching and provide several metrics that can be used to compare with other segmentation methods. A set of ground-truth data from 10 subjects was manually-segmented by two different raters. These ground truth data formed the basis of the segmentation algorithm. A user was required to manually initialize the spinal cord center-line on new images, taking less than one minute. Template matching was used to segment the new cord and a refined center line was calculated based on multiple centroids within the segmentation. Arc distances down the spinal cord and cross-sectional areas were calculated. Inter-rater validation was performed by comparing two manual raters (n = 10). Semi-automatic validation was performed by comparing the two manual raters to the semi-automatic method (n = 10). Comparing the semi-automatic method to one of the raters yielded a Dice coefficient of 0.91 +/- 0.02 for ten subjects, a mean distance between spinal cord center lines of 0.32 +/- 0.08 mm, and a Hausdorff distance of 1.82 +/- 0.33 mm. The absolute variation in cross-sectional area was comparable for the semi-automatic method versus manual segmentation when compared to inter-rater manual segmentation. The results demonstrate that this novel segmentation method performs as well as a manual rater for most segmentation metrics. It offers a new approach to study spinal cord disease and to quantitatively track changes within the spinal cord in an individual case and across cohorts of subjects. PMID:26445367
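
    The two agreement metrics quoted above can be computed as in the hedged sketch below, which assumes boolean segmentation masks and isotropic voxel spacing; it mirrors the standard definitions of the Dice coefficient and the symmetric Hausdorff distance rather than the authors' specific code.

```python
# Sketch: Dice coefficient and symmetric Hausdorff distance between two
# binary segmentation masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b, spacing=1.0):
    pa = np.argwhere(a) * spacing     # foreground voxel coordinates (mm if spacing is given)
    pb = np.argwhere(b) * spacing
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])
```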

  19. Syntheses, structures, characterizations and charge-density matching of novel amino-templated uranyl selenates

    SciTech Connect

    Ling Jie; Sigmon, Ginger E.; Burns, Peter C.

    2009-02-15

    Five hybrid organic-inorganic uranyl selenates have been synthesized, characterized and their structures have been determined. The structure of (C₂H₈N)₂[(UO₂)₂(SeO₄)₃(H₂O)] (EthylAUSe) is monoclinic, P2₁, a=8.290(1), b=12.349(2), c=11.038(2) Å, β=104.439(4)°, V=1094.3(3) Å³, Z=2, R₁=0.0425. The structure of (C₇H₁₀N)₂[(UO₂)(SeO₄)₂(H₂O)]H₂O (BenzylAUSe) is orthorhombic, Pna2₁, a=24.221(2), b=11.917(1), c=7.4528(7) Å, V=2151.1(3) Å³, Z=4, R₁=0.0307. The structure of (C₂H₁₀N₂)[(UO₂)(SeO₄)₂(H₂O)](H₂O)₂ (EDAUSe) is monoclinic, P2₁/c, a=11.677(2), b=7.908(1), c=15.698(2) Å, β=98.813(3)°, V=1432.4(3) Å³, Z=4, R₁=0.0371. The structure of (C₆H₂₂N₄)[(UO₂)(SeO₄)₂(H₂O)](H₂O) (TETAUSe) is monoclinic, P2₁/n, a=13.002(2), b=7.962(1), c=14.754(2) Å, β=114.077(2)°, V=1394.5(3) Å³, Z=4, R₁=0.0323. The structure of (C₆H₂₁N₄)[(UO₂)(SeO₄)₂(HSeO₄)] (TAEAUSe) is monoclinic, P2₁/m, a=9.2218(6), b=12.2768(9), c=9.4464(7) Å, β=116.1650(10)°, V=959.88(12) Å³, Z=2, R₁=0.0322. The inorganic structural units in these compounds are composed of uranyl pentagonal bipyramids and selenate tetrahedra. In each case, tetrahedra link bipyramids through vertex-sharing, resulting in chain or sheet topologies. The charge-density matching principle is discussed relative to the orientations of the organic molecules between the inorganic structural units. - Graphical abstract: The structures of five new inorganic-organic hybrid uranyl selenates present new structural topologies based upon chains and sheets of uranyl pentagonal bipyramids and selenate tetrahedra.

  20. Determination of Hypocenters and Focal Mechanism Solutions for Semi-Historical Earthquakes by Using Template Matching Technique

    NASA Astrophysics Data System (ADS)

    Ishibe, T.; Satake, K.; Muragishi, J.; Tsuruoka, H.; Nakagawa, S.; Sakai, S.; Hirata, N.

    2015-12-01

    Modern seismological analyses are difficult to carry out for earthquakes which occurred in the early period of instrumental observation (between the 1870s and 1920s in Japan) because of sparse station distributions and low data quality, particularly clock errors. Source parameters of such old earthquakes can be estimated through comparisons of old data with recent seismological data with known hypocenters and focal mechanism solutions. In this study, we constructed a new method to determine hypocenters and focal mechanism solutions for semi-historical earthquakes using a template matching technique. To quantify the similarity in hypocentral locations between recent and semi-historical earthquakes, we use the RMS of S-P time differences. As for focal mechanism solutions, we calculated the misfit rate between observed first-motion polarities and the polarities expected from recent focal mechanism solutions, weighted by the normalized P-wave amplitudes. We confirmed the effectiveness of this method by applying it to recent earthquakes and comparing the distribution of RMS S-P time differences and weighted misfit rates with hypocenters and focal mechanism solutions determined by the Japan Meteorological Agency. RMS S-P time differences show small values around the true hypocenter, and the weighted misfit rates become small for the true focal mechanism solutions. We then preliminarily applied this method to several large earthquakes in the semi-historical period. For the M6.8 earthquake of 1922 in the Kanto region, Japan, the six S-P times are similar to those reported from recent intermediate-depth earthquakes in the southern part of Chiba Prefecture. The thirteen first-motion polarities are consistent with those expected from recent strike-slip or normal-faulting earthquakes at 60-70 km depth within the subducting Philippine Sea slab in this region. Such earthquakes are active along the western edge of the slab-slab contact zone between the Philippine Sea and Pacific Plates.
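
    The hypocenter-similarity measure described above reduces to a short computation: the RMS difference between the S-P times of a semi-historical event and those of a recent template event observed at the same stations. The station keys and units in the sketch below are assumptions.

```python
# Sketch: RMS of S-P time differences between a semi-historical earthquake and
# candidate template events, and selection of the closest template.
import numpy as np

def rms_sp_difference(historical_sp, template_sp):
    """Both arguments: dicts mapping station name -> S-P time in seconds."""
    common = sorted(set(historical_sp) & set(template_sp))
    diffs = np.array([historical_sp[s] - template_sp[s] for s in common])
    return np.sqrt(np.mean(diffs ** 2))

def best_template(historical_sp, catalog_sp):
    """Return the catalog event id whose S-P times are closest in the RMS sense."""
    return min(catalog_sp, key=lambda eid: rms_sp_difference(historical_sp, catalog_sp[eid]))
```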

  1. Source mechanism of small long-period events at Mount St. Helens in July 2005 using template matching, phase-weighted stacking, and full-waveform inversion

    USGS Publications Warehouse

    Matoza, Robin S.; Chouet, Bernard A.; Dawson, Phillip B.; Shearer, Peter M.; Haney, Matthew M.; Waite, Gregory P.; Moran, Seth C.; Mikesell, T. Dylan

    2015-01-01

    Long-period (LP, 0.5-5 Hz) seismicity, observed at volcanoes worldwide, is a recognized signature of unrest and eruption. Cyclic LP “drumbeating” was the characteristic seismicity accompanying the sustained dome-building phase of the 2004–2008 eruption of Mount St. Helens (MSH), WA. However, together with the LP drumbeating was a near-continuous, randomly occurring series of tiny LP seismic events (LP “subevents”), which may hold important additional information on the mechanism of seismogenesis at restless volcanoes. We employ template matching, phase-weighted stacking, and full-waveform inversion to image the source mechanism of one multiplet of these LP subevents at MSH in July 2005. The signal-to-noise ratios of the individual events are too low to produce reliable waveform-inversion results, but the events are repetitive and can be stacked. We apply network-based template matching to 8 days of continuous velocity waveform data from 29 June to 7 July 2005 using a master event to detect 822 network triggers. We stack waveforms for 359 high-quality triggers at each station and component, using a combination of linear and phase-weighted stacking to produce clean stacks for use in waveform inversion. The derived source mechanism points to the volumetric oscillation (~10 m3) of a subhorizontal crack located at shallow depth (~30 m) in an area to the south of Crater Glacier in the southern portion of the breached MSH crater. A possible excitation mechanism is the sudden condensation of metastable steam from a shallow pressurized hydrothermal system as it encounters cool meteoric water in the outer parts of the edifice, perhaps supplied from snow melt.

  2. Source mechanism of small long-period events at Mount St. Helens in July 2005 using template matching, phase-weighted stacking, and full-waveform inversion

    NASA Astrophysics Data System (ADS)

    Matoza, Robin S.; Chouet, Bernard A.; Dawson, Phillip B.; Shearer, Peter M.; Haney, Matthew M.; Waite, Gregory P.; Moran, Seth C.; Mikesell, T. Dylan

    2015-09-01

    Long-period (LP, 0.5-5 Hz) seismicity, observed at volcanoes worldwide, is a recognized signature of unrest and eruption. Cyclic LP "drumbeating" was the characteristic seismicity accompanying the sustained dome-building phase of the 2004-2008 eruption of Mount St. Helens (MSH), WA. However, together with the LP drumbeating was a near-continuous, randomly occurring series of tiny LP seismic events (LP "subevents"), which may hold important additional information on the mechanism of seismogenesis at restless volcanoes. We employ template matching, phase-weighted stacking, and full-waveform inversion to image the source mechanism of one multiplet of these LP subevents at MSH in July 2005. The signal-to-noise ratios of the individual events are too low to produce reliable waveform inversion results, but the events are repetitive and can be stacked. We apply network-based template matching to 8 days of continuous velocity waveform data from 29 June to 7 July 2005 using a master event to detect 822 network triggers. We stack waveforms for 359 high-quality triggers at each station and component, using a combination of linear and phase-weighted stacking to produce clean stacks for use in waveform inversion. The derived source mechanism points to the volumetric oscillation (˜10 m3) of a subhorizontal crack located at shallow depth (˜30 m) in an area to the south of Crater Glacier in the southern portion of the breached MSH crater. A possible excitation mechanism is the sudden condensation of metastable steam from a shallow pressurized hydrothermal system as it encounters cool meteoric water in the outer parts of the edifice, perhaps supplied from snow melt.
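
    Phase-weighted stacking, used above to clean the stacked waveforms, can be summarized in a few lines: the linear stack is multiplied by the coherence of the instantaneous phases (from the analytic signal) raised to a power nu. This is the textbook Schimmel and Paulssen (1997) form, sketched under the assumption of pre-aligned traces; it is not the exact implementation of the study.

```python
# Sketch: phase-weighted stack of aligned waveforms.
import numpy as np
from scipy.signal import hilbert

def phase_weighted_stack(traces, nu=2.0):
    """traces: 2-D array (n_traces, n_samples), aligned on the detection time."""
    traces = np.asarray(traces, dtype=float)
    linear = traces.mean(axis=0)
    phases = np.angle(hilbert(traces, axis=1))            # instantaneous phase of each trace
    coherence = np.abs(np.exp(1j * phases).mean(axis=0))  # ~1 where phases agree, ~0 otherwise
    return linear * coherence ** nu
```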

  3. Isotope pattern deconvolution for peptide mass spectrometry by non-negative least squares/least absolute deviation template matching

    PubMed Central

    2012-01-01

    Background The robust identification of isotope patterns originating from peptides being analyzed through mass spectrometry (MS) is often significantly hampered by noise artifacts and the interference of overlapping patterns arising e.g. from post-translational modifications. As the classification of the recorded data points into either ‘noise’ or ‘signal’ lies at the very root of essentially every proteomic application, the quality of the automated processing of mass spectra can significantly influence the way the data might be interpreted within a given biological context. Results We propose non-negative least squares/non-negative least absolute deviation regression to fit a raw spectrum by templates imitating isotope patterns. In a carefully designed validation scheme, we show that the method exhibits excellent performance in pattern picking. It is demonstrated that the method is able to disentangle complicated overlaps of patterns. Conclusions We find that regularization is not necessary to prevent overfitting and that thresholding is an effective and user-friendly way to perform feature selection. The proposed method avoids problems inherent in regularization-based approaches, comes with a set of well-interpretable parameters whose default configuration is shown to generalize well without the need for fine-tuning, and is applicable to spectra of different platforms. The R package IPPD implements the method and is available from the Bioconductor platform (http://bioconductor.fhcrc.org/help/bioc-views/devel/bioc/html/IPPD.html). PMID:23137144
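
    The core fitting step can be illustrated with SciPy's non-negative least-squares solver: the columns of a template matrix hold candidate isotope patterns sampled on the spectrum's m/z grid, NNLS estimates their non-negative amplitudes, and simple thresholding of the amplitudes performs the feature selection mentioned above. Template construction and the least-absolute-deviation variant are omitted here, and the names are illustrative.

```python
# Sketch: non-negative least-squares fit of isotope-pattern templates to a
# raw spectrum, with amplitude thresholding as feature selection.
import numpy as np
from scipy.optimize import nnls

def fit_templates(spectrum, template_matrix, amp_threshold=0.0):
    """spectrum: (n,) intensities; template_matrix: (n, k), one template per column."""
    amplitudes, residual_norm = nnls(template_matrix, spectrum)
    selected = np.flatnonzero(amplitudes > amp_threshold)   # templates judged to be present
    return amplitudes, selected, residual_norm
```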

  4. Auto-accumulation method using simulated annealing enables fully automatic particle pickup completely free from a matching template or learning data.

    PubMed

    Ogura, Toshihiko; Sato, Chikara

    2004-06-01

    Single-particle analysis is a 3-D structure determination method using electron microscopy (EM). In this method, a large number of projections is required to create a 3-D reconstruction. In order to enable completely automatic pickup without a matching template or a training data set, we established a brand-new method in which the frames used to pick up particles are randomly shifted and rotated over the electron micrograph and, using the total average image of the framed images as an index, each frame reaches a particle. In this process, shifts are selected so as to increase the contrast of the average. By iterated shifts and further selection of the shifts, the frames are induced to move so as to surround particles. In this algorithm, hundreds of frames are initially distributed randomly over the electron micrograph in which multi-particle images are dispersed. Starting with these frames, one of them is selected and shifted randomly, and acceptance or non-acceptance of its new position is judged using the simulated annealing (SA) method, in which the contrast score of the total average image is adopted as an index. After iteration of this process, the position of each frame converges so as to surround a particle and the framed images are picked up. This method is the first unsupervised, fully automatic particle picking method applicable to EM of various kinds of proteins, especially to low-contrast cryo-EM protein images.
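
    A minimal sketch of the simulated-annealing acceptance rule used to judge a randomly shifted frame: a move that improves the contrast score of the total average image is always accepted, while a worse move is accepted with probability exp(delta / T). The score function, proposal and cooling schedule below are placeholders, not the authors' exact choices.

```python
# Sketch: generic simulated-annealing loop with the standard acceptance rule.
import math
import random

def accept_move(score_new, score_old, temperature):
    delta = score_new - score_old
    if delta >= 0:
        return True                                   # always accept improvements
    return random.random() < math.exp(delta / temperature)

def anneal(initial_state, score, propose, t_start=1.0, t_end=1e-3, cooling=0.99):
    state, s, t = initial_state, score(initial_state), t_start
    while t > t_end:
        candidate = propose(state)                    # e.g. randomly shift one frame
        s_new = score(candidate)                      # e.g. contrast of the total average image
        if accept_move(s_new, s, t):
            state, s = candidate, s_new
        t *= cooling                                  # geometric cooling schedule
    return state
```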

  5. High level nuclear waste

    SciTech Connect

    Crandall, J L

    1980-01-01

    The DOE Division of Waste Products through a lead office at Savannah River is developing a program to immobilize all US high-level nuclear waste for terminal disposal. DOE high-level wastes include those at the Hanford Plant, the Idaho Chemical Processing Plant, and the Savannah River Plant. Commercial high-level wastes, for which DOE is also developing immobilization technology, include those at the Nuclear Fuel Services Plant and any future commercial fuels reprocessing plants. The first immobilization plant is to be the Defense Waste Processing Facility at Savannah River, scheduled for 1983 project submission to Congress and 1989 operation. Waste forms are still being selected for this plant. Borosilicate glass is currently the reference form, but alternate candidates include concretes, calcines, other glasses, ceramics, and matrix forms.

  6. Computerized detection of breast lesions in multi-centre and multi-instrument DCE-MR data using 3D principal component maps and template matching

    NASA Astrophysics Data System (ADS)

    Ertas, Gokhan; Doran, Simon; Leach, Martin O.

    2011-12-01

    In this study, we introduce a novel, robust and accurate computerized algorithm based on volumetric principal component maps and template matching that facilitates lesion detection on dynamic contrast-enhanced MR. The study dataset comprises 24 204 contrast-enhanced breast MR images corresponding to 4034 axial slices from 47 women in the UK multi-centre study of MRI screening for breast cancer and categorized as high risk. The scans analysed here were performed on six different models of scanner from three commercial vendors, sited in 13 clinics around the UK. 1952 slices from this dataset, containing 15 benign and 13 malignant lesions, were used for training. The remaining 2082 slices, with 14 benign and 12 malignant lesions, were used for test purposes. To prevent false positives being detected from other tissues and regions of the body, breast volumes are segmented from pre-contrast images using a fast semi-automated algorithm. Principal component analysis is applied to the centred intensity vectors formed from the dynamic contrast-enhanced T1-weighted images of the segmented breasts, followed by automatic thresholding to eliminate fatty tissues and slowly enhancing normal parenchyma and a convolution and filtering process to minimize artefacts from moderately enhanced normal parenchyma and blood vessels. Finally, suspicious lesions are identified through a volumetric sixfold neighbourhood connectivity search and calculation of two morphological features: volume and volumetric eccentricity, to exclude highly enhanced blood vessels, nipples and normal parenchyma and to localize lesions. This provides satisfactory lesion localization. For a detection sensitivity of 100%, the overall false-positive detection rate of the system is 1.02/lesion, 1.17/case and 0.08/slice, comparing favourably with previous studies. This approach may facilitate detection of lesions in multi-centre and multi-instrument dynamic contrast-enhanced breast MR data.
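
    The principal-component-map step can be pictured with the following hedged sketch: each voxel's centred dynamic intensity curve is treated as a sample, an SVD gives the leading components over the segmented breast, and projecting every voxel onto them produces volumetric PC maps that can then be thresholded. Array shapes, the number of components and the use of a plain SVD are assumptions, not the authors' implementation.

```python
# Sketch: volumetric principal-component maps from a dynamic contrast-enhanced series.
import numpy as np

def principal_component_maps(dynamic_volume, mask, n_components=2):
    """dynamic_volume: (T, Z, Y, X) DCE series; mask: (Z, Y, X) boolean breast mask."""
    vox = dynamic_volume[:, mask].T                 # (n_voxels, T) intensity curves
    vox = vox - vox.mean(axis=1, keepdims=True)     # centre each curve
    _, _, vt = np.linalg.svd(vox, full_matrices=False)
    scores = vox @ vt[:n_components].T              # projection onto the leading components
    maps = np.zeros((n_components,) + mask.shape)
    maps[:, mask] = scores.T                        # write scores back into volumetric maps
    return maps
```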

  7. Optimizing High Level Waste Disposal

    SciTech Connect

    Dirk Gombert

    2005-09-01

    If society is ever to reap the potential benefits of nuclear energy, technologists must close the fuel-cycle completely. A closed cycle equates to a continued supply of fuel and safe reactors, but also reliable and comprehensive closure of waste issues. High level waste (HLW) disposal in borosilicate glass (BSG) is based on 1970s era evaluations. This host matrix is very adaptable to sequestering a wide variety of radionuclides found in raffinates from spent fuel reprocessing. However, it is now known that the current system is far from optimal for disposal of the diverse HLW streams, and proven alternatives are available to reduce costs by billions of dollars. The basis for HLW disposal should be reassessed to consider extensive waste form and process technology research and development efforts, which have been conducted by the United States Department of Energy (USDOE), international agencies and the private sector. Matching the waste form to the waste chemistry and using currently available technology could increase the waste content in waste forms to 50% or more and double processing rates. Optimization of the HLW disposal system would accelerate HLW disposition and increase repository capacity. This does not necessarily require developing new waste forms, the emphasis should be on qualifying existing matrices to demonstrate protection equal to or better than the baseline glass performance. Also, this proposed effort does not necessarily require developing new technology concepts. The emphasis is on demonstrating existing technology that is clearly better (reliability, productivity, cost) than current technology, and justifying its use in future facilities or retrofitted facilities. Higher waste processing and disposal efficiency can be realized by performing the engineering analyses and trade-studies necessary to select the most efficient methods for processing the full spectrum of wastes across the nuclear complex. This paper will describe technologies being

  8. Genotyping and interpretation of STR-DNA: Low-template, mixtures and database matches-Twenty years of research and development.

    PubMed

    Gill, Peter; Haned, Hinda; Bleka, Oyvind; Hansson, Oskar; Dørum, Guro; Egeland, Thore

    2015-09-01

    The introduction of Short Tandem Repeat (STR) DNA was a revolution within a revolution that transformed forensic DNA profiling into a tool that could be used, for the first time, to create National DNA databases. This transformation would not have been possible without the concurrent development of fluorescent automated sequencers, combined with the ability to multiplex several loci together. Use of the polymerase chain reaction (PCR) increased the sensitivity of the method to enable the analysis of a handful of cells. The first multiplexes were simple: 'the quad', introduced by the defunct UK Forensic Science Service (FSS) in 1994, rapidly followed by a more discriminating 'six-plex' (Second Generation Multiplex) in 1995 that was used to create the world's first national DNA database. The success of the database rapidly outgrew the functionality of the original system - by the year 2000 a new multiplex of ten-loci was introduced to reduce the chance of adventitious matches. The technology was adopted world-wide, albeit with different loci. The political requirement to introduce pan-European databases encouraged standardisation - the development of European Standard Set (ESS) of markers comprising twelve-loci is the latest iteration. Although development has been impressive, the methods used to interpret evidence have lagged behind. For example, the theory to interpret complex DNA profiles (low-level mixtures), had been developed fifteen years ago, but only in the past year or so, are the concepts starting to be widely adopted. A plethora of different models (some commercial and others non-commercial) have appeared. This has led to a confusing 'debate' about the 'best' to use. The different models available are described along with their advantages and disadvantages. A section discusses the development of national DNA databases, along with details of an associated controversy to estimate the strength of evidence of matches. Current methodology is limited to

  9. Biometric template transformation: a security analysis

    NASA Astrophysics Data System (ADS)

    Nagar, Abhishek; Nandakumar, Karthik; Jain, Anil K.

    2010-01-01

    One of the critical steps in designing a secure biometric system is protecting the templates of the users, which are stored either in a central database or on smart cards. If a biometric template is compromised, it leads to serious security and privacy threats because, unlike passwords, it is not possible for a legitimate user to revoke his biometric identifiers and switch to another set of uncompromised identifiers. One methodology for biometric template protection is the template transformation approach, where the template, consisting of the features extracted from the biometric trait, is transformed using parameters derived from a user-specific password or key. Only the transformed template is stored and matching is performed directly in the transformed domain. In this paper, we formally investigate the security strength of template transformation techniques and define six metrics that facilitate a holistic security evaluation. Furthermore, we analyze the security of two well-known template transformation techniques, namely Biohashing and cancelable fingerprint templates, based on the proposed metrics. Our analysis indicates that both these schemes are vulnerable to intrusion and linkage attacks because it is relatively easy to obtain either a close approximation of the original template (Biohashing) or a pre-image of the transformed template (cancelable fingerprints). We argue that the security strength of template transformation techniques must also consider the computational complexity of obtaining a complete pre-image of the transformed template, in addition to the complexity of recovering the original biometric template.
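
    A hedged sketch of a Biohashing-style transformation of the kind analysed above: project the biometric feature vector onto random directions derived from a user-specific key, then binarize at zero, so that only the transformed bits are stored and matching happens in the Hamming domain. This is a generic illustration, not the specific scheme evaluated in the paper; the key handling and bit length are assumptions.

```python
# Sketch: key-derived random projection followed by binarization, with matching
# by Hamming distance on the transformed templates.
import numpy as np

def biohash(features, user_key, n_bits=64):
    """features: 1-D float vector (assumed length >= n_bits); user_key: integer seed."""
    rng = np.random.default_rng(user_key)                 # key-derived pseudo-random basis
    basis = rng.normal(size=(n_bits, features.size))
    q, _ = np.linalg.qr(basis.T)                          # orthonormalize the projection directions
    projections = q.T[:n_bits] @ features
    return (projections > 0).astype(np.uint8)             # binarized, revocable template

def hamming_match(bits_a, bits_b, threshold=0.25):
    return np.mean(bits_a != bits_b) <= threshold
```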

  10. High-Level Radioactive Waste.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1995-01-01

    Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…

  11. Biometric template revocation

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.

    2004-08-01

    Biometrics are a powerful technology for identifying humans both locally and at a distance. In order to perform identification or verification, biometric systems capture an image of some biometric of a user or subject. The image is then converted mathematically into a representation of the person, called a template. Since every human in the world is different, each human will have different biometric images (different fingerprints, faces, etc.); this is what makes biometrics useful for identification. However, unlike a credit card number or a password, which can be given to a person and later revoked if it is compromised, a biometric is with the person for life. The problem, then, is to develop biometric templates which can be easily revoked and reissued, and which are also unique to the user and can be easily used for identification and verification. In this paper we develop and present a method to generate a set of templates which are fully unique to the individual and also revocable. By using basis-set compression algorithms in an n-dimensional orthogonal space, we can represent a given biometric image in an infinite number of equally valid and unique ways. The verification and biometric matching system would be presented with a given template and a revocation code. The code then indicates where in the sequence of n-dimensional vectors to start the recognition.

  12. An on-line template improvement algorithm

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Zhao, Bo; Yang, Xiukun

    2005-03-01

    In an automatic fingerprint identification system, an incomplete or rigid template may lead to false rejection and false matching. How to improve the quality of the template, which is called template improvement, is therefore important to automatic fingerprint identification systems. In this paper, we propose a template improvement algorithm. Based on the case-based method of machine learning and probability theory, we improve the template by deleting pseudo minutiae, restoring lost genuine minutiae and updating minutia information such as positions and directions. A special fingerprint image database was built for this work. Experimental results on this database indicate that our method is effective and that the quality of the fingerprint template is evidently improved. Accordingly, the performance of fingerprint matching also improves steadily as the template is used over time.
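    A minimal sketch of the general template-improvement idea described above - reinforcing recurring minutiae, adding ones missed at enrollment and dropping weakly supported (pseudo) minutiae - under assumed data structures and thresholds; it is not the authors' case-based algorithm.

      import numpy as np

      def improve_template(template, impressions, tol=8.0, min_support=0.5):
          """
          template: list of dicts {'xy': np.array([x, y]), 'angle': float, 'hits': int}
          impressions: lists of minutiae from later, already-aligned impressions of the same finger.
          """
          for minutiae in impressions:
              for m in minutiae:
                  hit = next((t for t in template
                              if np.linalg.norm(t['xy'] - m['xy']) < tol), None)
                  if hit is None:
                      # Possibly a genuine minutia missed at enrollment: add it provisionally.
                      template.append({'xy': m['xy'].copy(), 'angle': m['angle'], 'hits': 1})
                  else:
                      # Refine position/direction with a running average and reinforce the minutia.
                      n = hit['hits']
                      hit['xy'] = (hit['xy'] * n + m['xy']) / (n + 1)
                      hit['angle'] = (hit['angle'] * n + m['angle']) / (n + 1)
                      hit['hits'] += 1
          # Keep only minutiae supported by a sufficient fraction of impressions (drop pseudo minutiae).
          needed = max(1, int(min_support * len(impressions)))
          return [t for t in template if t['hits'] >= needed]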

  13. Testing sensory evidence against mnemonic templates.

    PubMed

    Myers, Nicholas E; Rohenkohl, Gustavo; Wyart, Valentin; Woolrich, Mark W; Nobre, Anna C; Stokes, Mark G

    2015-01-01

    Most perceptual decisions require comparisons between current input and an internal template. Classic studies propose that templates are encoded in sustained activity of sensory neurons. However, stimulus encoding is itself dynamic, tracing a complex trajectory through activity space. Which part of this trajectory is pre-activated to reflect the template? Here we recorded magneto- and electroencephalography during a visual target-detection task, and used pattern analyses to decode template, stimulus, and decision-variable representation. Our findings ran counter to the dominant model of sustained pre-activation. Instead, template information emerged transiently around stimulus onset and quickly subsided. Cross-generalization between stimulus and template coding, indicating a shared neural representation, occurred only briefly. Our results are compatible with the proposal that template representation relies on a matched filter, transforming input into task-appropriate output. This proposal was consistent with a signed difference response at the perceptual decision stage, which can be explained by a simple neural model.

  14. The CMS high level trigger

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2014-05-01

    The CMS experiment has been designed with a 2-level trigger system: the Level 1 Trigger, implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running on the available computing power, the sustainable output rate, and the selection efficiency. Here we will present the performance of the main triggers used during the 2012 data taking, ranging from simpler single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We will discuss the optimisation of the triggers and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  15. The CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Trocino, Daniele

    2014-06-01

    The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented in custom-designed electronics, and the High-Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running with the available computing power, the sustainable output rate, and the selection efficiency. We present the performance of the main triggers used during the 2012 data taking, ranging from simple single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We discuss the optimisation of the trigger and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  16. Templated biomimetic multifunctional coatings

    NASA Astrophysics Data System (ADS)

    Sun, Chih-Hung; Gonzalez, Adriel; Linn, Nicholas C.; Jiang, Peng; Jiang, Bin

    2008-02-01

    We report a bioinspired templating technique for fabricating multifunctional optical coatings that mimic both the unique antireflective functionality of moth eyes and the superhydrophobicity of cicada wings. Subwavelength-structured fluoropolymer nipple arrays are created by a soft-lithography-like process. The utilization of fluoropolymers simultaneously enhances the antireflective performance and the hydrophobicity of the replicated films. The specular reflectivity matches the optical simulation using a thin-film multilayer model. The dependence of the resulting antireflective properties on the size and crystalline ordering of the replicated nipples has also been investigated by experiment and modeling. These biomimetic materials may find important technological application in self-cleaning antireflection coatings.

  17. Learning templates for artistic portrait lighting analysis.

    PubMed

    Chen, Xiaowu; Jin, Xin; Wu, Hongyu; Zhao, Qinping

    2015-02-01

    Lighting is a key factor in creating impressive artistic portraits. In this paper, we propose to analyze portrait lighting by learning templates of lighting styles. Inspired by the experience of artists, we first define several novel features that describe the local contrasts in various face regions. The most informative features are then selected with a stepwise feature pursuit algorithm to derive the templates of various lighting styles. After that, the matching scores that measure the similarity between a testing portrait and those templates are calculated for lighting style classification. Furthermore, we train a regression model on the subjective scores and the feature responses of a template to predict the lighting quality score of a portrait. Based on the templates, a novel face illumination descriptor is defined to measure the difference between two portrait lightings. Experimental results show that the learned templates describe the lighting styles well, and that the proposed approach can assess the lighting quality of artistic portraits as a human being does.

  18. Stochastic template placement algorithm for gravitational wave data analysis

    SciTech Connect

    Harry, I. W.; Sathyaprakash, B. S.; Allen, B.

    2009-11-15

    This paper presents an algorithm for constructing matched-filter template banks in an arbitrary parameter space. The method places templates at random, then removes those which are 'too close' together. The properties and optimality of stochastic template banks generated in this manner are investigated for some simple models. The effectiveness of these template banks for gravitational wave searches for binary inspiral waveforms is also examined. The properties of a stochastic template bank are then compared to the deterministically placed template banks that are currently used in gravitational wave data analysis.
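    A minimal sketch of the place-at-random-then-prune idea described above; Euclidean distance in a unit cube stands in for the metric ('too close' in match/overlap) used on a real template parameter space, and all numbers are illustrative.

      import numpy as np

      def stochastic_bank(n_proposals, dim, min_dist, rng=None):
          """Propose templates at random and discard any proposal too close to an accepted one."""
          rng = rng or np.random.default_rng(0)
          proposals = rng.uniform(0.0, 1.0, size=(n_proposals, dim))   # unit-cube parameter space
          bank = []
          for p in proposals:
              if all(np.linalg.norm(p - q) >= min_dist for q in bank):
                  bank.append(p)
          return np.array(bank)

      bank = stochastic_bank(n_proposals=2000, dim=2, min_dist=0.08)
      print(len(bank), "templates accepted")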

  19. Testing sensory evidence against mnemonic templates

    PubMed Central

    Myers, Nicholas E; Rohenkohl, Gustavo; Wyart, Valentin; Woolrich, Mark W; Nobre, Anna C; Stokes, Mark G

    2015-01-01

    Most perceptual decisions require comparisons between current input and an internal template. Classic studies propose that templates are encoded in sustained activity of sensory neurons. However, stimulus encoding is itself dynamic, tracing a complex trajectory through activity space. Which part of this trajectory is pre-activated to reflect the template? Here we recorded magneto- and electroencephalography during a visual target-detection task, and used pattern analyses to decode template, stimulus, and decision-variable representation. Our findings ran counter to the dominant model of sustained pre-activation. Instead, template information emerged transiently around stimulus onset and quickly subsided. Cross-generalization between stimulus and template coding, indicating a shared neural representation, occurred only briefly. Our results are compatible with the proposal that template representation relies on a matched filter, transforming input into task-appropriate output. This proposal was consistent with a signed difference response at the perceptual decision stage, which can be explained by a simple neural model. DOI: http://dx.doi.org/10.7554/eLife.09000.001 PMID:26653854

  20. High-level waste processing and disposal

    NASA Astrophysics Data System (ADS)

    Crandall, J. L.; Drause, H.; Sombret, C.; Uematsu, K.

    The national high-level waste disposal plans for France, the Federal Republic of Germany, Japan, and the United States are covered. Three conclusions are reached. The first conclusion is that an excellent technology already exists for high-level waste disposal. With appropriate packaging, spent fuel seems to be an acceptable waste form. Borosilicate glass reprocessing waste forms are well understood, in production in France, and scheduled for production in the next few years in a number of other countries. For final disposal, a number of candidate geological repository sites have been identified and several demonstration sites opened. The second conclusion is that adequate financing and a legal basis for waste disposal are in place in most countries. Costs of high-level waste disposal will probably add about 5 to 10% to the costs of nuclear electric power. The third conclusion is less optimistic.

  1. A Software Architecture for High Level Applications

    SciTech Connect

    Shen,G.

    2009-05-04

    A modular software platform for high-level applications is under development at the National Synchrotron Light Source II project. The platform is based on a client-server architecture, and the components of high-level applications on this platform will be modular and distributed, and therefore reusable. An online model server is indispensable for model-based control. Different accelerator facilities have different requirements for online simulation, so to support various accelerator simulators a set of narrow and general application programming interfaces has been developed based on Tracy-3 and Elegant. This paper describes the system architecture for the modular high-level applications, the design of the narrow and general application programming interface for an online model server, and the prototype of the online model server.
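    A minimal sketch of what a narrow, simulator-agnostic model-server interface could look like; the class and method names below are hypothetical and do not reproduce the NSLS-II platform, Tracy-3 or Elegant APIs.

      from abc import ABC, abstractmethod

      class OnlineModelServer(ABC):
          """Narrow interface that high-level applications call; concrete back ends
          (e.g. wrappers around different simulators) implement it."""

          @abstractmethod
          def load_lattice(self, lattice_file: str) -> None: ...

          @abstractmethod
          def set_magnet(self, name: str, strength: float) -> None: ...

          @abstractmethod
          def twiss(self, element: str) -> dict: ...   # e.g. beta functions at an element

      class DummyModelServer(OnlineModelServer):
          """Stand-in back end used here only to show how a client would call the API."""
          def __init__(self):
              self.magnets = {}
          def load_lattice(self, lattice_file):
              self.lattice = lattice_file
          def set_magnet(self, name, strength):
              self.magnets[name] = strength
          def twiss(self, element):
              return {'element': element, 'beta_x': 10.0, 'beta_y': 5.0}

      server = DummyModelServer()
      server.load_lattice("ring.lat")
      server.set_magnet("QF1", 1.2)
      print(server.twiss("BPM1"))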

  2. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  3. PAIRWISE BLENDING OF HIGH LEVEL WASTE (HLW)

    SciTech Connect

    CERTA, P.J.

    2006-02-22

    The primary objective of this study is to demonstrate a mission scenario that uses pairwise and incidental blending of high level waste (HLW) to reduce the total mass of HLW glass. Secondary objectives include understanding how recent refinements to the tank waste inventory and solubility assumptions affect the mass of HLW glass and how logistical constraints may affect the efficacy of HLW blending.

  4. High-level radioactive wastes. Supplement 1

    SciTech Connect

    McLaren, L.H.

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations.

  5. Do we understand high-level vision?

    PubMed

    Cox, David Daniel

    2014-04-01

    'High-level' vision lacks a single, agreed-upon definition, but it might usefully be defined as those stages of visual processing that transition from analyzing local image structure to analyzing the structure of the external world that produced those images. Much work in the last several decades has focused on object recognition as a framing problem for the study of high-level visual cortex, and much progress has been made in this direction. This approach presumes that the operational goal of the visual system is to read out the identity of an object (or objects) in a scene, in spite of variation in position, size, lighting and the presence of other nearby objects. However, while object recognition is an intuitively appealing operational framing of high-level vision, it is by no means the only task that visual cortex might perform, and the study of object recognition is beset by challenges in building stimulus sets that adequately sample the infinite space of possible stimuli. Here I review the successes and limitations of this work, and ask whether we should reframe our approaches to understanding high-level vision.

  6. FUZZY SUPERNOVA TEMPLATES. I. CLASSIFICATION

    SciTech Connect

    Rodney, Steven A.; Tonry, John L. E-mail: jt@ifa.hawaii.ed

    2009-12-20

    Modern supernova (SN) surveys are now uncovering stellar explosions at rates that far surpass what the world's spectroscopic resources can handle. In order to make full use of these SN data sets, it is necessary to use analysis methods that depend only on the survey photometry. This paper presents two methods for utilizing a set of SN light-curve templates to classify SN objects. In the first case, we present an updated version of the Bayesian Adaptive Template Matching program (BATM). To address some shortcomings of that strictly Bayesian approach, we introduce a method for Supernova Ontology with Fuzzy Templates (SOFT), which utilizes fuzzy set theory for the definition and combination of SN light-curve models. For well-sampled light curves with a modest signal-to-noise ratio (S/N >10), the SOFT method can correctly separate thermonuclear (Type Ia) SNe from core collapse SNe with >=98% accuracy. In addition, the SOFT method has the potential to classify SNe into sub-types, providing photometric identification of very rare or peculiar explosions. The accuracy and precision of the SOFT method are verified using Monte Carlo simulations as well as real SN light curves from the Sloan Digital Sky Survey and the SuperNova Legacy Survey. In a subsequent paper, the SOFT method is extended to address the problem of parameter estimation, providing estimates of redshift, distance, and host galaxy extinction without any spectroscopy.
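    The abstract does not spell out SOFT's formulas, so the following sketch only illustrates the general fuzzy-template idea: per-template goodness-of-fit values are turned into graded memberships and combined per class. The exponential membership and max aggregation rules are assumptions for illustration, not SOFT's definitions.

      import numpy as np

      def classify_fuzzy(chi2_by_template, labels):
          """
          chi2_by_template: chi-square fit of one light curve against each template.
          labels: class of each template (e.g. 'Ia' or 'CC').
          Returns normalized class memberships (illustrative rules only).
          """
          chi2 = np.asarray(chi2_by_template, dtype=float)
          membership = np.exp(-0.5 * (chi2 - chi2.min()))   # graded membership, not a hard best-fit cut
          scores = {}
          for lab in set(labels):
              idx = [i for i, l in enumerate(labels) if l == lab]
              scores[lab] = membership[idx].max()           # each class takes its best fuzzy member
          total = sum(scores.values())
          return {lab: s / total for lab, s in scores.items()}

      print(classify_fuzzy([5.2, 3.1, 12.7], ['Ia', 'Ia', 'CC']))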

  7. Programmable imprint lithography template

    DOEpatents

    Cardinale, Gregory F.; Talin, Albert A.

    2006-10-31

    A template for imprint lithography (IL) that reduces significantly template production costs by allowing the same template to be re-used for several technology generations. The template is composed of an array of spaced-apart moveable and individually addressable rods or plungers. Thus, the template can be configured to provide a desired pattern by programming the array of plungers such that certain of the plungers are in an "up" or actuated configuration. This arrangement of "up" and "down" plungers forms a pattern composed of protruding and recessed features which can then be impressed onto a polymer film coated substrate by applying a pressure to the template impressing the programmed configuration into the polymer film. The pattern impressed into the polymer film will be reproduced on the substrate by subsequent processing.

  8. High-Level Waste Melter Review

    SciTech Connect

    Ahearne, J.; Gentilucci, J.; Pye, L. D.; Weber, T.; Woolley, F.; Machara, N. P.; Gerdes, K.; Cooley, C.

    2002-02-26

    The U.S. Department of Energy (DOE) is faced with a massive cleanup task in resolving the legacy of environmental problems from years of manufacturing nuclear weapons. One of the major activities within this task is the treatment and disposal of the extremely large amount of high-level radioactive waste (HLW) stored at the Hanford Site in Richland, Washington. The current plan for accomplishing this task is to vitrify (glassify) this waste for disposal in a geologic repository. This paper describes the results of the DOE-chartered independent review of alternatives for solidification of Hanford HLW that could achieve major cost reductions with reasonable long-term risks, including recommendations on a path forward for advanced melter and waste form material research and development. The potential for improved cost performance was considered to depend largely on increased waste loading (fewer high-level waste canisters for disposal), higher throughput, or decreased vitrification facility size.

  9. High-Level Waste Melter Study Report

    SciTech Connect

    Perez, Joseph M.; Bickford, Dennis F.; Day, Delbert E.; Kim, Dong-Sang; Lambert, Steven L.; Marra, Sharon L.; Peeler, David K.; Strachan, Denis M.; Triplett, Mark B.; Vienna, John D.; Wittman, Richard S.

    2001-07-13

    At the Hanford Site in Richland, Washington, the path to site cleanup involves vitrification of the majority of the wastes that currently reside in large underground tanks. A Joule-heated glass melter is the equipment of choice for vitrifying the high-level fraction of these wastes. Even though this technology has general national and international acceptance, opportunities may exist to improve or change the technology to reduce the enormous cost of accomplishing the mission of site cleanup. Consequently, the U.S. Department of Energy requested the staff of the Tanks Focus Area to review immobilization technologies, waste forms, and modifications to requirements for solidification of the high-level waste fraction at Hanford to determine what aspects could affect cost reductions with reasonable long-term risk. The results of this study are summarized in this report.

  10. CLIPS template system for program understanding

    NASA Technical Reports Server (NTRS)

    Finkbine, Ronald B.

    1994-01-01

    Program understanding is a subfield of software reengineering and attempts to recognize the run-time behavior of source code. To this point, success in this area has been limited to very small code segments. An expert system, HLAR (High-Level Algorithm Recognizer), has been written in CLIPS and recognizes three sorting algorithms, selection sort, quicksort, and heapsort. This paper describes the HLAR system in general and, in depth, the CLIPS templates used for program representation and understanding.

  11. Structural templates for comparative protein docking

    PubMed Central

    Anishchenko, Ivan; Kundrotas, Petras J.; Tuzikov, Alexander V.; Vakser, Ilya A.

    2014-01-01

    Structural characterization of protein-protein interactions is important for understanding life processes. Because of the inherent limitations of experimental techniques, such characterization requires computational approaches. Along with the traditional protein-protein docking (free search for a match between two proteins), comparative (template-based) modeling of protein-protein complexes has been gaining popularity. Its development puts an emphasis on full and partial structural similarity between the target protein monomers and the protein-protein complexes previously determined by experimental techniques (templates). The template-based docking relies on the quality and diversity of the template set. We present a carefully curated, non-redundant library of templates containing 4,950 full structures of binary complexes and 5,936 protein-protein interfaces extracted from the full structures at 12Å distance cut-off. Redundancy in the libraries was removed by clustering the PDB structures based on structural similarity. The value of the clustering threshold was determined from the analysis of the clusters and the docking performance on a benchmark set. High structural quality of the interfaces in the template and validation sets was achieved by automated procedures and manual curation. The library is included in the Dockground resource for molecular recognition studies at http://dockground.bioinformatics.ku.edu. PMID:25488330

  12. Structural templates for comparative protein docking.

    PubMed

    Anishchenko, Ivan; Kundrotas, Petras J; Tuzikov, Alexander V; Vakser, Ilya A

    2015-09-01

    Structural characterization of protein-protein interactions is important for understanding life processes. Because of the inherent limitations of experimental techniques, such characterization requires computational approaches. Along with the traditional protein-protein docking (free search for a match between two proteins), comparative (template-based) modeling of protein-protein complexes has been gaining popularity. Its development puts an emphasis on full and partial structural similarity between the target protein monomers and the protein-protein complexes previously determined by experimental techniques (templates). The template-based docking relies on the quality and diversity of the template set. We present a carefully curated, nonredundant library of templates containing 4950 full structures of binary complexes and 5936 protein-protein interfaces extracted from the full structures at 12 Å distance cut-off. Redundancy in the libraries was removed by clustering the PDB structures based on structural similarity. The value of the clustering threshold was determined from the analysis of the clusters and the docking performance on a benchmark set. High structural quality of the interfaces in the template and validation sets was achieved by automated procedures and manual curation. The library is included in the Dockground resource for molecular recognition studies at http://dockground.bioinformatics.ku.edu.
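    A minimal sketch of the redundancy-removal step described in these two records: greedy selection of one representative per group of structures whose pairwise similarity exceeds a threshold. The similarity function is a placeholder for a structural-alignment score; the actual library clusters PDB structures with considerably more care.

      def remove_redundancy(entries, similarity, threshold):
          """
          entries: list of structure identifiers.
          similarity(a, b): similarity score in [0, 1] (placeholder for a structural score).
          Keeps one representative per cluster of mutually similar structures.
          """
          representatives = []
          for entry in entries:
              if all(similarity(entry, rep) < threshold for rep in representatives):
                  representatives.append(entry)
          return representatives

      # Toy usage with a fake pairwise similarity table.
      fake_scores = {('A', 'B'): 0.9, ('A', 'C'): 0.2, ('B', 'C'): 0.3}
      sim = lambda a, b: fake_scores.get((a, b), fake_scores.get((b, a), 1.0))
      print(remove_redundancy(['A', 'B', 'C'], sim, threshold=0.5))   # -> ['A', 'C']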

  13. EAP high-level product architecture

    NASA Astrophysics Data System (ADS)

    Gudlaugsson, T. V.; Mortensen, N. H.; Sarban, R.

    2013-04-01

    EAP technology has the potential to be used in a wide range of applications. This poses the challenge to EAP component manufacturers of developing components for a wide variety of products. Danfoss Polypower A/S is developing an EAP technology platform which can form the basis for a variety of EAP technology products while keeping complexity under control. A high-level product architecture has been developed for the mechanical part of EAP transducers as the foundation for platform development. A generic description of an EAP transducer forms the core of the high-level product architecture. This description breaks the EAP transducer down into organs that perform the functions that may be present in an EAP transducer. A physical instance of an EAP transducer contains the combination of organs needed to fulfill the tasks of actuation, sensing, and generation. Alternative principles for each organ allow the function of an EAP transducer to be changed by basing it on a different combination of organ alternatives. A model providing an overview of the high-level product architecture has been developed to support daily development and cooperation across development teams. The platform approach has resulted in the first version of an EAP technology platform, on which multiple EAP products can be based. The contents of the platform are the result of multi-disciplinary development work at Danfoss PolyPower, as well as collaboration with potential customers and research institutions. Initial results from applying the platform to demonstrator design for potential applications are promising. The scope of the article does not include technical details.

  14. The effects of high level infrasound

    SciTech Connect

    Johnson, D.L.

    1980-02-01

    This paper will attempt to survey the current knowledge on the effects of relatively high levels of infrasound on humans. While this conference is concerned mainly with hearing, some discussion of other physiological effects is appropriate. Such discussion also serves to highlight a basic question: 'Is hearing the main concern of infrasound and low frequency exposure, or is there a more sensitive mechanism?' It would be comforting to know that the focal point of this conference is indeed the most important concern. Therefore, besides hearing loss and the auditory threshold of infrasonic and low frequency exposure, four other effects are considered: performance, respiration, annoyance, and vibration.

  15. Service Oriented Architecture for High Level Applications

    SciTech Connect

    Chu, Chungming; Chevtsov, Sergei; Wu, Juhao; Shen, Guobao; /Brookhaven

    2012-06-28

    Standalone high-level applications often suffer from poor performance and reliability due to lengthy initialization, heavy computation and rapid graphical updates. Service-oriented architecture (SOA) aims to separate the initialization and computation from the applications and to distribute such work to various service providers. Heavy computation such as beam tracking will be done periodically on a dedicated server, and the data will be available to client applications at all times. An industrial-standard service architecture can help to improve the performance, reliability and maintainability of the service. Robustness will also be improved by reducing the complexity of individual client applications.

  16. Templates, Numbers & Watercolors.

    ERIC Educational Resources Information Center

    Clemesha, David J.

    1990-01-01

    Describes how a second-grade class used large templates to draw and paint five-digit numbers. The lesson integrated artistic knowledge and vocabulary with their mathematics lesson in place value. Students learned how draftspeople use templates, and they studied number paintings by Charles Demuth and Jasper Johns. (KM)

  17. Technetium Chemistry in High-Level Waste

    SciTech Connect

    Hess, Nancy J.

    2006-06-01

    Tc contamination is found within the DOE complex at those sites whose mission involved extraction of plutonium from irradiated uranium fuel or isotopic enrichment of uranium. At the Hanford Site, chemical separations and extraction processes generated large amounts of high-level and transuranic wastes that are currently stored in underground High Level Waste (HLW) tanks. However, the chemistry of the HLW in any given tank is greatly complicated by repeated efforts to reduce volume and recover isotopes. These processes ultimately resulted in mixing of waste streams from different processes. As a result, the chemistry and the fate of Tc in HLW tanks are not well understood. This lack of understanding has been made evident in the failed efforts to leach Tc from sludge and to remove Tc from supernatants prior to immobilization. Although recent interest in Tc chemistry has shifted from pretreatment chemistry to waste residuals, both needs are served by a fundamental understanding of Tc chemistry.

  18. High level intelligent control of telerobotics systems

    NASA Technical Reports Server (NTRS)

    Mckee, James

    1988-01-01

    A high-level robot command language is proposed for the autonomous mode of an advanced telerobotics system, together with a predictive display mechanism for the teleoperational mode. It is believed that any such system will involve some mixture of these two modes, since, although artificial intelligence can facilitate significant autonomy, a system that can resort to teleoperation will always have the advantage. The high-level command language will allow humans to give the robot instructions in a very natural manner. The robot will then analyze these instructions to infer meaning so that it can translate the task into lower-level executable primitives. If, however, the robot is unable to perform the task autonomously, it will switch to the teleoperational mode. The time delay between control movement and actual robot movement has always been a problem in teleoperations. The remote operator may not actually see (via a monitor) the results of their actions for several seconds. A computer-generated predictive display system is proposed whereby the operator can see a real-time model of the robot's environment and the delayed video picture on the monitor at the same time.

  19. High-level waste: View from Nevada

    SciTech Connect

    Miller, B.

    1994-12-31

    "Instead of acknowledging the serious shortcomings of the current waste program, the Department of Energy (DOE) has sought to tighten the screws on Nevada," says Nevada Governor Bob Miller. Nevada's opposition to the federal government's proposed high-level radioactive waste repository at Yucca Mountain has grown out of fundamental flaws within the siting process, says Miller. "This process has left the nation with one technically flawed site as its sole prospect for nuclear waste disposal," he says. Miller claims that DOE has acknowledged that the site is inadequate. Nevertheless, he says, the agency has insisted on pressing ahead with its plans, attempting to "adjust the standards to fit the site." Miller concludes that dry and/or above-ground waste storage at reactor sites represents a more sensible - and less costly - disposal method for high-level wastes, at least in the short term.

  20. Commissioning of the CMS High Level Trigger

    SciTech Connect

    Agostino, Lorenzo; et al.

    2009-08-01

    The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates up to 40 MHz. The unique CMS trigger architecture only employs two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic-ray data-taking campaigns are discussed in detail. We conclude with a description of HLT operations with the first circulating LHC beams, before the incident that occurred on 19 September 2008.

  1. High-level connectionist models. Semiannual report

    SciTech Connect

    Pollack, J.B.

    1989-08-01

    The major achievement of this semiannual period was the significant revision and extension of the Recursive Auto-Associative Memory (RAAM) work for publication in the journal Artificial Intelligence. Included as an appendix to this report, the article contains several new elements: (1) Background - the work was more clearly set into the area of recursive distributed representations, machine learning, and the adequacy of the connectionist approach for high-level cognitive modeling; (2) New Experiment - RAAM was applied to finding compact representations for sequences of letters; (3) Analysis - the developed representations were analyzed as features which range from categorical to distinctive. Categorical features distinguish between conceptual categories, while distinctive features vary within categories and discriminate or label the members. The representations were also analyzed geometrically; and (4) Applications - feasibility studies were performed and described on inference by association, and on using RAAM-generated patterns along with cascaded networks for natural language parsing. Both of these remain long-term goals of the project.

  2. The High Level Data Reduction Library

    NASA Astrophysics Data System (ADS)

    Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.

    2015-09-01

    The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments, and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience showed that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions that are to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C using the Common Pipeline Library (CPL) functionality. HDRL adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.

  3. Physical synthesis of quantum circuits using templates

    NASA Astrophysics Data System (ADS)

    Mirkhani, Zahra; Mohammadzadeh, Naser

    2016-06-01

    Similar to traditional CMOS circuits, the quantum circuit design flow is divided into two main processes: logic synthesis and physical design. To address the limitations imposed on the optimization of quantum circuit metrics by the lack of information sharing between the logic synthesis and physical design processes, the concept of "physical synthesis" was introduced for the quantum circuit flow, and a few techniques were proposed for it. Following that concept, in this paper a new approach for physical synthesis inspired by the template matching idea in quantum logic synthesis is proposed to improve the latency of quantum circuits. Experiments show that by using template matching as a physical synthesis approach, the latency of quantum circuits can be improved by more than 23.55% on average.

  4. Physical synthesis of quantum circuits using templates

    NASA Astrophysics Data System (ADS)

    Mirkhani, Zahra; Mohammadzadeh, Naser

    2016-10-01

    Similar to traditional CMOS circuits, the quantum circuit design flow is divided into two main processes: logic synthesis and physical design. To address the limitations imposed on the optimization of quantum circuit metrics by the lack of information sharing between the logic synthesis and physical design processes, the concept of "physical synthesis" was introduced for the quantum circuit flow, and a few techniques were proposed for it. Following that concept, in this paper a new approach for physical synthesis inspired by the template matching idea in quantum logic synthesis is proposed to improve the latency of quantum circuits. Experiments show that by using template matching as a physical synthesis approach, the latency of quantum circuits can be improved by more than 23.55% on average.
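    As a much-simplified illustration of the template idea these two records build on (not their physical-synthesis procedure), the sketch below scans a gate list for sub-sequences matching identity templates and removes them; the gate encoding and the templates are toy assumptions.

      # Toy gate-sequence rewriting: each template is a sub-sequence known to equal
      # the identity, so any occurrence can be deleted without changing the circuit.
      IDENTITY_TEMPLATES = [
          [('H', 0), ('H', 0)],                      # H·H = I on the same qubit
          [('CNOT', (0, 1)), ('CNOT', (0, 1))],      # CNOT·CNOT = I on the same qubit pair
      ]

      def apply_templates(gates, templates=IDENTITY_TEMPLATES):
          """Repeatedly remove any contiguous sub-sequence that matches an identity template."""
          changed = True
          while changed:
              changed = False
              for tpl in templates:
                  n = len(tpl)
                  for i in range(len(gates) - n + 1):
                      if gates[i:i + n] == tpl:
                          gates = gates[:i] + gates[i + n:]
                          changed = True
                          break
                  if changed:
                      break
          return gates

      circuit = [('H', 0), ('H', 0), ('CNOT', (0, 1)), ('X', 1), ('CNOT', (0, 1))]
      print(apply_templates(circuit))   # -> [('CNOT', (0, 1)), ('X', 1), ('CNOT', (0, 1))]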

  5. Engineering neural systems for high-level problem solving.

    PubMed

    Sylvester, Jared; Reggia, James

    2016-07-01

    There is a long-standing, sometimes contentious debate in AI concerning the relative merits of a symbolic, top-down approach vs. a neural, bottom-up approach to engineering intelligent machine behaviors. While neurocomputational methods excel at lower-level cognitive tasks (incremental learning for pattern classification, low-level sensorimotor control, fault tolerance and processing of noisy data, etc.), they are largely non-competitive with top-down symbolic methods for tasks involving high-level cognitive problem solving (goal-directed reasoning, metacognition, planning, etc.). Here we take a step towards addressing this limitation by developing a purely neural framework named galis. Our goal in this work is to integrate top-down (non-symbolic) control of a neural network system with more traditional bottom-up neural computations. galis is based on attractor networks that can be "programmed" with temporal sequences of hand-crafted instructions that control problem solving by gating the activity retention of, communication between, and learning done by other neural networks. We demonstrate the effectiveness of this approach by showing that it can be applied successfully to solve sequential card matching problems, using both human performance and a top-down symbolic algorithm as experimental controls. Solving this kind of problem makes use of top-down attention control and the binding together of visual features in ways that are easy for symbolic AI systems but not for neural networks to achieve. Our model can not only be instructed on how to solve card matching problems successfully, but its performance also qualitatively (and sometimes quantitatively) matches the performance of both human subjects that we had perform the same task and the top-down symbolic algorithm that we used as an experimental control. We conclude that the core principles underlying the galis framework provide a promising approach to engineering purely neurocomputational systems for problem solving.

  6. Engineering neural systems for high-level problem solving.

    PubMed

    Sylvester, Jared; Reggia, James

    2016-07-01

    There is a long-standing, sometimes contentious debate in AI concerning the relative merits of a symbolic, top-down approach vs. a neural, bottom-up approach to engineering intelligent machine behaviors. While neurocomputational methods excel at lower-level cognitive tasks (incremental learning for pattern classification, low-level sensorimotor control, fault tolerance and processing of noisy data, etc.), they are largely non-competitive with top-down symbolic methods for tasks involving high-level cognitive problem solving (goal-directed reasoning, metacognition, planning, etc.). Here we take a step towards addressing this limitation by developing a purely neural framework named galis. Our goal in this work is to integrate top-down (non-symbolic) control of a neural network system with more traditional bottom-up neural computations. galis is based on attractor networks that can be "programmed" with temporal sequences of hand-crafted instructions that control problem solving by gating the activity retention of, communication between, and learning done by other neural networks. We demonstrate the effectiveness of this approach by showing that it can be applied successfully to solve sequential card matching problems, using both human performance and a top-down symbolic algorithm as experimental controls. Solving this kind of problem makes use of top-down attention control and the binding together of visual features in ways that are easy for symbolic AI systems but not for neural networks to achieve. Our model can not only be instructed on how to solve card matching problems successfully, but its performance also qualitatively (and sometimes quantitatively) matches the performance of both human subjects that we had perform the same task and the top-down symbolic algorithm that we used as an experimental control. We conclude that the core principles underlying the galis framework provide a promising approach to engineering purely neurocomputational systems for problem solving.

  7. Plug and drill template

    NASA Technical Reports Server (NTRS)

    Orella, S.

    1979-01-01

    Device installs plugs and then drills them after sandwich face sheets are in place. A template guides the drill bit into the center of each concealed plug, saving considerable time and enabling weight reduction through the use of smaller plugs.

  8. STAR Grantee 101 template

    EPA Science Inventory

    The presentation covers the standard Terms and Conditions, from reporting, to Human Subject research, to publication disclaimers, and offers some resources to find helpful information. Some slides are intended as a template, where project officers can enter specific information (...

  9. Decontamination of high-level waste canisters

    SciTech Connect

    Nesbitt, J.F.; Slate, S.C.; Fetrow, L.K.

    1980-12-01

    This report presents evaluations of several methods for the in-process decontamination of metallic canisters containing any one of a number of solidified high-level waste (HLW) forms. Steam-water spraying, steam spraying, abrasive blasting, electropolishing, liquid honing, vibratory finishing and soaking have been tested or evaluated as potential techniques to decontaminate the outer surfaces of HLW canisters. Either these techniques have been tested or available literature has been examined to assess their applicability to the decontamination of HLW canisters. Electropolishing has been found to be the most thorough method to remove radionuclides and other foreign material that may be deposited on or in the outer surface of a canister during any of the HLW processes. Steam or steam-water spraying techniques may be adequate for some applications but fail to remove all contaminated forms that could be present in some of the HLW processes. Liquid honing and abrasive blasting remove contamination and foreign material very quickly and effectively from small areas and components, although these blasting techniques tend to disperse the material removed from the cleaned surfaces. Vibratory finishing is very capable of removing the bulk of contamination and foreign matter from a variety of materials. However, special vibratory finishing equipment would have to be designed and adapted for a remote process. Soaking techniques take long periods of time and may not remove all of the smearable contamination. If soaking involves pickling baths that use corrosive agents, these agents may cause erosion of grain boundaries that results in rough surfaces.

  10. Performance of the CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Perrotta, Andrea

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved tracking and vertexing algorithms, discussing their impact on the b-tagging performance as well as on the jet and missing energy reconstruction.

  11. HIGH LEVEL RF FOR THE SNS RING.

    SciTech Connect

    ZALTSMAN,A.; BLASKIEWICZ,M.; BRENNAN,J.; BRODOWSKI,J.; METH,M.; SPITZ,R.; SEVERINO,F.

    2002-06-03

    A high level RF system (HLRF) consisting of power amplifiers (PA's) and ferrite loaded cavities is being designed and built by Brookhaven National Laboratory (BNL) for the Spallation Neutron Source (SNS) project. It is a fixed frequency, two harmonic system whose main function is to maintain a gap for the kicker rise time. Three cavities running at the fundamental harmonic (h=1) will provide 40 kV and one cavity at the second harmonic (h=2) will provide 20 kV. Each cavity has two gaps with a design voltage of 10 kV per gap and will be driven by a power amplifier (PA) directly adjacent to it. The PA uses a 600 kW tetrode to provide the necessary drive current. The anode of the tetrode is magnetically coupled to the downstream cell of the cavity. Drive to the PA will be provided by a wide band, solid state amplifier located remotely. A dynamic tuning scheme will be implemented to help compensate for the effect of beam loading.

  12. CMS High Level Trigger Timing Measurements

    NASA Astrophysics Data System (ADS)

    Richardson, Clint

    2015-12-01

    The two-level trigger system employed by CMS consists of the Level 1 (L1) Trigger, which is implemented using custom-built electronics, and the High Level Trigger (HLT), a farm of commercial CPUs running a streamlined version of the offline CMS reconstruction software. The operational L1 output rate of 100 kHz, together with the number of CPUs in the HLT farm, imposes a fundamental constraint on the amount of time available for the HLT to process events. Exceeding this limit impacts the experiment's ability to collect data efficiently. Hence, there is a critical need to characterize the performance of the HLT farm as well as the algorithms run prior to start up in order to ensure optimal data taking. Additional complications arise from the fact that the HLT farm consists of multiple generations of hardware and there can be subtleties in machine performance. We present our methods of measuring the timing performance of the CMS HLT, including the challenges of making such measurements. Results for the performance of various Intel Xeon architectures from 2009-2014 and different data taking scenarios are also presented.

  13. Virus templated metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Aljabali, Alaa A. A.; Barclay, J. Elaine; Lomonossoff, George P.; Evans, David J.

    2010-12-01

    Plant viruses are considered as nanobuilding blocks that can be used as synthons or templates for novel materials. Cowpea mosaic virus (CPMV) particles have been shown to template the fabrication of metallic nanoparticles by an electroless deposition metallization process. Palladium ions were electrostatically bound to the virus capsid and, when reduced, acted as nucleation sites for the subsequent metal deposition from solution. The method, although simple, produced highly monodisperse metallic nanoparticles with a diameter of ca. <=35 nm. CPMV-templated particles were prepared with cobalt, nickel, iron, platinum, cobalt-platinum and nickel-iron. Electronic supplementary information (ESI) available: additional experimental detail, agarose gel electrophoresis results, energy dispersive X-ray spectra, ζ-potential measurements, dynamic light scattering data, nanoparticle tracking analysis and an atomic force microscopy image of Ni-CPMV. See DOI: 10.1039/c0nr00525h

  14. The sidebar template and extraction of invariant feature of calligraphy and painting seal

    NASA Astrophysics Data System (ADS)

    Hu, Zheng-kun; Bao, Hong; Lou, Hai-tao

    2009-07-01

    The paper proposes a novel seal extraction method based on template matching, using the characteristics of the external contour of the seal image in Chinese Painting and Calligraphy. By analyzing the characteristics of the seal edge, we obtain prior knowledge of the seal edge and set up an outline template of the seals; we then design a template matching method that computes the distance difference between the outline template and the seal image edge, which can extract the seal image from Chinese Painting and Calligraphy effectively. Experimental results show that this method achieves a higher extraction rate than traditional image extraction methods.

  15. DEFENSE HIGH LEVEL WASTE GLASS DEGRADATION

    SciTech Connect

    W. Ebert

    2001-09-20

    The purpose of this Analysis/Model Report (AMR) is to document the analyses that were done to develop models for radionuclide release from high-level waste (HLW) glass dissolution that can be integrated into performance assessment (PA) calculations conducted to support site recommendation and license application for the Yucca Mountain site. This report was developed in accordance with the ''Technical Work Plan for Waste Form Degradation Process Model Report for SR'' (CRWMS M&O 2000a). It specifically addresses the item, ''Defense High Level Waste Glass Degradation'', of the product technical work plan. The AP-3.15Q Attachment 1 screening criteria determines the importance for its intended use of the HLW glass model derived herein to be in the category ''Other Factors for the Postclosure Safety Case-Waste Form Performance'', and thus indicates that this factor does not contribute significantly to the postclosure safety strategy. Because the release of radionuclides from the glass will depend on the prior dissolution of the glass, the dissolution rate of the glass imposes an upper bound on the radionuclide release rate. The approach taken to provide a bound for the radionuclide release is to develop models that can be used to calculate the dissolution rate of waste glass when contacted by water in the disposal site. The release rate of a particular radionuclide can then be calculated by multiplying the glass dissolution rate by the mass fraction of that radionuclide in the glass and by the surface area of glass contacted by water. The scope includes consideration of the three modes by which water may contact waste glass in the disposal system: contact by humid air, dripping water, and immersion. The models for glass dissolution under these contact modes are all based on the rate expression for aqueous dissolution of borosilicate glasses. The mechanism and rate expression for aqueous dissolution are adequately understood; the analyses in this AMR were conducted to
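    Written out, the bounding release-rate relation stated in the abstract above is (notation assumed here for clarity):

      R_i = \dot{m}_{glass} \, f_i \, A

    where R_i is the release rate of radionuclide i, \dot{m}_{glass} is the glass dissolution rate per unit surface area, f_i is the mass fraction of radionuclide i in the glass, and A is the surface area of glass contacted by water.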

  16. Iris-based authentication system with template protection and renewability

    NASA Astrophysics Data System (ADS)

    Ercole, Chiara; Campisi, Patrizio; Neri, Alessandro

    2007-10-01

    Biometrics is the leading emerging technology for automatic people authentication; nevertheless, severe concerns have been raised about the security of such systems and users' privacy. In case of malicious attacks toward one or more components of the authentication system, stolen biometric features cannot be replaced. This paper focuses on securing the enrollment database and the communication channel between the database and the matcher. In particular, a method is developed to protect the stored biometric templates by adapting the fuzzy commitment scheme to iris biometrics and exploiting error correction codes tailored to template discriminability. The aforementioned method allows template renewability applied to iris-based authentication and guarantees high security by performing the match in the encrypted domain.
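    A minimal sketch of the generic fuzzy commitment scheme the paper adapts, using a simple repetition code in place of the error correction codes the authors tailor to iris discriminability; array sizes, the hash choice and the error pattern are assumptions.

      import hashlib
      import numpy as np

      def repeat_encode(key_bits, rep=3):
          return np.repeat(key_bits, rep)                  # toy ECC: repetition code

      def repeat_decode(codeword_bits, rep=3):
          return (codeword_bits.reshape(-1, rep).sum(axis=1) > rep // 2).astype(np.uint8)

      def enroll(iris_bits, key_bits, rep=3):
          """Store only the offset (codeword XOR iris code) and a hash of the key, never the iris code."""
          offset = repeat_encode(key_bits, rep) ^ iris_bits
          return offset, hashlib.sha256(key_bits.tobytes()).hexdigest()

      def verify(iris_bits, offset, key_hash, rep=3):
          """A fresh, slightly noisy iris code unlocks the key if errors stay within the ECC capacity."""
          recovered = repeat_decode(offset ^ iris_bits, rep)
          return hashlib.sha256(recovered.tobytes()).hexdigest() == key_hash

      rng = np.random.default_rng(0)
      key  = rng.integers(0, 2, 64, dtype=np.uint8)
      iris = rng.integers(0, 2, 192, dtype=np.uint8)       # enrollment iris code (64 * 3 bits)
      offset, key_hash = enroll(iris, key)

      noisy = iris.copy()
      noisy[rng.choice(64, 5, replace=False) * 3] ^= 1     # one bit error in 5 different code groups
      print(verify(noisy, offset, key_hash))               # True: errors are within the code's capacity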

  17. Fast image matching algorithm based on projection characteristics

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun

    2011-06-01

    Based on an analysis of the traditional template matching algorithm, this paper identifies the key factors restricting the matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, the algorithm converts the two-dimensional information of the image into one dimension and then matches and identifies through one-dimensional correlation; because the projections are normalized, correct matching is still achieved when the image brightness or signal amplitude increases proportionally. Experimental results show that the proposed projection-based image registration method greatly improves the matching speed while preserving the matching accuracy.
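    A minimal sketch of the projection idea summarised above: row and column sums reduce the template and each candidate window to 1-D profiles, which are normalised before correlation so that a proportional brightness change does not affect the match. How the row and column scores are combined, and all sizes, are assumptions.

      import numpy as np

      def projection_profile(img):
          """Collapse a 2-D patch into normalised 1-D row and column projections."""
          rows, cols = img.sum(axis=1).astype(float), img.sum(axis=0).astype(float)
          rows = (rows - rows.mean()) / (rows.std() + 1e-9)   # normalisation removes global
          cols = (cols - cols.mean()) / (cols.std() + 1e-9)   # gain/brightness changes
          return rows, cols

      def match_by_projection(image, template):
          """Slide the template and score candidate windows by 1-D correlation only."""
          th, tw = template.shape
          t_rows, t_cols = projection_profile(template)
          best, best_pos = -np.inf, None
          for y in range(image.shape[0] - th + 1):
              for x in range(image.shape[1] - tw + 1):
                  w_rows, w_cols = projection_profile(image[y:y + th, x:x + tw])
                  score = np.dot(w_rows, t_rows) + np.dot(w_cols, t_cols)
                  if score > best:
                      best, best_pos = score, (y, x)
          return best_pos, best

      img = np.random.rand(64, 64)
      tpl = img[20:36, 30:46] * 1.8          # brighter copy of a sub-window
      print(match_by_projection(img, tpl))   # expected to locate (20, 30) despite the brightness change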

  18. Multinomial pattern matching revisited

    NASA Astrophysics Data System (ADS)

    Horvath, Matthew S.; Rigling, Brian D.

    2015-05-01

    Multinomial pattern matching (MPM) is an automatic target recognition algorithm developed at Sandia National Laboratories specifically for radar data. The algorithm belongs to a family of algorithms that first quantize pixel values into Nq bins based on pixel amplitude before training and classification. This quantization step reduces the sensitivity of algorithm performance to absolute intensity variation in the data, typical of radar data where signatures exhibit high variation for even small changes in aspect angle. Our previous work focused on performance analysis of peaky template matching, a special case of MPM in which binary quantization is used (Nq = 2). Unfortunately, references on these algorithms are generally difficult to locate, and here we revisit the MPM algorithm and illustrate the underlying statistical model and decision rules for two algorithm interpretations: the 1-of-K vector form and the scalar form. MPM can also be used as a detector, and specific attention is given to algorithm tuning, where "peak pixels" are chosen based on their underlying empirical probabilities according to a reward minimization strategy aimed at reducing false alarms in the detection scenario and false positives in a classification capacity. The algorithms are demonstrated using Monte Carlo simulations on the AFRL civilian vehicle dataset for a variety of choices of Nq.
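    A bare-bones sketch of the quantise-then-classify idea behind MPM: amplitudes are mapped to Nq bins, per-pixel bin probabilities are learned for each class, and a chip is assigned to the class with the highest multinomial log-likelihood. The Laplace smoothing, the quantile-based bin edges and the toy data are assumptions; MPM's actual training, peak-pixel selection and decision rules are richer.

      import numpy as np

      def train(chips_by_class, n_q=4, alpha=1.0):
          """Learn global amplitude bin edges plus per-class, per-pixel bin probabilities."""
          all_chips = np.concatenate([np.asarray(c) for c in chips_by_class.values()])
          edges = np.quantile(all_chips, np.linspace(0, 1, n_q + 1)[1:-1])   # n_q - 1 global edges
          models = {}
          for label, chips in chips_by_class.items():
              q = np.digitize(np.asarray(chips), edges)                      # bin indices 0..n_q-1
              counts = np.stack([(q == b).sum(axis=0) for b in range(n_q)], axis=-1) + alpha
              models[label] = counts / counts.sum(axis=-1, keepdims=True)    # shape (H, W, n_q)
          return edges, models

      def classify(chip, edges, models):
          """Pick the class whose per-pixel multinomial model gives the highest log-likelihood."""
          q = np.digitize(chip, edges)
          h, w = np.indices(q.shape)
          scores = {label: np.log(p[h, w, q]).sum() for label, p in models.items()}
          return max(scores, key=scores.get)

      rng = np.random.default_rng(1)
      train_data = {'tank':  rng.gamma(2.0, 1.0, (50, 16, 16)),
                    'truck': rng.gamma(2.0, 2.0, (50, 16, 16))}
      edges, models = train(train_data, n_q=4)
      print(classify(rng.gamma(2.0, 2.0, (16, 16)), edges, models))   # likely 'truck'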

  19. Fan fault diagnosis based on symmetrized dot pattern analysis and image matching

    NASA Astrophysics Data System (ADS)

    Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling

    2016-07-01

    To detect the mechanical failure of fans, a new diagnostic method based on the symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals of 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. The SDP pattern templates of each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, the single template, multiple templates and clustering fault templates are used to perform the image matching.

  20. Activity profile of high-level Australian lacrosse players.

    PubMed

    Polley, Chris S; Cormack, Stuart J; Gabbett, Tim J; Polglaze, Ted

    2015-01-01

    Despite lacrosse being one of the fastest growing team sports in the world, there is a paucity of information detailing the activity profile of high-level players. Microtechnology systems (global positioning systems and accelerometers) provide the opportunity to obtain detailed information on the activity profile in lacrosse. Therefore, this study aimed to analyze the activity profile of lacrosse match-play using microtechnology. Activity profile variables assessed relative to minutes of playing time included relative distance (meter per minute), distance covered standing (0-0.1 m·min), walking (0.2-1.7 m·min), jogging (1.8-3.2 m·min), running (3.3-5.6 m·min), sprinting (≥5.7 m·min), the number of high, moderate, and low accelerations and decelerations, and player load (PL per minute), calculated as the square root of the sum of the squared instantaneous rate of change in acceleration in 3 vectors (medio-lateral, anterior-posterior, and vertical). Activity was recorded from 14 lacrosse players over 4 matches during a national tournament. Players were separated into positions of attack, midfield, or defense. Differences (effect size [ES] ± 90% confidence interval) between positions and periods of play were considered likely positive when there was ≥75% likelihood of the difference exceeding an ES threshold of 0.2. Midfielders likely covered more meters per minute (mean ± SD: 100 ± 11) than attackers (87 ± 14; ES = 0.89 ± 1.04) and defenders (79 ± 14; ES = 1.54 ± 0.94) and performed more moderate and high accelerations and decelerations. Almost all variables across positions were reduced in quarter 4 compared with quarter 1. Coaches should accommodate for positional differences when preparing lacrosse players for competition. PMID:25264672
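
    A hedged sketch of the player-load quantity described in the abstract, assuming a triaxial accelerometer trace sampled at a fixed rate; any vendor-specific scaling or smoothing applied by the commercial device is omitted here.

      import numpy as np

      def player_load(ax, ay, az):
          """Accumulated player load from triaxial acceleration traces (arbitrary units).

          Sums, sample by sample, the square root of the summed squared change in
          acceleration across the medio-lateral, anterior-posterior and vertical axes.
          """
          dax, day, daz = np.diff(ax), np.diff(ay), np.diff(az)
          return float(np.sum(np.sqrt(dax ** 2 + day ** 2 + daz ** 2)))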

  2. Double Emulsion Templated Celloidosomes

    NASA Astrophysics Data System (ADS)

    Arriaga, Laura R.; Marquez, Samantha M.; Kim, Shin-Hyun; Chang, Connie; Wilking, Jim; Monroy, Francisco; Marquez, Manuel; Weitz, David A.

    2012-02-01

    We present a novel approach for fabricating celloidosomes, hollow spherical three-dimensional self-assemblies of living cells encapsulating an aqueous core. Glass-capillary microfluidics is used to generate monodisperse water-in-oil-in-water double emulsion templates using lipids as stabilizers. Such templates allow single as well as double concentric celloidosomes to be obtained. In addition, after a solvent removal step the double emulsion templates turn into monodisperse lipid vesicles, whose membranes spontaneously phase separate when an adequate lipid composition is chosen, providing a suitable scaffold for fabricating Janus celloidosomes. These structures may find applications in the development of bioreactors in which the synergistic effects of two different types of cells, selectively adsorbed on one of the vesicle hemispheres, may be exploited.

  3. Task demands determine the specificity of the search template.

    PubMed

    Bravo, Mary J; Farid, Hany

    2012-01-01

    When searching for an object, an observer holds a representation of the target in mind while scanning the scene. If the observer repeats the search, performance may become more efficient as the observer hones this target representation, or "search template," to match the specific demands of the search task. An effective search template must have two characteristics: It must reliably discriminate the target from the distractors, and it must tolerate variability in the appearance of the target. The present experiment examined how the tolerance of the search template is affected by the search task. Two groups of 18 observers trained on the same set of stimuli blocked either by target image (block-by-image group) or by target category (block-by-category group). One or two days after training, both groups were tested on a related search task. The pattern of test results revealed that the two groups of observers had developed different search templates, and that the templates of the block-by-category observers better captured the general characteristics of the category. These results demonstrate that observers match their search templates to the demands of the search task.

  4. Rapid Catalytic Template Searching as an Enzyme Function Prediction Procedure

    PubMed Central

    Nilmeier, Jerome P.; Kirshner, Daniel A.; Wong, Sergio E.; Lightstone, Felice C.

    2013-01-01

    We present an enzyme protein function identification algorithm, Catalytic Site Identification (CatSId), based on identification of catalytic residues. The method is optimized for highly accurate template identification across a diverse template library and is also very efficient in regards to time and scalability of comparisons. The algorithm matches three-dimensional residue arrangements in a query protein to a library of manually annotated, catalytic residues – The Catalytic Site Atlas (CSA). Two main processes are involved. The first process is a rapid protein-to-template matching algorithm that scales quadratically with target protein size and linearly with template size. The second process incorporates a number of physical descriptors, including binding site predictions, in a logistic scoring procedure to re-score matches found in Process 1. This approach shows very good performance overall, with a Receiver-Operator-Characteristic Area Under Curve (AUC) of 0.971 for the training set evaluated. The procedure is able to process cofactors, ions, nonstandard residues, and point substitutions for residues and ions in a robust and integrated fashion. Sites with only two critical (catalytic) residues are challenging cases, resulting in AUCs of 0.9411 and 0.5413 for the training and test sets, respectively. The remaining sites show excellent performance with AUCs greater than 0.90 for both the training and test data on templates of size greater than two critical (catalytic) residues. The procedure has considerable promise for larger scale searches. PMID:23675414
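
    The two-stage flow described above, a fast geometric match followed by logistic re-scoring with physical descriptors, can be illustrated schematically. The descriptor columns and the use of scikit-learn's LogisticRegression are assumptions for illustration only, not the CatSId implementation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Stage 1 (schematic): candidate matches, each summarized by a few physical
      # descriptors, e.g. geometric match score, binding-site overlap, conservation.
      X_train = np.array([[0.9, 0.8, 0.7],
                          [0.4, 0.2, 0.3],
                          [0.8, 0.6, 0.9],
                          [0.3, 0.1, 0.2]])
      y_train = np.array([1, 0, 1, 0])          # 1 = true catalytic-site match

      # Stage 2: logistic re-scoring of the candidate matches found in Stage 1.
      rescorer = LogisticRegression().fit(X_train, y_train)
      candidate = np.array([[0.85, 0.7, 0.6]])
      print("probability of a true match:", rescorer.predict_proba(candidate)[0, 1])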

  6. Environmental Learning Centers: A Template.

    ERIC Educational Resources Information Center

    Vozick, Eric

    1999-01-01

    Provides a working model, or template, for community-based environmental learning centers (ELCs). The template presents a philosophy as well as a plan for staff and administration operations, educational programming, and financial support. The template also addresses "green" construction and maintenance of buildings and grounds and includes a…

  7. Dental rehabilitation of amelogenesis imperfecta using thermoformed templates.

    PubMed

    Sockalingam, Snmp

    2011-01-01

    Amelogenesis imperfecta represents a group of dental developmental conditions that are genomic in origin. Hypoplastic AI, hypomineralised AI or both in combination were the most common types seen clinically. This paper describes oral rehabilitation of a 9-year-old Malay girl with inherited hypoplastic AI using transparent thermoforming templates. The defective surface areas were reconstructed to their original dimensions on stone cast models of the upper and lower arches using composite, and transparent thermoform templates were fabricated on the models. The templates were used as crown formers to reconstruct the defective teeth clinically using esthetically matching composite. The usage of the templates allowed direct light curing of the composite, accurate reproducibility of the anatomic contours of the defective teeth, reduced chair-side time and easy contouring and placement of homogenous thickness of composite in otherwise inaccessible sites of the affected teeth.

  8. Human action recognition using motion energy template

    NASA Astrophysics Data System (ADS)

    Shao, Yanhua; Guo, Yongcai; Gao, Chao

    2015-06-01

    Human action recognition is an active and interesting research topic in the computer vision and pattern recognition field and is widely used in the real world. We propose an approach for human activity analysis based on the motion energy template (MET), a new high-level representation of video. The main idea of the MET model is that human actions can be expressed as the composition of motion energy acquired in a three-dimensional (3-D) space-time volume by using a filter bank. The motion energies are computed directly from raw video sequences, so problems such as object location and segmentation are avoided. Another important merit of the MET method is its insensitivity to gender, hair, and clothing. We extract MET features by using the Bhattacharyya coefficient to measure the motion energy similarity between the action template video and the tested video, followed by 3-D max-pooling. Using these features as input to a support vector machine, extensive experiments on two benchmark datasets, Weizmann and KTH, were carried out. Compared with other state-of-the-art approaches, such as variation energy image, dynamic templates, and local motion pattern descriptors, the experimental results demonstrate that our MET model is competitive and promising.
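
    A minimal sketch of the similarity measure mentioned above, the Bhattacharyya coefficient between two non-negative motion-energy feature vectors, followed by a crude 3-D max-pooling step over a (t, y, x) energy volume. Names and shapes are illustrative assumptions, not the authors' code.

      import numpy as np

      def bhattacharyya_coefficient(p, q):
          """Similarity between two non-negative energy distributions (1.0 = identical)."""
          p = p / p.sum()
          q = q / q.sum()
          return float(np.sum(np.sqrt(p * q)))

      def max_pool_3d(volume, pool=(2, 2, 2)):
          """Crude 3-D max-pooling over a (t, y, x) motion-energy volume."""
          t, y, x = (s - s % k for s, k in zip(volume.shape, pool))
          v = volume[:t, :y, :x].reshape(t // pool[0], pool[0],
                                         y // pool[1], pool[1],
                                         x // pool[2], pool[2])
          return v.max(axis=(1, 3, 5))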

  9. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewalls, active directory, and IDS are some examples of these components. Enforcement of network security policies through low-level security mechanisms faces some essential difficulties, of which consistency, verification, and maintenance are the major ones. One approach to overcoming these difficulties is to automate the process of translating a high-level security policy into low-level security mechanisms. This paper introduces a framework for an automated process that translates a high-level security policy into low-level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. The proposed model is based on organization-based access control (OrBAC); however, it extends the OrBAC model to include not only the access control policy but also other administrative security policies such as the auditing policy. In addition, the proposed model enables matching of each rule of the high-level security policy with the corresponding rules of the low-level security policy. In the second phase of the proposed framework, the high-level security policy is mapped into the network security model; this phase can be considered a translation of the high-level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low-level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.

  10. Cubic nitride templates

    DOEpatents

    Burrell, Anthony K; McCleskey, Thomas Mark; Jia, Quanxi; Mueller, Alexander H; Luo, Hongmei

    2013-04-30

    A polymer-assisted deposition process for deposition of epitaxial cubic metal nitride films and the like is presented. The process includes solutions of one or more metal precursor and soluble polymers having binding properties for the one or more metal precursor. After a coating operation, the resultant coating is heated at high temperatures under a suitable atmosphere to yield metal nitride films and the like. Such films can be used as templates for the development of high quality cubic GaN based electronic devices.

  11. 46 CFR 153.409 - High level alarms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 46 (Shipping), Bulk Liquid, Liquefied Gas, or Compressed Gas Hazardous Materials; Design and Equipment; Cargo Gauging Systems; § 153.409 High level alarms. When Table 1 refers to this section or requires a cargo to have...

  12. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40 (Protection of Environment), § 227.30 High-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  13. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40 (Protection of Environment), § 227.30 High-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  14. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Title 40 (Protection of Environment), § 227.30 High-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  15. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Title 40 (Protection of Environment), § 227.30 High-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  16. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 40 (Protection of Environment), § 227.30 High-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  17. Process for solidifying high-level nuclear waste

    DOEpatents

    Ross, Wayne A.

    1978-01-01

    The addition of a small amount of reducing agent to a mixture of a high-level radioactive waste calcine and glass frit before the mixture is melted will produce a more homogeneous glass which is leach-resistant and suitable for long-term storage of high-level radioactive waste products.

  18. Using a library of structural templates to recognise catalytic sites and explore their evolution in homologous families.

    PubMed

    Torrance, James W; Bartlett, Gail J; Porter, Craig T; Thornton, Janet M

    2005-04-01

    Catalytic site structure is normally highly conserved between distantly related enzymes. As a consequence, templates representing catalytic sites have the potential to succeed at function prediction in cases where methods based on sequence or overall structure fail. There are many methods for searching protein structures for matches to structural templates, but few validated template libraries to use with these methods. We present a library of structural templates representing catalytic sites, based on information from the scientific literature. Furthermore, we analyse homologous template families to discover the diversity within families and the utility of templates for active site recognition. Templates representing the catalytic sites of homologous proteins mostly differ by less than 1 Å root mean square deviation, even when the sequence similarity between the two proteins is low. Within these sets of homologues there is usually no discernible relationship between catalytic site structure similarity and sequence similarity. Because of this structural conservation of catalytic sites, the templates can discriminate between matches to related proteins and random matches with over 85% sensitivity and predictive accuracy. Templates based on protein backbone positions are more discriminating than those based on side-chain atoms. These analyses show encouraging prospects for prediction of functional sites in structural genomics structures of unknown function, and will be of use in analyses of convergent evolution and in exploring relationships between active site geometry and chemistry. The template library can be queried via a web server and is available for download.
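
    As an illustration of the comparison behind the sub-1 Å figures quoted above, the sketch below computes the least-squares (Kabsch) RMSD between two equal-length sets of catalytic-site coordinates; it is a generic superposition routine, not the template-matching software used in the paper.

      import numpy as np

      def kabsch_rmsd(P, Q):
          """RMSD between two (N, 3) coordinate sets after optimal rigid superposition."""
          P = P - P.mean(axis=0)
          Q = Q - Q.mean(axis=0)
          H = P.T @ Q                                    # cross-covariance matrix
          U, S, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))         # correct for possible reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T        # optimal rotation
          P_rot = P @ R.T
          return float(np.sqrt(np.mean(np.sum((P_rot - Q) ** 2, axis=1))))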

  19. Integration of professional judgement and decision-making in high-level adventure sports coaching practice.

    PubMed

    Collins, Loel; Collins, Dave

    2015-01-01

    This study examined the integration of professional judgement and decision-making processes in adventure sports coaching. The study utilised a thematic analysis approach to investigate the decision-making practices of a sample of high-level adventure sports coaches over a series of sessions. Results revealed that, in order to make judgements and decisions in practice, expert coaches employ a range of practical and pedagogic management strategies to create and opportunistically use time for decision-making. These approaches include span of control and time management strategies to facilitate the decision-making process regarding risk management, venue selection, aims, objectives, session content, and differentiation of the coaching process. The implication for coaches, coach education, and accreditation is the recognition and training of the approaches that "create time" for the judgements in practice, namely "creating space to think". The paper concludes by offering a template for a more expertise-focused progression in adventure sports coaching.

  20. Efficient generation and optimization of stochastic template banks by a neighboring cell algorithm

    NASA Astrophysics Data System (ADS)

    Fehrmann, Henning; Pletsch, Holger J.

    2014-12-01

    Placing signal templates (grid points) as efficiently as possible to cover a multidimensional parameter space is crucial in computing-intensive matched-filtering searches for gravitational waves, but also in similar searches in other fields of astronomy. To generate efficient coverings of arbitrary parameter spaces, stochastic template banks have been advocated, where templates are placed at random while rejecting those too close to others. However, in this simple scheme, for each new random point its distance to every template in the existing bank is computed. This rapidly increasing number of distance computations can render the acceptance of new templates computationally prohibitive, particularly for wide parameter spaces or in large dimensions. This paper presents a neighboring cell algorithm that can dramatically improve the efficiency of constructing a stochastic template bank. By dividing the parameter space into subvolumes (cells), for an arbitrary point an efficient hashing technique is exploited to obtain the index of its enclosing cell along with the parameters of its neighboring templates. Hence only distances to these neighboring templates in the bank are computed, massively lowering the overall computing cost, as demonstrated in simple examples. Furthermore, we propose a novel method based on this technique to increase the fraction of covered parameter space solely by directed template shifts, without adding any templates. As is demonstrated in examples, this method can be highly effective.
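
    A toy sketch of the neighboring-cell bookkeeping in a flat Euclidean parameter space: the space is divided into a grid of cells keyed by their integer indices (a plain dict stands in for the hashing), and a random candidate is compared only against templates stored in its own and adjacent cells. The real method works with a parameter-space metric; this simplified version only illustrates why the number of distance computations stays small.

      import itertools
      import numpy as np

      def stochastic_bank(bounds, min_dist, n_trials, rng=np.random.default_rng(0)):
          """Place random templates, rejecting any closer than min_dist to an accepted one."""
          cell = min_dist                      # cell edge >= rejection radius
          grid = {}                            # cell index tuple -> accepted points in that cell

          def cell_index(p):
              return tuple(int(np.floor(x / cell)) for x in p)

          bank = []
          for _ in range(n_trials):
              p = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
              idx = cell_index(p)
              # Only check templates in the enclosing cell and its immediate neighbors.
              neighbors = itertools.product(*[(i - 1, i, i + 1) for i in idx])
              too_close = any(np.linalg.norm(p - q) < min_dist
                              for n in neighbors for q in grid.get(n, []))
              if not too_close:
                  bank.append(p)
                  grid.setdefault(idx, []).append(p)
          return bank

      # Example: cover a 2-D box with templates no closer than 0.05 to each other.
      templates = stochastic_bank([(0.0, 1.0), (0.0, 1.0)], min_dist=0.05, n_trials=20000)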

  1. Reference commercial high-level waste glass and canister definition.

    SciTech Connect

    Slate, S.C.; Ross, W.A.; Partain, W.L.

    1981-09-01

    This report presents technical data and performance characteristics of a high-level waste glass and canister intended for use in the design of a complete waste encapsulation package suitable for disposal in a geologic repository. The borosilicate glass contained in the stainless steel canister represents the probable type of high-level waste product that will be produced in a commercial nuclear-fuel reprocessing plant. Development history is summarized for high-level liquid waste compositions, waste glass composition and characteristics, and canister design. The decay histories of the fission products and actinides (plus daughters) calculated by the ORIGEN-II code are presented.

  2. Neptunium estimation in dissolver and high-level-waste solutions

    SciTech Connect

    Pathak, P.N.; Prabhu, D.R.; Kanekar, A.S.; Manchanda, V.K.

    2008-07-01

    This paper deals with the optimization of the experimental conditions for the estimation of ²³⁷Np in spent-fuel dissolver/high-level waste solutions using thenoyltrifluoroacetone as the extractant. (authors)

  3. Decision Document for Heat Removal from High Level Waste Tanks

    SciTech Connect

    WILLIS, W.L.

    2000-07-31

    This document establishes the combination of design and operational configurations that will be used to provide heat removal from high-level waste tanks during Phase 1 waste feed delivery to prevent the waste temperature from exceeding tank safety requirement limits. The chosen method--to use the primary and annulus ventilation systems to remove heat from the high-level waste tanks--is documented herein.

  4. High-Level Waste System Process Interface Description

    SciTech Connect

    d'Entremont, P.D.

    1999-01-14

    The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will be eventually disposed of in a Federal Repository, Saltstone to be buried on site, and treated water effluent that is released to the environment.

  5. Automatic script identification from images using cluster-based templates

    SciTech Connect

    Hochberg, J.; Kerns, L.; Kelly, P.; Thomas, T.

    1995-02-01

    We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
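
    A rough sketch of the cluster-based template idea, assuming each textual symbol has already been extracted and scaled to a fixed-size patch: training symbols for one script are clustered, minor clusters are pruned, the centroids become templates, and a new document is assigned to the script whose templates best match its symbols. The choices of k-means and Euclidean distance are illustrative, not necessarily those of the original system.

      import numpy as np
      from sklearn.cluster import KMeans

      def build_templates(symbols, n_clusters=50, min_cluster_size=5):
          """symbols: (N, H*W) array of fixed-size, flattened symbol images for one script."""
          km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(symbols)
          labels, counts = np.unique(km.labels_, return_counts=True)
          keep = labels[counts >= min_cluster_size]          # prune minor clusters
          return km.cluster_centers_[keep]

      def identify_script(doc_symbols, templates_by_script):
          """Pick the script whose templates best explain the document's symbols."""
          best_script, best_cost = None, np.inf
          for script, templates in templates_by_script.items():
              # For each symbol, distance to its nearest template of this script.
              d = np.linalg.norm(doc_symbols[:, None, :] - templates[None, :, :], axis=2)
              cost = d.min(axis=1).mean()
              if cost < best_cost:
                  best_script, best_cost = script, cost
          return best_script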

  6. Spike Detection Based on Normalized Correlation with Automatic Template Generation

    PubMed Central

    Hwang, Wen-Jyi; Wang, Szu-Huai; Hsu, Ya-Tzu

    2014-01-01

    A novel feedback-based spike detection algorithm for noisy spike trains is presented in this paper. It uses information extracted from the results of spike classification to enhance spike detection. The algorithm performs template matching for spike detection with a normalized correlator. The detected spikes are then sorted by the OSort algorithm. The mean of the spikes in each cluster produced by the OSort algorithm is used as the template of the normalized correlator for subsequent detection. The automatic generation and updating of templates enhance the robustness of the spike detection to input trains with various spike waveforms and noise levels. Experimental results show that the proposed algorithm operating in conjunction with OSort is an efficient design for attaining high detection and classification accuracy for spike sorting. PMID:24960082
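
    A small sketch of the detection side described above: the template is slid along the signal, the normalized correlation at each offset is compared with a threshold, and the template can later be replaced by the mean waveform of a sorted cluster. This is a plain NumPy illustration, not the authors' OSort-coupled implementation.

      import numpy as np

      def normalized_correlation(signal, template):
          """Normalized correlation of the template against every window of the signal."""
          L = len(template)
          t = template - template.mean()
          t /= max(np.linalg.norm(t), 1e-12)
          out = np.empty(len(signal) - L + 1)
          for i in range(len(out)):
              w = signal[i:i + L] - signal[i:i + L].mean()
              n = np.linalg.norm(w)
              out[i] = np.dot(w, t) / n if n > 0 else 0.0
          return out

      def detect_spikes(signal, template, threshold=0.8):
          """Indices where the normalized correlation exceeds the threshold."""
          return np.flatnonzero(normalized_correlation(signal, template) > threshold)

      def update_template(sorted_spikes):
          """New template = mean waveform of the spikes assigned to one cluster."""
          return np.mean(np.asarray(sorted_spikes), axis=0)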

  7. On the construction of a new stellar classification template library for the LAMOST spectral analysis pipeline

    SciTech Connect

    Wei, Peng; Luo, Ali; Li, Yinbi; Tu, Liangping; Wang, Fengfei; Zhang, Jiannan; Chen, Xiaoyan; Hou, Wen; Kong, Xiao; Wu, Yue; Zuo, Fang; Yi, Zhenping; Zhao, Yongheng; Chen, Jianjun; Du, Bing; Guo, Yanxin; Ren, Juanjuan; Pan, Jingchang; Jiang, Bin; Liu, Jie; and others

    2014-05-01

    The LAMOST spectral analysis pipeline, called the 1D pipeline, aims to classify and measure the spectra observed in the LAMOST survey. Through this pipeline, the observed stellar spectra are classified into different subclasses by matching with template spectra. Consequently, the performance of the stellar classification greatly depends on the quality of the template spectra. In this paper, we construct a new LAMOST stellar spectral classification template library, which is supposed to improve the precision and credibility of the present LAMOST stellar classification. About one million spectra are selected from LAMOST Data Release One to construct the new stellar templates, and they are gathered in 233 groups by two criteria: (1) pseudo g – r colors obtained by convolving the LAMOST spectra with the Sloan Digital Sky Survey ugriz filter response curve, and (2) the stellar subclass given by the LAMOST pipeline. In each group, the template spectra are constructed using three steps. (1) Outliers are excluded using the Local Outlier Probabilities algorithm, and then the principal component analysis method is applied to the remaining spectra of each group. About 5% of the one million spectra are ruled out as outliers. (2) All remaining spectra are reconstructed using the first principal components of each group. (3) The weighted average spectrum is used as the template spectrum in each group. Using the previous 3 steps, we initially obtain 216 stellar template spectra. We visually inspect all template spectra, and 29 spectra are abandoned due to low spectral quality. Furthermore, the MK classification for the remaining 187 template spectra is manually determined by comparing with 3 template libraries. Meanwhile, 10 template spectra whose subclass is difficult to determine are abandoned. Finally, we obtain a new template library containing 183 LAMOST template spectra with 61 different MK classes by combining it with the current library.
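
    A compressed sketch of steps (2) and (3) above for a single group, assuming the spectra have already been resampled to a common wavelength grid and cleaned of outliers: every spectrum is reconstructed from the leading principal components and the (optionally weighted) average of the reconstructions is taken as the group template. The function name, component count, and weighting scheme are assumptions.

      import numpy as np

      def group_template(spectra, n_components=10, weights=None):
          """spectra: (N, n_pix) array of outlier-cleaned spectra from one group."""
          mean = spectra.mean(axis=0)
          X = spectra - mean
          # Principal components via SVD of the mean-subtracted spectra.
          U, S, Vt = np.linalg.svd(X, full_matrices=False)
          basis = Vt[:n_components]                       # (n_components, n_pix)
          # Reconstruct every spectrum from its leading components only.
          recon = mean + (X @ basis.T) @ basis
          # The (weighted) average of the reconstructions is the group template.
          return np.average(recon, axis=0, weights=weights)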

  8. Cloning nanocrystal morphology with soft templates

    NASA Astrophysics Data System (ADS)

    Thapa, Dev Kumar; Pandey, Anshu

    2016-08-01

    In most template directed preparative methods, while the template decides the nanostructure morphology, the structure of the template itself is a non-general outcome of its peculiar chemistry. Here we demonstrate a template mediated synthesis that overcomes this deficiency. This synthesis involves overgrowth of silica template onto a sacrificial nanocrystal. Such templates are used to copy the morphologies of gold nanorods. After template overgrowth, gold is removed and silver is regrown in the template cavity to produce a single crystal silver nanorod. This technique allows for duplicating existing nanocrystals, while also providing a quantifiable breakdown of the structure - shape interdependence.

  9. Brain templates and atlases.

    PubMed

    Evans, Alan C; Janke, Andrew L; Collins, D Louis; Baillet, Sylvain

    2012-08-15

    The core concept within the field of brain mapping is the use of a standardized, or "stereotaxic", 3D coordinate frame for data analysis and reporting of findings from neuroimaging experiments. This simple construct allows brain researchers to combine data from many subjects such that group-averaged signals, be they structural or functional, can be detected above the background noise that would swamp subtle signals from any single subject. Where the signal is robust enough to be detected in individuals, it allows for the exploration of inter-individual variance in the location of that signal. From a larger perspective, it provides a powerful medium for comparison and/or combination of brain mapping findings from different imaging modalities and laboratories around the world. Finally, it provides a framework for the creation of large-scale neuroimaging databases or "atlases" that capture the population mean and variance in anatomical or physiological metrics as a function of age or disease. However, while the above benefits are not in question at first order, there are a number of conceptual and practical challenges that introduce second-order incompatibilities among experimental data. Stereotaxic mapping requires two basic components: (i) the specification of the 3D stereotaxic coordinate space, and (ii) a mapping function that transforms a 3D brain image from "native" space, i.e. the coordinate frame of the scanner at data acquisition, to that stereotaxic space. The first component is usually expressed by the choice of a representative 3D MR image that serves as target "template" or atlas. The native image is re-sampled from native to stereotaxic space under the mapping function that may have few or many degrees of freedom, depending upon the experimental design. The optimal choice of atlas template and mapping function depend upon considerations of age, gender, hemispheric asymmetry, anatomical correspondence, spatial normalization methodology and disease
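
    As a toy illustration of component (ii) above, the sketch below resamples a native-space volume onto a template voxel grid under a purely affine voxel-to-world mapping; real pipelines typically use nonlinear warps with many more degrees of freedom. The NIfTI-style 4x4 affines and the function name are assumptions.

      import numpy as np
      from scipy.ndimage import affine_transform

      def to_template_space(native_vol, native_affine, template_affine, template_shape):
          """Resample a native-space volume onto the template (stereotaxic) voxel grid.

          native_affine / template_affine are 4x4 voxel-to-world matrices. The combined
          matrix maps template voxel indices to native voxel indices, which is the
          convention expected by scipy.ndimage.affine_transform.
          """
          vox_map = np.linalg.inv(native_affine) @ template_affine   # template vox -> native vox
          return affine_transform(native_vol,
                                  matrix=vox_map[:3, :3],
                                  offset=vox_map[:3, 3],
                                  output_shape=template_shape,
                                  order=1)                           # trilinear interpolation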

  11. Development of Crystal-Tolerant High-Level Waste Glasses

    SciTech Connect

    Matyas, Josef; Vienna, John D.; Schaible, Micah J.; Rodriguez, Carmen P.; Crum, Jarrod V.; Arrigoni, Alyssa L.; Tate, Rachel M.

    2010-12-17

    Twenty-five glasses were formulated. They were batched from HLW AZ-101 simulant or raw chemicals, melted, and tested with a series of tests to elucidate the effect of spinel-forming components (Ni, Fe, Cr, Mn, and Zn), Al, and noble metals (Rh2O3 and RuO2) on the accumulation rate of spinel crystals in the glass discharge riser of the high-level waste (HLW) melter. In addition, processing properties of the glasses, such as the viscosity and TL, were measured as a function of temperature and composition. Furthermore, the settling of spinel crystals in transparent low-viscosity fluids was studied at room temperature to assess the shape factor and hindered settling coefficient of spinel crystals in the Stokes equation. The experimental results suggest that Ni is the most troublesome of the studied spinel-forming components, producing settling layers of up to 10.5 mm in just 20 days in Ni-rich glasses if noble metals or a higher concentration of Fe was not introduced into the glass. A layer of this thickness can potentially plug the bottom of the riser, preventing glass from being discharged from the melter. The noble metals, Fe, and Al were the components that significantly slowed down or stopped the accumulation of spinel at the bottom. Particles of Rh2O3 and RuO2, hematite, and nepheline acted as nucleation sites, significantly increasing the number of crystals and therefore decreasing the average crystal size. The settling velocity of crystals ≤10 μm in size was too low to produce thick layers. The experimental data for the thickness of settled layers in the glasses prepared from AZ-101 simulant were used to build a linear empirical model that can predict crystal accumulation in the riser of the melter as a function of the concentration of spinel-forming components in glass. The developed model predicts the thicknesses of accumulated layers quite well, R2 = 0.985, and can become an efficient tool for the formulation

  12. Adaptive, template moderated, spatially varying statistical classification.

    PubMed

    Warfield, S K; Kaus, M; Jolesz, F A; Kikinis, R

    2000-03-01

    A novel image segmentation algorithm was developed to allow the automatic segmentation of both normal and abnormal anatomy from medical images. The new algorithm is a form of spatially varying statistical classification, in which an explicit anatomical template is used to moderate the segmentation obtained by statistical classification. The algorithm consists of an iterated sequence of spatially varying classification and nonlinear registration, which forms an adaptive, template moderated (ATM), spatially varying statistical classification (SVC). Classification methods and nonlinear registration methods are often complementary, both in the tasks where they succeed and in the tasks where they fail. By integrating these approaches the new algorithm avoids many of the disadvantages of each approach alone while exploiting the combination. The ATM SVC algorithm was applied to several segmentation problems, involving different image contrast mechanisms and different locations in the body. Segmentation and validation experiments were carried out for problems involving the quantification of normal anatomy (MRI of brains of neonates) and pathology of various types (MRI of patients with multiple sclerosis, MRI of patients with brain tumors, MRI of patients with damaged knee cartilage). In each case, the ATM SVC algorithm provided a better segmentation than statistical classification or elastic matching alone. PMID:10972320

  13. High level radioactive waste management facility design criteria

    SciTech Connect

    Sheikh, N.A.; Salaymeh, S.R.

    1993-10-01

    This paper discusses the engineering systems for the structural design of the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). At the DWPF, high-level radioactive liquids will be mixed with glass particles and heated in a melter. This molten glass will then be poured into stainless steel canisters where it will harden. This process will transform the high-level waste into a more stable, manageable substance. This paper discusses the structural design requirements for this unique, one-of-a-kind facility. Special emphasis is placed on the design criteria pertaining to earthquake, wind and tornado, and flooding.

  14. Final report on cermet high-level waste forms

    SciTech Connect

    Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.

    1981-08-01

    Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures.

  15. Disposal of high-level nuclear waste in space

    NASA Astrophysics Data System (ADS)

    Coopersmith, Jonathan

    1992-08-01

    A solution of launching high-level nuclear waste into space is suggested. Disposal in space includes solidifying the wastes, embedding them in an explosion-proof vehicle, and launching it into earth orbit, and then into a solar orbit. The benefits of such a system include not only the safe disposal of high-level waste but also the establishment of an infrastructure for large-scale space exploration and development. Particular attention is given to the wide range of technical choices along with the societal, economic, and political factors needed for success.

  16. Bilateral medial patellofemoral ligament reconstruction in high-level athletes.

    PubMed

    Kuroda, Yuichi; Matsushita, Takehiko; Matsumoto, Tomoyuki; Kawakami, Yohei; Kurosaka, Masahiro; Kuroda, Ryosuke

    2014-10-01

    This report presents two cases of high-level athletes with bilateral patellar dislocations who were able to return to their preinjury level of activity after bilateral medial patellofemoral ligament (MPFL) reconstruction, without any major complications. Patient 1 was a 19-year-old male volleyball player for a top-level college volleyball team, and patient 2 was a 24-year-old woman who was a member of a national-level adult softball team. MPFL reconstruction could be an effective treatment for bilateral patellar dislocation in high-level athletes. Level of evidence V.

  17. The Use of ARTEMIS with High-Level Applications

    SciTech Connect

    B. A. Bowling; H. Shoaee; S. Witherspoon

    1995-10-01

    ARTEMIS is an online accelerator modeling server developed at CEBAF. One of the design goals of ARTEMIS was to provide an integrated modeling environment for high-level accelerator diagnostic and control applications such as automated beam steering, Linac Energy Management (LEM) and the fast feedback system. This report illustrates the use of ARTEMIS in these applications as well as the application interface using the EPICS cdev device support API. Concentration is placed on the design and implementation aspects of high-level applications which utilize the ARTEMIS server for information on beam dynamics. Performance benchmarks for various model operations provided by ARTEMIS are also discussed.

  18. Templated Growth of Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Siochi, Emilie J. (Inventor)

    2007-01-01

    A method of growing carbon nanotubes uses a synthesized mesoporous silica template with approximately cylindrical pores being formed therein. The surfaces of the pores are coated with a carbon nanotube precursor, and the template with the surfaces of the pores so-coated is then heated until the carbon nanotube precursor in each pore is converted to a carbon nanotube.

  19. A narrow band pattern-matching model of vowel perception

    NASA Astrophysics Data System (ADS)

    Hillenbrand, James M.; Houde, Robert A.

    2003-02-01

    The purpose of this paper is to propose and evaluate a new model of vowel perception which assumes that vowel identity is recognized by a template-matching process involving the comparison of narrow band input spectra with a set of smoothed spectral-shape templates that are learned through ordinary exposure to speech. In the present simulation of this process, the input spectra are computed over a sufficiently long window to resolve individual harmonics of voiced speech. Prior to template creation and pattern matching, the narrow band spectra are amplitude equalized by a spectrum-level normalization process, and the information-bearing spectral peaks are enhanced by a ``flooring'' procedure that zeroes out spectral values below a threshold function consisting of a center-weighted running average of spectral amplitudes. Templates for each vowel category are created simply by averaging the narrow band spectra of like vowels spoken by a panel of talkers. In the present implementation, separate templates are used for men, women, and children. The pattern matching is implemented with a simple city-block distance measure given by the sum of the channel-by-channel differences between the narrow band input spectrum (level-equalized and floored) and each vowel template. Spectral movement is taken into account by computing the distance measure at several points throughout the course of the vowel. The input spectrum is assigned to the vowel template that results in the smallest difference accumulated over the sequence of spectral slices. The model was evaluated using a large database consisting of 12 vowels in /hVd/ context spoken by 45 men, 48 women, and 46 children. The narrow band model classified vowels in this database with a degree of accuracy (91.4%) approaching that of human listeners.
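
    A minimal sketch of the matching step described above, assuming narrow band spectra sampled on a fixed channel grid and templates already built by averaging preprocessed spectra over a panel of talkers: each input slice is level-normalized, floored against a center-weighted running average, and compared with a city-block distance accumulated over slices. The window width and normalization choice are illustrative.

      import numpy as np

      def floor_spectrum(spec, window=11):
          """Zero out channels below a center-weighted running average of the spectrum."""
          w = np.hanning(window)
          w /= w.sum()
          threshold = np.convolve(spec, w, mode="same")
          return np.where(spec > threshold, spec, 0.0)

      def preprocess(spec):
          """Crude spectrum-level normalization followed by peak enhancement."""
          return floor_spectrum(spec - spec.mean())

      def classify_vowel(slices, templates):
          """slices: narrow band spectra over the vowel; templates: {label: list of spectra}."""
          best_label, best_dist = None, np.inf
          for label, template_slices in templates.items():
              dist = sum(np.sum(np.abs(preprocess(s) - t))        # city-block distance
                         for s, t in zip(slices, template_slices))
              if dist < best_dist:
                  best_label, best_dist = label, dist
          return best_label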

  20. Identifying high-level components in combinational circuits

    SciTech Connect

    Doom, T.; White, J.; Wojcik, A.; Chisholm, G.

    1998-07-01

    The problem of finding meaningful subcircuits in a logic layout appears in many contexts in computer-aided design. Existing techniques rely upon finding exact matchings of subcircuit structure within the layout. These syntactic techniques fail to identify functionally equivalent subcircuits that are differently implemented, optimized, or otherwise obfuscated. The authors present a mechanism for identifying functionally equivalent subcircuits that can overcome many of these limitations. Such semantic matching is particularly useful in the field of design recovery.

  1. The ATLAS Data Acquisition and High Level Trigger system

    NASA Astrophysics Data System (ADS)

    The ATLAS TDAQ Collaboration

    2016-06-01

    This paper describes the data acquisition and high level trigger system of the ATLAS experiment at the Large Hadron Collider at CERN, as deployed during Run 1. Data flow as well as control, configuration and monitoring aspects are addressed. An overview of the functionality of the system and of its performance is presented and design choices are discussed.

  2. High-level manpower movement and Japan's foreign aid.

    PubMed

    Furuya, K

    1992-01-01

    "Japan's technical assistance programs to Asian countries are summarized. Movements of high-level manpower accompanying direct foreign investments by private enterprise are also reviewed. Proposals for increased human resources development include education and training of foreigners in Japan as well as the training of Japanese aid experts and the development of networks for information exchange."

  3. THE XAL INFRASTRUCTURE FOR HIGH LEVEL CONTROL ROOM APPLICATIONS

    SciTech Connect

    Shishlo, Andrei P; Allen, Christopher K; Chu, Paul; Galambos, John D; Pelaia II, Tom

    2009-01-01

    XAL is a Java programming framework for building high-level control applications related to accelerator physics. The structure, details of implementation, and interaction between components, auxiliary XAL packages, and the latest modifications are discussed. A general overview of XAL applications created for the SNS project is presented.

  4. 46 CFR 153.409 - High level alarms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 46 (Shipping), Bulk Liquid, Liquefied Gas, or Compressed Gas Hazardous Materials; Design and Equipment; § 153.409 High level alarms...

  5. 46 CFR 153.409 - High level alarms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 46 (Shipping), Bulk Liquid, Liquefied Gas, or Compressed Gas Hazardous Materials; Design and Equipment; § 153.409 High level alarms...

  6. Nearest matched filter classification of spatiotemporal patterns.

    PubMed

    Hecht-Nielsen, R

    1987-05-15

    Recent advances in massively parallel optical and electronic neural network processing technology have made it plausible to consider the use of matched filter banks containing large numbers of individual filters as pattern classifiers for complex spatiotemporal pattern environments such as speech, sonar, radar, and advanced communications. This paper begins with an overview of how neural networks can be used to approximately implement such multidimensional matched filter banks. The nearest matched filter classifier is then formally defined. This definition is then reformulated to show that the classifier is equivalent to a nearest neighbor classifier in a separable infinite-dimensional metric space that specifies the local-in-time behavior of spatiotemporal patterns. The result of Cover and Hart is then applied to show that, given a statistically comprehensive set of filter templates, the nearest matched filter classifier will have near-Bayesian performance for spatiotemporal patterns. The combination of near-Bayesian classifier performance with the excellent performance of matched filtering in noise yields a powerful new classification technique. This result adds additional interest to Grossberg's hypothesis that the mammalian cerebral cortex carries out local-in-time nearest matched filter classification of both auditory and visual sensory inputs as an initial step in sensory pattern recognition-which may help explain the almost instantaneous pattern recognition capabilities of animals.
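
    A bare-bones sketch of the nearest matched filter decision rule for fixed-length spatiotemporal patterns flattened to vectors: each stored exemplar becomes a unit-norm matched filter, an input is scored by its inner product with every filter, and the label of the best-matching filter is returned. This illustrates the classifier definition only, not a neural-network implementation.

      import numpy as np

      class NearestMatchedFilter:
          def __init__(self, templates, labels):
              """templates: (K, D) array of exemplar patterns; labels: length-K class labels."""
              T = np.asarray(templates, dtype=float)
              self.filters = T / np.linalg.norm(T, axis=1, keepdims=True)  # unit-norm filters
              self.labels = np.asarray(labels)

          def classify(self, x):
              """Return the label of the filter with the largest normalized response."""
              x = np.asarray(x, dtype=float)
              x = x / np.linalg.norm(x)
              responses = self.filters @ x
              return self.labels[int(np.argmax(responses))]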

  7. Reusing information for high-level fusion: characterizing bias and uncertainty in human-generated intelligence

    NASA Astrophysics Data System (ADS)

    Burke, Dustin; Carlin, Alan; Picciano, Paul; Levchuk, Georgiy; Riordan, Brian

    2013-05-01

    To expedite the intelligence collection process, analysts reuse previously collected data. This poses the risk of analysis failure, because these data are biased in ways that the analyst may not know. Thus, these data may be incomplete, inconsistent or incorrect, have structural gaps and limitations, or simply be too old to accurately represent the current state of the world. Incorporating human-generated intelligence within the high-level fusion process enables the integration of hard (physical sensors) and soft information (human observations) to extend the ability of algorithms to associate and merge disparate pieces of information for a more holistic situational awareness picture. However, in order for high-level fusion systems to manage the uncertainty in soft information, a process needs to be developed for characterizing the sources of error and bias specific to human-generated intelligence and assessing the quality of these data. This paper outlines an approach Towards Integration of Data for unBiased Intelligence and Trust (TID-BIT) that implements a novel Hierarchical Bayesian Model for high-level situation modeling that allows the analyst to accurately reuse existing data collected for different intelligence requirements. TID-BIT constructs situational, semantic knowledge graphs that link the information extracted from unstructured sources to intelligence requirements, and it performs pattern matching over these attributed-network graphs to integrate information. By quantifying the reliability and credibility of human sources, TID-BIT enables the ability to estimate and account for uncertainty and bias that impact the high-level fusion process, resulting in improved situational awareness.

  8. Attributes and templates from active measurements with ²⁵²Cf

    SciTech Connect

    Mihalczo, J.T.; Mattingly, J.K.

    2000-02-01

    Active neutron interrogation is useful for the detection of shielded HEU and could also be used for Pu. In an active technique, fissile material is stimulated by an external neutron source to produce fission with the emanation of neutrons and gamma rays. The time distribution of particles leaving the fissile material is measured with respect to the source emission in a variety of ways. A variety of accelerator and radioactive sources can be used. Active interrogation of nuclear weapons/components can be used in two ways: template matching or attribute estimation. Template matching compares radiation signatures with known reference signatures and for treaty applications has the problem of authentication of the reference signatures along with storage and retrieval of templates. Attribute estimation determines, for example, the fissile mass from various features of the radiation signatures and does not require storage of radiation signatures but does require calibration, which can be repeated as necessary. A nuclear materials identification system (NMIS) has been in use at the Oak Ridge Y-12 Plant for verification of weapons components being received and in storage by template matching and has been used with calibrations for attribute (fissile mass) estimation for HEU metal. NMIS employs a ²⁵²Cf source of low intensity (< 2 × 10⁶ n/sec) such that the dose at 1 m is approximately twice that on a commercial airline at altitude. The use of such a source presents no significant safety concerns either for personnel or nuclear explosive safety, and has been approved for use at the Pantex Plant on fully assembled weapons systems.

  9. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database, but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The software resides on the server and does not require specific software on the user's machine other than an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. By means of this addition of "child processes" of nodes, a network that reflects the development of a project is generated.

  10. Evaluation and selection of candidate high-level waste forms

    SciTech Connect

    Bernadzikowski, T. A.; Allender, J. S.; Butler, J. L.; Gordon, D. E.; Gould, Jr., T. H.; Stone, J. A.

    1982-03-01

    Seven candidate waste forms being developed under the direction of the Department of Energy's National High-Level Waste (HLW) Technology Program were evaluated as potential media for the immobilization and geologic disposal of high-level nuclear wastes. The evaluation combined preliminary waste form evaluations conducted at DOE defense waste sites and independent laboratories, peer review assessments, a product performance evaluation, and a processability analysis. Based on the combined results of these four inputs, two of the seven forms, borosilicate glass and a titanate-based ceramic (SYNROC), were selected as the reference and alternative forms for continued development and evaluation in the National HLW Program. Both the glass and ceramic forms are viable candidates for use at each of the DOE defense waste sites; they are also potential candidates for immobilization of commercial reprocessing wastes. This report describes the waste form screening process and discusses each of the four major inputs considered in the selection of the two forms.

  11. Multipurpose optimization models for high level waste vitrification

    SciTech Connect

    Hoza, M.

    1994-08-01

    Optimal Waste Loading (OWL) models have been developed as multipurpose tools for high-level waste studies for the Tank Waste Remediation Program at Hanford. Using nonlinear programming techniques, these models maximize the waste loading of the vitrified waste and optimize the glass-former composition such that the glass produced has the appropriate properties within the melter, and the resultant vitrified waste form meets the requirements for disposal. The OWL model can be used for a single waste stream or for blended streams. The models can determine optimal continuous blends or optimal discrete blends of a number of different wastes. The OWL models have been used to identify the most restrictive constraints, to evaluate prospective waste pretreatment methods, to formulate and evaluate blending strategies, and to determine the impacts of variability in the wastes. The OWL models will be used to aid in the design of frits and to maximize the waste loading of the glass for High-Level Waste (HLW) vitrification.
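
    The abstract does not give the OWL formulation itself; the following is a generic sketch of the kind of constrained nonlinear program it describes, written with scipy. The surrogate property models, component list, bounds, and constraint limits are invented placeholders standing in for the glass-property correlations an OWL-type tool would actually use.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, NonlinearConstraint

# Decision variables: mass fractions of [waste, SiO2, B2O3, Na2O] in the glass.
def neg_waste_loading(x):
    return -x[0]                                   # maximize the waste fraction

def viscosity_index(x):                            # made-up processability surrogate
    return 2.0 * x[1] + 0.5 * x[2] - 1.5 * x[0]

def durability_index(x):                           # made-up product-quality surrogate
    return x[1] + 0.8 * x[2] - 0.6 * x[0] - 0.3 * x[3]

constraints = [
    LinearConstraint(np.ones(4), 1.0, 1.0),               # fractions sum to one
    NonlinearConstraint(viscosity_index, 0.1, 0.9),       # stay in a processable window
    NonlinearConstraint(durability_index, 0.2, np.inf),   # meet a durability floor
]
bounds = [(0.0, 0.6), (0.2, 0.7), (0.05, 0.2), (0.05, 0.2)]

x0 = np.array([0.25, 0.50, 0.15, 0.10])
result = minimize(neg_waste_loading, x0, method="trust-constr",
                  bounds=bounds, constraints=constraints)
print("waste loading:", result.x[0], "glass formers:", result.x[1:])
```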

  12. RETENTION OF SULFATE IN HIGH LEVEL RADIOACTIVE WASTE GLASS

    SciTech Connect

    Fox, K.

    2010-09-07

    High level radioactive wastes are being vitrified at the Savannah River Site for long term disposal. Many of the wastes contain sulfate at concentrations that can be difficult to retain in borosilicate glass. This study involves efforts to optimize the composition of a glass frit for combination with the waste to improve sulfate retention while meeting other process and product performance constraints. The fabrication and characterization of several series of simulated waste glasses are described. The experiments are detailed chronologically, to provide insight into part of the engineering studies used in developing frit compositions for an operating high level waste vitrification facility. The results lead to the recommendation of a specific frit composition and a concentration limit for sulfate in the glass for the next batch of sludge to be processed at Savannah River.

  13. Life Extension of Aging High Level Waste (HLW) Tanks

    SciTech Connect

    BRYSON, D.

    2002-02-04

    The Double Shell Tanks (DSTs) play a critical role in the Hanford High-Level Waste Treatment Complex, and therefore activities are underway to protect and better understand these tanks. The DST Life Extension Program is focused on both tank life extension and on evaluation of tank integrity. Tank life extension activities focus on understanding tank failure modes and have produced key chemistry and operations controls to minimize tank corrosion and extend useful tank life. Tank integrity program activities have developed and applied key technologies to evaluate the condition of the tank structure and predict useful tank life. Program results to date indicate that DST useful life can be extended well beyond the original design life and allow the existing tanks to fill a critical function within the Hanford High-Level Waste Treatment Complex. In addition the tank life may now be more reliably predicted, facilitating improved planning for the use and possible future replacement of these tanks.

  14. Management of data quality of high level waste characterization

    SciTech Connect

    Winters, W.I., Westinghouse Hanford

    1996-06-12

    Over the past 10 years, the Hanford Site has been transitioning from nuclear materials production to Site cleanup operations. High-level waste characterization at the Hanford Site provides data to support present waste processing operations, tank safety programs, and future waste disposal programs. Quality elements in the high-level waste characterization program will be presented by following a sample through the data quality objective, sampling, laboratory analysis and data review process. Transition from production to cleanup has resulted in changes in quality systems and program; the changes, as well as other issues in these quality programs, will be described. Laboratory assessment through quality control and performance evaluation programs will be described, and data assessments in the laboratory and final reporting in the tank characterization reports will be discussed.

  15. Case for retrievable high-level nuclear waste disposal

    USGS Publications Warehouse

    Roseboom, Eugene H.

    1994-01-01

    Plans for the nation's first high-level nuclear waste repository have called for permanently closing and sealing the repository soon after it is filled. However, the hydrologic environment of the proposed site at Yucca Mountain, Nevada, should allow the repository to be kept open and the waste retrievable indefinitely. This would allow direct monitoring of the repository and maintain the options for future generations to improve upon the disposal methods or use the uranium in the spent fuel as an energy resource.

  16. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    Many high-level vision systems use rule-based approaches to solving problems such as autonomous navigation and image understanding. The rules are usually elaborated by experts. However, this procedure may be rather tedious. In this paper, we propose a method to generate such rules automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.
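
    As a toy stand-in (not the authors' fuzzy-rule method), the sketch below induces simple one-feature threshold rules from labeled training data and filters out features whose classes barely separate; the relevance measure, threshold, and data are all hypothetical.

```python
import numpy as np

def generate_rules(X, y, relevance_threshold=0.5):
    """Induce simple one-feature threshold rules from labeled training data,
    discarding features whose classes barely separate."""
    rules = []
    for j in range(X.shape[1]):
        pos, neg = X[y == 1, j], X[y == 0, j]
        spread = pos.std() + neg.std() + 1e-9
        relevance = abs(pos.mean() - neg.mean()) / spread
        if relevance < relevance_threshold:
            continue                      # irrelevant feature: filtered out
        threshold = 0.5 * (pos.mean() + neg.mean())
        direction = ">" if pos.mean() > neg.mean() else "<"
        rules.append((j, direction, threshold, relevance))
    return rules

# Hypothetical training data: feature 0 is informative, feature 1 is noise.
rng = np.random.default_rng(1)
X = np.column_stack([np.r_[rng.normal(2, 0.5, 50), rng.normal(0, 0.5, 50)],
                     rng.normal(0, 1, 100)])
y = np.r_[np.ones(50), np.zeros(50)]
for feat, op, thr, rel in generate_rules(X, y):
    print(f"IF feature[{feat}] {op} {thr:.2f} THEN class 1   (relevance {rel:.2f})")
```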

  17. Mixing Processes in High-Level Waste Tanks - Final Report

    SciTech Connect

    Peterson, P.F.

    1999-05-24

    Mixing processes in large, complex enclosures are modeled with one-dimensional differential equations, with transport in free and wall jets treated by standard integral techniques. With this goal in mind, we have constructed a simple, computationally efficient numerical tool, the Berkeley Mechanistic Mixing Model, which can be used to predict the transient evolution of fuel and oxygen concentrations in DOE high-level waste tanks following loss of ventilation, and we have validated the model against a series of experiments.
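
    The model itself is one-dimensional with jet entrainment; as a deliberately simplified illustration of the kind of transient balance it solves, the sketch below integrates a well-mixed (0-D) fuel/oxygen balance after loss of ventilation. The volume, generation rate, and ventilation flow are hypothetical placeholders.

```python
from scipy.integrate import solve_ivp

V = 2500.0        # headspace volume, m^3 (hypothetical)
Q_fuel = 5.0e-5   # fuel-gas generation rate, m^3/s (hypothetical)
Q_vent = 0.0      # ventilation flow after the loss-of-ventilation event, m^3/s

def rhs(t, y):
    x_fuel, x_o2 = y                                             # mole fractions
    dx_fuel = Q_fuel / V + (Q_vent / V) * (0.0 - x_fuel)         # generation, purge
    dx_o2 = -(Q_fuel / V) * x_o2 + (Q_vent / V) * (0.21 - x_o2)  # dilution, purge
    return [dx_fuel, dx_o2]

sol = solve_ivp(rhs, (0.0, 30 * 24 * 3600.0), [0.0, 0.21], max_step=3600.0)
print(f"after 30 days: fuel fraction = {sol.y[0, -1]:.3f}, O2 fraction = {sol.y[1, -1]:.3f}")
```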

  18. Handbook of high-level radioactive waste transportation

    SciTech Connect

    Sattler, L.R.

    1992-10-01

    The High-Level Radioactive Waste Transportation Handbook serves as a reference to which state officials and members of the general public may turn for information on radioactive waste transportation and on the federal government's system for transporting this waste under the Civilian Radioactive Waste Management Program. The Handbook condenses and updates information contained in the Midwestern High-Level Radioactive Waste Transportation Primer. It is intended primarily to assist legislators who, in the future, may be called upon to enact legislation pertaining to the transportation of radioactive waste through their jurisdictions. The Handbook is divided into two sections. The first section places the federal government's program for transporting radioactive waste in context. It provides background information on nuclear waste production in the United States and traces the emergence of federal policy for disposing of radioactive waste. The second section covers the history of radioactive waste transportation; summarizes major pieces of legislation pertaining to the transportation of radioactive waste; and provides an overview of the radioactive waste transportation program developed by the US Department of Energy (DOE). To supplement this information, a summary of pertinent federal and state legislation and a glossary of terms are included as appendices, as is a list of publications produced by the Midwestern Office of The Council of State Governments (CSG-MW) as part of the Midwestern High-Level Radioactive Waste Transportation Project.

  19. Materials Science of High-Level Nuclear Waste Immobilization

    SciTech Connect

    Weber, William J.; Navrotsky, Alexandra; Stefanovsky, S. V.; Vance, E. R.; Vernaz, Etienne Y.

    2009-01-09

    With the increasing demand for the development of more nuclear power comes the responsibility to address the technical challenges of immobilizing high-level nuclear wastes in stable solid forms for interim storage or disposition in geologic repositories. The immobilization of high-level nuclear wastes has been an active area of research and development for over 50 years. Borosilicate glasses and complex ceramic composites have been developed to meet many technical challenges and current needs, although regulatory issues, which vary widely from country to country, have yet to be resolved. Cooperative international programs to develop advanced proliferation-resistant nuclear technologies to close the nuclear fuel cycle and increase the efficiency of nuclear energy production might create new separation waste streams that could demand new concepts and materials for nuclear waste immobilization. This article reviews the current state-of-the-art understanding regarding the materials science of glasses and ceramics for the immobilization of high-level nuclear waste and excess nuclear materials and discusses approaches to address new waste streams.

  20. Overview of high-level waste management accomplishments

    SciTech Connect

    Lawroski, H; Berreth, J R; Freeby, W A

    1980-01-01

    Storage of power reactor spent fuel is necessary at present because of the lack of reprocessing operations, particularly in the U.S. Under the solidification and storage scenario described above, there is more than reasonable assurance that an acceptable, stable, solidified waste with a low heat generation rate can be produced and safely disposed of. The public perception that there is no waste disposal solution is being exploited by detractors of nuclear power. The inability to point to even one overall system demonstration lends credibility to the negative assertions. By delaying the gathering of on-line information to qualify repository sites and to implement a demonstration, the actions of the nuclear power detractors are self-serving in that they can continue to point out that there is no demonstration of satisfactory high-level waste disposal. By maintaining the liquid and solidified high-level waste in secure above-ground storage until acceptable decay heat generation rates are achieved, by producing a compatible, high-integrity solid waste form, by providing a second or even third barrier as a compound container, and by inserting the enclosed waste form in a qualified repository with spacing to assure moderately low-temperature disposal conditions, there appears to be no technical reason for not progressing further with the disposal of high-level wastes and the needed implementation of the complete nuclear power fuel cycle.

  1. High level cognitive information processing in neural networks

    NASA Technical Reports Server (NTRS)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized here, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  2. Precision feature point tracking method using a drift-correcting template update strategy

    NASA Astrophysics Data System (ADS)

    Peng, Xiaoming; Ma, Qian; Zhang, Qiheng; Chen, Wufan; Xu, Zhiyong

    2009-02-01

    In this paper we present a drift-correcting template update strategy for precisely tracking a feature point in 2D image sequences. The proposed strategy greatly extends Matthews et al.'s template tracking strategy [I. Matthews, T. Ishikawa and S. Baker, The template update problem, IEEE Trans. PAMI 26 (2004) 810-815.] by incorporating a robust non-rigid image registration step used in medical imaging. Matthews et al.'s strategy uses the first template to correct drift in the current template; however, drift can still build up if the first template becomes quite different from the current one as tracking continues. In our strategy the first template is updated in a timely manner whenever it becomes quite different from the current one, and the updated first template can thereafter be used to correct template drift in subsequent frames. The method based on the proposed strategy yields sub-pixel-accuracy tracking results, as measured by the commercial software REALVIZ(R) MatchMover(R) Pro 4.0. Our method runs fast on a desktop PC (3.0 GHz Pentium(R) IV CPU, 1 GB RAM, Windows(R) XP Professional operating system, Microsoft Visual C++ 6.0(R)), using about 0.03 seconds on average to track the feature point in a frame (under a general affine transformation model with a 61×61-pixel template) and, when required, less than 0.1 seconds to update the first template. We also propose an architecture for implementing our strategy in parallel.
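
    A minimal sketch of the drift-correcting idea (not the authors' registration-based implementation) is shown below: track with a frequently updated current template, re-align against the first template to cancel accumulated drift, and refresh the first template only when it has become too different from the current appearance. The OpenCV-based matching, whole-frame search, and refresh threshold are simplifications.

```python
import cv2

def locate(frame, template):
    """Return (x, y, score) of the best normalized-correlation match."""
    response = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, (x, y) = cv2.minMaxLoc(response)
    return x, y, score

def track(frames, x0, y0, size=61, refresh_score=0.6):
    """Track the size-by-size patch whose top-left corner is (x0, y0) in frames[0]."""
    first = frames[0][y0:y0 + size, x0:x0 + size].copy()   # drift-correcting reference
    current = first.copy()                                 # frequently updated template
    positions = []
    for frame in frames[1:]:
        # Coarse estimate with the current (frequently updated) template.
        x, y, _ = locate(frame, current)
        # Drift correction: re-estimate with the first template.
        xc, yc, score = locate(frame, first)
        if score >= refresh_score:
            x, y = xc, yc                                  # accept the drift-corrected position
        else:
            # The first template has become too different: refresh it.
            first = frame[y:y + size, x:x + size].copy()
        current = frame[y:y + size, x:x + size].copy()     # naive per-frame update
        positions.append((x, y))
    return positions
```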

  3. Feature Matching with Affine-Function Transformation Models.

    PubMed

    Li, Hongsheng; Huang, Xiaolei; Huang, Junzhou; Zhang, Shaoting

    2014-12-01

    Feature matching is an important problem and has extensive uses in computer vision. However, existing feature matching methods support either a specific or a small set of transformation models. In this paper, we propose a unified feature matching framework which supports a large family of transformation models. We call the family of transformation models the affine-function family, in which all transformations can be expressed by affine functions with convex constraints. In this framework, the goal is to recover transformation parameters for every feature point in a template point set to calculate their optimal matching positions in an input image. Given pairwise feature dissimilarity values between all points in the template set and the input image, we create a convex dissimilarity function for each template point. Composition of such convex functions with any transformation model in the affine-function family is shown to have an equivalent convex optimization form that can be optimized efficiently. Four example transformation models in the affine-function family are introduced to show the flexibility of our proposed framework. Our framework achieves 0.0 percent matching errors for both CMU House and Hotel sequences following the experimental setup in [6]. PMID:26353148

  4. Evidence for high-level feature encoding and persistent memory during auditory stream segregation.

    PubMed

    Weintraub, David M; Snyder, Joel S

    2015-12-01

    A test sequence of alternating low-frequency (A) and high-frequency (B) tones in a repeating ". . . ABAB . . ." pattern is more likely to be heard as 2 segregated streams of tones when it is preceded by an isofrequency inducer sequence whose frequency matches either the A- or B-tone frequency (e.g., ". . . BBBB . . .") of the test, a phenomenon referred to as stream biasing. Low-level processes such as stimulus-selective adaptation of frequency-tuned neurons within early auditory processing stages have been thought by some to mediate stream biasing; however, the current study tested for the involvement of higher level processes. Inducers whose frequency matched neither the A- nor B-tone frequency (e.g., ". . . CCCC . . .") sometimes facilitated stream biasing. Stream biasing was also sensitive to complex features of the inducer sequence, namely whether the rhythmic pattern of the inducer matched the rhythm of the ABAB test. Stream biasing occurred even when an 8-s silent interval separated the inducer and test sequences, a time span longer than previously recognized (Beauvois & Meddis, 1997). These results suggest the involvement of persistent activation of high-level representations that affect perception. PMID:26375631

  5. Cooperation of catalysts and templates

    NASA Technical Reports Server (NTRS)

    White, D. H.; Kanavarioti, A.; Nibley, C. W.; Macklin, J. W.

    1986-01-01

    In order to understand how self-reproducing molecules could have originated on the primitive Earth or extraterrestrial bodies, it would be useful to find laboratory models of simple molecules which are able to carry out processes of catalysis and templating. Furthermore, it may be anticipated that systems in which several components are acting cooperatively to catalyze each other's synthesis will have different behavior with respect to natural selection than those of purely replicating systems. As the major focus of this work, laboratory models are devised to study the influence of short peptide catalysts on template reactions which produce oligonucleotides or additional peptides. Such catalysts could have been the earliest protoenzymes of selective advantage produced by replicating oligonucleotides. Since this is a complex problem, simpler systems are also studied which embody only one aspect at a time, such as peptide formation with and without a template, peptide catalysis of nontemplated peptide synthesis, and model reactions for replication of the type pioneered by Orgel.

  6. Hierarchical model of matching

    NASA Technical Reports Server (NTRS)

    Pedrycz, Witold; Roventa, Eugene

    1992-01-01

    The issue of matching two fuzzy sets becomes an essential design aspect of many algorithms including fuzzy controllers, pattern classifiers, knowledge-based systems, etc. This paper introduces a new model of matching. Its principal features involve the following: (1) matching carried out with respect to the grades of membership of fuzzy sets as well as some functionals defined on them (like energy, entropy, transom); (2) concepts of hierarchies in the matching model leading to a straightforward distinction between 'local' and 'global' levels of matching; and (3) a distributed character of the model realized as a logic-based neural network.
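
    A toy illustration of the two-level idea, assuming nothing about the authors' actual formulas: match the grades of membership pointwise (the 'local' level), match scalar functionals such as energy and entropy (the 'global' level), and combine the two scores.

```python
import numpy as np

def energy(mu):                      # sum of membership grades
    return mu.sum()

def entropy(mu, eps=1e-12):          # a simple fuzziness measure
    m = np.clip(mu, eps, 1 - eps)
    return -(m * np.log(m) + (1 - m) * np.log(1 - m)).sum()

def local_match(a, b):               # pointwise similarity of grades, averaged
    return float(np.mean(1.0 - np.abs(a - b)))

def global_match(a, b):              # similarity of the functionals
    e = 1.0 - abs(energy(a) - energy(b)) / max(energy(a), energy(b), 1e-12)
    h = 1.0 - abs(entropy(a) - entropy(b)) / max(entropy(a), entropy(b), 1e-12)
    return 0.5 * (e + h)

def hierarchical_match(a, b, w_local=0.5):
    return w_local * local_match(a, b) + (1 - w_local) * global_match(a, b)

a = np.array([0.1, 0.6, 0.9, 0.7, 0.2])
b = np.array([0.2, 0.5, 0.8, 0.6, 0.1])
print(hierarchical_match(a, b))
```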

  7. Matching a Distribution by Matching Quantiles Estimation

    PubMed Central

    Sgouropoulos, Nikolaos; Yao, Qiwei; Yastremiz, Claudia

    2015-01-01

    Motivated by the problem of selecting representative portfolios for backtesting counterparty credit risks, we propose a matching quantiles estimation (MQE) method for matching a target distribution by that of a linear combination of a set of random variables. An iterative procedure based on the ordinary least-squares estimation (OLS) is proposed to compute MQE. MQE can be easily modified by adding a LASSO penalty term if a sparse representation is desired, or by restricting the matching within a certain range of quantiles to match a part of the target distribution. The convergence of the algorithm and the asymptotic properties of the estimation, both with and without LASSO, are established. A measure and an associated statistical test are proposed to assess the goodness-of-match. The finite sample properties are illustrated by simulation. An application in selecting a counterparty representative portfolio with a real dataset is reported. The proposed MQE also finds applications in portfolio tracking, which demonstrates the usefulness of combining MQE with LASSO. PMID:26692592
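
    A compact sketch of the iterative OLS idea described above (the published algorithm and its LASSO variant may differ in detail): at each step, sort the current portfolio values, pair them with the sorted target sample, and refit the weights by ordinary least squares.

```python
import numpy as np

def matching_quantiles(X, y, n_iter=50):
    """Choose weights w so that the sorted values of X @ w track the sorted
    values of the target sample y (an iterative-OLS sketch)."""
    y_sorted = np.sort(y)
    w = np.linalg.lstsq(X, y, rcond=None)[0]     # plain OLS as a starting point
    for _ in range(n_iter):
        order = np.argsort(X @ w)                # current ordering of the portfolio
        # Pair the j-th smallest portfolio value with the j-th target quantile
        # and refit by ordinary least squares.
        w_new = np.linalg.lstsq(X[order], y_sorted, rcond=None)[0]
        if np.allclose(w_new, w, atol=1e-10):
            break
        w = w_new
    return w

# Hypothetical example: approximate the distribution of y by a combination of 3 series.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 0.5 * rng.normal(size=500) + 1.0             # target sample (unpaired with X rows)
w = matching_quantiles(X, y)
print("weights:", w, "quantile RMSE:",
      np.sqrt(np.mean((np.sort(X @ w) - np.sort(y)) ** 2)))
```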

  8. Wide band gap semiconductor templates

    DOEpatents

    Arendt, Paul N.; Stan, Liliana; Jia, Quanxi; DePaula, Raymond F.; Usov, Igor O.

    2010-12-14

    The present invention relates to a thin film structure based on an epitaxial (111)-oriented rare earth-Group IVB oxide on the cubic (001) MgO terminated surface and the ion-beam-assisted deposition ("IBAD") techniques that are amenable to being overcoated by semiconductors with hexagonal crystal structures. The IBAD magnesium oxide ("MgO") technology, in conjunction with certain template materials, is used to fabricate the desired thin film array. Similarly, IBAD MgO with appropriate template layers can be used for semiconductors with cubic-type crystal structures.

  9. Higher Accuracy Template for Corner Cube Reflected Image

    SciTech Connect

    Awwal, A S; Rice, K L; Leach, R R; Taha, T M

    2008-07-11

    Video images of laser beams are analyzed to determine the position of the laser beams for alignment purposes in the National Ignition Facility (NIF). Algorithms process beam images to facilitate automated laser alignment. One such beam image, known as the corner cube reflected pinhole image, exhibits wide beam quality variations that are processed by a matched-filter-based algorithm. The challenge is to design a representative template that captures these variations while at the same time assuring accurate position determination. This paper describes the development of a new analytical template to accurately estimate the center of a beam with good image quality. The templates are constructed to exploit several key recurring features observed in the beam images. When the beam image quality is low, the algorithm chooses a template that contains fewer features. The algorithm was implemented on a Xilinx Virtex II Pro FPGA, which provides a speedup of about 6.4 times over a baseline 3 GHz Pentium 4 processor.

  10. Development of a High Level Waste Tank Inspection System

    SciTech Connect

    Appel, D.K.; Loibl, M.W.; Meese, D.C.

    1995-03-21

    The Westinghouse Savannah River Technology Center was requested by its sister site, West Valley Nuclear Service (WVNS), to develop a remote inspection system to gather wall thickness readings of their High Level Waste Tanks. WVNS management chose to take a proactive approach to gain current information on two tanks that had been in service since the early 1970s. The tanks contain high level waste, are buried underground, and have only two access ports to an annular space between the tank and the secondary concrete vault. A specialized remote system was proposed to provide both visual surveillance and ultrasonic thickness measurements of the tank walls. A magnetic-wheeled crawler was the basis for the remote delivery system, integrated with an off-the-shelf Ultrasonic Data Acquisition System. A development program was initiated for Savannah River Technology Center (SRTC) to design, fabricate, and test a remote system based on the crawler. The completed system involved three crawlers to perform the needed tasks: an Ultrasonic Crawler, a Camera Crawler, and a Surface Prep Crawler. The crawlers were computer controlled so that their operation could be done remotely and their position on the wall could be tracked. The Ultrasonic Crawler controls were interfaced with ABB Amdata's I-PC Ultrasonic Data Acquisition System so that thickness mapping of the wall could be obtained. A second system was requested by Westinghouse Savannah River Company (WSRC) to perform just ultrasonic mapping on their similar Waste Storage Tanks; however, the system needed to be interfaced with the P-scan Ultrasonic Data Acquisition System. Both remote inspection systems were completed 9/94. Qualification tests were conducted by WVNS prior to implementation on the actual tank, and tank deployment was achieved 10/94. The second inspection system was deployed at WSRC 11/94 with success, and the system is now in continuous service inspecting the remaining high level waste tanks at WSRC.

  11. High-level waste management technology program plan

    SciTech Connect

    Harmon, H.D.

    1995-01-01

    The purpose of this plan is to document the integrated technology program plan for the Savannah River Site (SRS) High-Level Waste (HLW) Management System. The mission of the SRS HLW System is to receive and store SRS high-level wastes in a safe and environmentally sound manner, and to convert these wastes into forms suitable for final disposal. These final disposal forms are borosilicate glass to be sent to the Federal Repository, Saltstone grout to be disposed of on site, and treated waste water to be released to the environment via a permitted outfall. Thus, the technology development activities described herein are those activities required to enable successful accomplishment of this mission. The technology program is based on specific needs of the SRS HLW System and organized following the systems engineering level 3 functions. Technology needs for each level 3 function are listed as reference, enhancements, and alternatives. Finally, FY-95 funding, deliverables, and schedules are summarized in Chapter IV, with details on the specific tasks that are funded in FY-95 provided in Appendix A. The information in this report represents the vision of activities as defined at the beginning of the fiscal year. Depending on emergent issues, funding changes, and other factors, programs and milestones may be adjusted during the fiscal year. The FY-95 SRS HLW technology program strongly emphasizes startup support for the Defense Waste Processing Facility and In-Tank Precipitation. Closure of technical issues associated with these operations has been given highest priority. Consequently, efforts on longer-term enhancements and alternatives are receiving minimal funding. However, High-Level Waste Management is committed to participation in the national Radioactive Waste Tank Remediation Technology Focus Area. 4 refs., 5 figs., 9 tabs.

  12. Overview of the Spanish high-level waste program

    SciTech Connect

    Ulibarri, A.; Beceiro, A.R.

    1995-12-31

    The Empresa Nacional de Residuos Radiactivos, S.A. (ENRESA) was set up in 1984 with the mandate to be responsible for the management of all radioactive wastes generated in Spain. The strategy and main guidelines of ENRESA's program to fulfill this mandate are contained in the General Radioactive Waste Plan (PGRR), a basic document which ENRESA is due to submit every year to the Ministry of Industry and Energy for Government approval. The Spanish nuclear electricity generating program consists of nine Light Water Reactors (LWR) with an overall capacity of 7.1 GWe, after the Vandellos 1 nuclear power plant was phased out in 1989. The spent nuclear fuel from LWRs is defined, in accordance with the 1983 National Energy Plan, as high level waste, and its management is accordingly focused on the direct disposal option. The spent nuclear fuel from Vandellos 1, a graphite gas-cooled reactor which was in operation from 1972 to 1989, is reprocessed abroad, and the wastes generated in the processes will be returned to Spain. The final objective of the Spanish High Level Waste program is to dispose of the spent nuclear fuel and high level vitrified waste in a deep geological repository. In fulfilling this target, taking into account the time frame in which it can reasonably be achieved, a previous step is necessary in order to secure the temporary storage of the spent fuel. This paper presents the strategy and a description of the different elements of the program currently under way as established in the fourth General Radioactive Waste Plan, which was approved by the Government in December 1994.

  13. High-level neutron coincidence counter maintenance manual

    SciTech Connect

    Swansen, J.; Collinsworth, P.

    1983-05-01

    High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included.

  14. CLASSIFICATION OF THE MGR DEFENSE HIGH LEVEL WASTE DISPOSAL CONTAINER

    SciTech Connect

    J.A. Ziegler

    1999-08-31

    The purpose of this analysis is to document the Quality Assurance (QA) classification of the Monitored Geologic Repository (MGR) defense high-level waste disposal container system structures, systems and components (SSCs) performed by the MGR Safety Assurance Department. This analysis also provides the basis for revision of YMP/90-55Q, Q-List (YMP 1998). The Q-List identifies those MGR SSCs subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (QARD) (DOE 1998).

  15. Market Designs for High Levels of Variable Generation: Preprint

    SciTech Connect

    Milligan, M.; Holttinen, H.; Kiviluoma, J.; Orths, A.; Lynch, M.; Soder, L.

    2014-10-01

    Variable renewable generation is increasing in penetration in modern power systems, leading to higher variability in the supply and price of electricity as well as lower average spot prices. This raises new challenges, particularly in ensuring sufficient capacity and flexibility from conventional technologies. Because the fixed costs and lifetimes of electricity generation investments are significant, designing markets and regulations that ensure the efficient integration of renewable generation is a significant challenge. This paper reviews the state of play of market designs for high levels of variable generation in the United States and Europe and considers new developments in both regions.

  16. High-level wastes: DOE names three sites for characterization

    SciTech Connect

    1986-07-01

    DOE announced in May 1986 that there would be three site characterization studies made to determine suitability for a high-level radioactive waste repository. The studies will include several test drillings to the proposed disposal depths. Yucca Mountain, Nevada; Deaf Smith County, Texas; and Hanford, Washington were identified as the study sites, and further studies for a second repository site in the East were postponed. The affected states all filed suits in federal circuit courts because they were given no advance warning of the announcement of their selection or the decision to suspend work on a second repository. Criticisms of the selection process include the narrowing of DOE options.

  17. Very-high-level neutral-beam control system

    SciTech Connect

    Elischer, V.; Jacobson, V.; Theil, E.

    1981-10-01

    As increasing numbers of neutral beams are added to fusion machines, their operation can consume a significant fraction of a facility's total resources. LBL has developed a very high level control system that allows a neutral beam injector to be treated as a black box with just 2 controls: one to set the beam power and one to set the pulse duration. This 2 knob view allows simple operation and provides a natural base for implementing even higher level controls such as automatic source conditioning.

  18. Spanish high level radioactive waste management system issues

    SciTech Connect

    Ulibarri, A.; Veganzones, A.

    1993-12-31

    The Empresa Nacional de Residuos Radiactivos, S.A. (ENRESA) was set up in 1984 as a state-owned limited liability company to be responsible for the management of all kinds of radioactive wastes in Spain. This paper provides an overview of the strategy and main lines of action stated in the third General Radioactive Waste Plan, currently in force, for the management of spent nuclear fuel and high-level wastes, as well as an outline of the main related projects, either being developed or foreseen. Aspects concerning the organizational structure, the economic and financing system, and international co-operation are also included.

  19. Corrosion and failure processes in high-level waste tanks

    SciTech Connect

    Mahidhara, R.K.; Elleman, T.S.; Murty, K.L.

    1992-11-01

    A large amount of radioactive waste has been stored safely at the Savannah River and Hanford sites over the past 46 years. The aim of this report is to review the experimental corrosion studies at Savannah River and Hanford, to identify the types and rates of corrosion encountered, and to indicate how these data contribute to tank failure predictions. The compositions of the high-level wastes, the mild steels used in the construction of the waste tanks, and the degradation modes, particularly stress corrosion cracking and pitting, are discussed. Current concerns at the Hanford Site are highlighted.

  20. Modern Alchemy: Solidifying high-level nuclear waste

    SciTech Connect

    Newton, C.C.

    1997-07-01

    The U.S. Department of Energy is putting a modern version of alchemy to work to produce an answer to a decades-old problem. It is taking place at the Savannah River Site (SRS) in Aiken, South Carolina and at the West Valley Demonstration Project (WVDP) near Buffalo, New York. At both locations, contractor Westinghouse Electric Corporation is applying technology that is turning liquid high-level radioactive waste (HLW) into a stabilized, durable glass for safer and easier management. The process is called vitrification. SRS and WVDP are now operating the nation's first full-scale HLW vitrification plants.

  1. Supply-Chain Optimization Template

    NASA Technical Reports Server (NTRS)

    Quiett, William F.; Sealing, Scott L.

    2009-01-01

    The Supply-Chain Optimization Template (SCOT) is an instructional guide for identifying, evaluating, and optimizing (including re-engineering) aerospace-oriented supply chains. The SCOT was derived from the Supply Chain Council's Supply-Chain Operations Reference (SCC SCOR) Model, which is more generic and more oriented toward achieving a competitive advantage in business.

  2. Technical variation in a sample of high level judo players.

    PubMed

    Franchini, Emerson; Sterkowicz, Stanislaw; Meira, Cassio Miranda Júnior; Gomes, Fabio Rodrigo Ferreira; Tani, Go

    2008-06-01

    Technical actions performed by two groups of judokas who won medals at World Championships and Olympic Games during the period 1995-2001 were analyzed. In the Super Elite group (n = 17) were the best athletes in each weight category. The Elite group (n = 16) were medal winners who were not champions and did not win more than three medals. Super Elite judokas used a greater number of throwing techniques which resulted in scores, even when expressed relative to the total number of matches performed, and these techniques were applied in more directions than those of Elite judokas. Further, the number of different throwing techniques and the variability of directions in which techniques were applied were significantly correlated with number of wins and the number of points and ippon scored. Thus, a greater number of throwing techniques and use of directions for attack seem to be important in increasing unpredictability during judo matches.

  3. Viral-templated Palladium Nanocatalysts

    NASA Astrophysics Data System (ADS)

    Yang, Cuixian

    Despite recent progress on nanocatalysis, there exist several critical challenges in simple and readily controllable nanocatalyst synthesis including the unpredictable particle growth, deactivation of catalytic activity, cumbersome catalyst recovery and lack of in-situ reaction monitoring. In this dissertation, two novel approaches are presented for the fabrication of viral-templated palladium (Pd) nanocatalysts, and their catalytic activities for dichromate reduction reaction and Suzuki Coupling reaction were thoroughly studied. In the first approach, viral template based bottom-up assembly is employed for the Pd nanocatalyst synthesis in a chip-based format. Specifically, genetically displayed cysteine residues on each coat protein of Tobacco Mosaic Virus (TMV) templates provide precisely spaced thiol functionalities for readily controllable surface assembly and enhanced formation of catalytically active Pd nanoparticles. Catalysts with the chip-based format allow for simple separation and in-situ monitoring of the reaction extent. Thorough examination of synthesis-structure-activity relationship of Pd nanoparticles formed on surface-assembled viral templates shows that Pd nanoparticle size, catalyst loading density and catalytic activity of viral-templated Pd nanocatalysts can be readily controlled simply by tuning the synthesis conditions. The viral-templated Pd nanocatalysts with optimized synthesis conditions are shown to have higher catalytic activity per unit Pd mass than the commercial Pd/C catalysts. Furthermore, tunable and selective surface assembly of TMV biotemplates is exploited to control the loading density and location of Pd nanocatalysts on solid substrates via preferential electroless deposition. In addition, the catalytic activities of surface-assembled TMV-templated Pd nanocatalysts were also investigated for the ligand-free Suzuki Coupling reaction under mild reaction conditions. The chip-based format enables simple catalyst separation and

  4. Spreadsheet Templates for Chemical Equilibrium Calculations.

    ERIC Educational Resources Information Center

    Joshi, Bhairav D.

    1993-01-01

    Describes two general spreadsheet templates to carry out all types of one-equation chemical equilibrium calculations encountered by students in undergraduate chemistry courses. Algorithms, templates, macros, and representative examples are presented to illustrate the approach. (PR)
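
    For readers outside a spreadsheet environment, the following few lines show the kind of one-equation equilibrium calculation such templates automate, here a weak-acid dissociation with illustrative values.

```python
import math

# One-equation equilibrium problem: HA <-> H+ + A-,  Ka = x^2 / (C0 - x).
# Values are illustrative (an acetic-acid-like Ka and a 0.10 M initial concentration).
Ka, C0 = 1.8e-5, 0.10

# Rearranged to x^2 + Ka*x - Ka*C0 = 0; keep the positive root.
x = (-Ka + math.sqrt(Ka**2 + 4 * Ka * C0)) / 2
print(f"[H+] = {x:.3e} M, pH = {-math.log10(x):.2f}")
```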

  5. Long-term high-level waste technology. Composite report

    NASA Astrophysics Data System (ADS)

    Cornman, W. R.

    1981-12-01

    Research and development studies on the immobilization of high-level wastes from the chemical reprocessing of nuclear reactor fuels are summarized. The reports are grouped under the following tasks: (1) program management and support; (2) waste preparation; (3) waste fixation; and (4) final handling. Some of the highlights are: leaching properties were obtained for titanate and tailored ceramic materials being developed at ICPP to immobilize zirconia calcine; comparative leach tests, hot-cell tests, and process evaluations were conducted on waste form alternatives to borosilicate glass for the immobilization of SRP high-level wastes; experiments were run at ANL to qualify neutron activation analysis and radioactive tracers for measuring leach rates from simulated waste glasses; comparative leach test samples of SYNROC D were prepared, characterized, and tested at LLNL; encapsulation of glass marbles with lead or lead alloys was demonstrated on an engineering scale at PNL; a canister for reference Commercial HLW was designed at PNL; a study of the optimization of salt-crete was completed at SRL; and a risk assessment showed that an investment for tornado dampers in the interim storage building of the DWPF is unjustified.

  6. Permitting plan for the high-level waste interim storage

    SciTech Connect

    Deffenbaugh, M.L.

    1997-04-23

    This document addresses the environmental permitting requirements for the transportation and interim storage of solidified high-level waste (HLW) produced during Phase 1 of the Hanford Site privatization effort. Solidified HLW consists of canisters containing vitrified HLW (glass) and containers that hold cesium separated during low-level waste pretreatment. The glass canisters and cesium containers will be transported to the Canister Storage Building (CSB) in a U.S. Department of Energy (DOE)-provided transportation cask via diesel-powered tractor trailer. Tri-Party Agreement (TPA) Milestone M-90 establishes a new major milestone, and associated interim milestones and target dates, governing acquisition and/or modification of facilities necessary for: (1) interim storage of Tank Waste Remediation Systems (TWRS) immobilized HLW (IHLW) and other canistered high-level waste forms; and (2) interim storage and disposal of TWRS immobilized low-activity tank waste (ILAW). An environmental requirements checklist and narrative was developed to identify the permitting path forward for the HLW interim storage (HLWIS) project (See Appendix B). This permitting plan will follow the permitting logic developed in that checklist.

  7. Local acceptance of a high-level nuclear waste repository.

    PubMed

    Sjöberg, Lennart

    2004-06-01

    The siting of nuclear waste facilities has been very difficult in all countries. Recent experience in Sweden indicates, however, that it may be possible, under certain circumstances, to gain local support for the siting of a high-level nuclear waste (HLNW) repository. The article reports on a study of attitudes and risk perceptions of people living in four municipalities in Sweden where HLNW siting was being intensely discussed at the political level, in media, and among the public. Data showed a relatively high level of consensus on acceptability of at least further investigation of the issue; in two cases local councils have since voted in favor of a go-ahead, and in one case only a very small majority defeated the issue. Models of policy attitudes showed that these were related to attitude to nuclear power, attributes of the perceived HLNW risk, and trust. Factors responsible for acceptance are discussed at several levels. One is the attitude to nuclear power, which is becoming more positive, probably because no viable alternatives are in sight. Other factors have to do with the extensive information programs conducted in these municipalities, and with the logical nature of the conclusion that they would be good candidates for hosting the national HLNW repository.

  8. Space augmentation of military high-level waste disposal

    NASA Technical Reports Server (NTRS)

    English, T.; Lees, L.; Divita, E.

    1979-01-01

    Space disposal of selected components of military high-level waste (HLW) is considered. This disposal option offers the promise of eliminating the long-lived radionuclides in military HLW from the earth. A space mission which meets the dual requirements of long-term orbital stability and a maximum of one space shuttle launch per week over a period of 20-40 years, is a heliocentric orbit about halfway between the orbits of earth and Venus. Space disposal of high-level radioactive waste is characterized by long-term predictability and short-term uncertainties which must be reduced to acceptably low levels. For example, failure of either the Orbit Transfer Vehicle after leaving low earth orbit, or the storable propellant stage failure at perihelion would leave the nuclear waste package in an unplanned and potentially unstable orbit. Since potential earth reencounter and subsequent burn-up in the earth's atmosphere is unacceptable, a deep space rendezvous, docking, and retrieval capability must be developed.

  9. FLUIDIZED BED STEAM REFORMING ENABLING ORGANIC HIGH LEVEL WASTE DISPOSAL

    SciTech Connect

    Williams, M

    2008-05-09

    Waste streams planned for generation by the Global Nuclear Energy Partnership (GNEP) and existing radioactive High Level Waste (HLW) streams containing organic compounds, such as the Tank 48H waste stream at Savannah River Site, have completed simulant and radioactive testing, respectively, by Savannah River National Laboratory (SRNL). GNEP waste streams will include up to 53 wt% organic compounds and nitrates up to 56 wt%. Decomposition of high nitrate streams requires reducing conditions, e.g. provided by organic additives such as sugar or coal, to reduce NOx in the off-gas to N2 to meet Clean Air Act (CAA) standards during processing. Thus, organics will be present during the waste form stabilization process regardless of the GNEP processes utilized, and they exist in some of the high level radioactive waste tanks at Savannah River Site and Hanford Tank Farms, e.g. organics in the feed or organics used for nitrate destruction. Waste streams containing high organic concentrations cannot be stabilized with the existing HLW Best Developed Available Technology (BDAT), which is HLW vitrification (HLVIT), unless the organics are removed by pretreatment. The alternative waste stabilization pretreatment process of Fluidized Bed Steam Reforming (FBSR) operates at moderate temperatures (650-750 C) compared to vitrification (1150-1300 C). The FBSR process has been demonstrated on GNEP simulated waste and on radioactive waste containing high organics from Tank 48H to convert organics to CAA-compliant gases, create no secondary liquid waste streams, and create a stable mineral waste form.

  10. Spent Fuel and High-Level Radioactive Waste Transportation Report

    SciTech Connect

    Not Available

    1992-03-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by SSEB in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  11. Spent fuel and high-level radioactive waste transportation report

    SciTech Connect

    Not Available

    1990-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  12. Spent fuel and high-level radioactive waste transportation report

    SciTech Connect

    Not Available

    1989-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  13. Burning high-level TRU waste in fusion fission reactors

    NASA Astrophysics Data System (ADS)

    Shen, Yaosong

    2016-09-01

    Recently, the concept of actinide burning, rather than a once-through fuel cycle, for the disposal of spent nuclear fuel has been attracting much more attention. A new method of burning high-level transuranic (TRU) waste combined with Thorium-Uranium (Th-U) fuel in subcritical reactors driven by external fusion neutron sources is proposed in this paper. The thorium-based TRU fuel burns all of the long-lived actinides via a hard neutron spectrum while outputting power. A one-dimensional model of the reactor concept was built by means of the ONESN_BURN code with new data libraries. The numerical results include actinide radioactivity, biological hazard potential, and a much higher burnup rate of high-level transuranic waste. The comparison of the fusion-fission reactor with the thermal reactor shows that the harder neutron spectrum is more efficient than the softer one. The Th-U cycle produces less TRU, less radiotoxicity and fewer long-lived actinides. The Th-U cycle provides breeding of 233U over a long operation time (>20 years), hence significantly reducing the reactivity swing while improving safety and burnup.
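
    As a toy illustration only (nothing like the coupled transport/burnup calculation performed by ONESN_BURN), the sketch below integrates a constant-flux depletion equation for a single TRU nuclide and a daughter; the flux and cross-sections are hypothetical.

```python
from scipy.integrate import solve_ivp

phi = 1.0e15              # neutron flux, n/cm^2/s (hypothetical hard spectrum)
barn = 1.0e-24            # cm^2
sigma_tru = 2.0 * barn    # absorption cross-section of the TRU nuclide (illustrative)
sigma_cap = 0.3 * barn    # capture path feeding a daughter nuclide (illustrative)

def rhs(t, N):
    tru, daughter = N
    d_tru = -sigma_tru * phi * tru
    d_daughter = sigma_cap * phi * tru - 0.5 * sigma_tru * phi * daughter
    return [d_tru, d_daughter]

years = 20.0
t_end = years * 3.156e7   # seconds
sol = solve_ivp(rhs, (0.0, t_end), [1.0, 0.0], max_step=t_end / 200.0)
print(f"TRU remaining after {years:.0f} y: {sol.y[0, -1]:.1%} of the initial inventory")
```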

  14. How to achieve high-level expression of microbial enzymes

    PubMed Central

    Liu, Long; Yang, Haiquan; Shin, Hyun-dong; Chen, Rachel R.; Li, Jianghua; Du, Guocheng; Chen, Jian

    2013-01-01

    Microbial enzymes have been used in a large number of fields, such as chemical, agricultural and biopharmaceutical industries. The enzyme production rate and yield are the main factors to consider when choosing the appropriate expression system for the production of recombinant proteins. Recombinant enzymes have been expressed in bacteria (e.g., Escherichia coli, Bacillus and lactic acid bacteria), filamentous fungi (e.g., Aspergillus) and yeasts (e.g., Pichia pastoris). The favorable and very advantageous characteristics of these species have resulted in an increasing number of biotechnological applications. Bacterial hosts (e.g., E. coli) can be used to quickly and easily overexpress recombinant enzymes; however, bacterial systems cannot express very large proteins and proteins that require post-translational modifications. The main bacterial expression hosts, with the exception of lactic acid bacteria and filamentous fungi, can produce several toxins which are not compatible with the expression of recombinant enzymes in food and drugs. However, due to the multiplicity of the physiological impacts arising from high-level expression of genes encoding the enzymes and expression hosts, the goal of overproduction can hardly be achieved, and therefore, the yield of recombinant enzymes is limited. In this review, the recent strategies used for the high-level expression of microbial enzymes in the hosts mentioned above are summarized and the prospects are also discussed. We hope this review will contribute to the development of the enzyme-related research field. PMID:23686280

  15. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kreutz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1996-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point to name two major advantages.
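
    A schematic sketch of the layered idea described above, not the patented system: statements in a small robot-oriented language such as MOVE and GRASP are expanded into lower-level primitives, which can be run in a simulation mode before actual execution. The primitive names, command format, and simulation flag are all invented for illustration.

```python
# Hypothetical mapping from high-level statements to lower-level primitives.
LOW_LEVEL_PRIMITIVES = {
    "MOVE":  lambda arm, target: [("plan_path", arm, target), ("follow_path", arm)],
    "GRASP": lambda arm, obj:    [("open_gripper", arm), ("approach", arm, obj),
                                  ("close_gripper", arm)],
}

def translate(program):
    """Expand high-level statements into an ordered list of primitives."""
    low_level = []
    for statement in program:
        op, *args = statement
        low_level.extend(LOW_LEVEL_PRIMITIVES[op](*args))
    return low_level

def execute(primitives, simulate=True):
    """Run in simulation mode first, then for real, as the abstract describes."""
    for step in primitives:
        mode = "SIM " if simulate else "EXEC"
        print(mode, step)

program = [("MOVE", "arm1", (0.4, 0.1, 0.3)), ("GRASP", "arm1", "sample_canister")]
steps = translate(program)
execute(steps, simulate=True)    # dry run to check the sequence
execute(steps, simulate=False)   # actual execution
```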

  16. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kruetz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1994-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point to name two major advantages.

  17. Behavior construction and refinement from high-level specifications

    NASA Astrophysics Data System (ADS)

    Martignoni, Andrew J., III; Smart, William D.

    2004-12-01

    Mobile robots are excellent examples of systems that need to show a high level of autonomy. Often robots are loosely supervised by humans who are not intimately familiar with the inner workings of the robot. We cannot generally predict exact environmental conditions in which the robot will operate in advance. This means that the behavior must be adapted in the field. Untrained individuals cannot (and probably should not) program the robot to effect these changes. We need a system that will (a) allow re-tasking, and (b) allow adaptation of the behavior to the specific conditions in the field. In this paper we concentrate on (b). We will describe how to assemble controllers, based on high-level descriptions of the behavior. We will show how the behavior can be tuned by the human, despite not knowing how the code is put together. We will also show how this can be done automatically, using reinforcement learning, and point out the problems that must be overcome for this approach to work.

  18. Accurate Position Sensing of Defocused Beams Using Simulated Beam Templates

    SciTech Connect

    Awwal, A; Candy, J; Haynam, C; Widmayer, C; Bliss, E; Burkhart, S

    2004-09-29

    In position detection using matched filtering, one is faced with the challenge of determining the best position in the presence of distortions such as defocus and diffraction noise. This work evaluates the performance of simulated defocused images as the template against the real defocused beam. It was found that an amplitude-modulated phase-only filter is better equipped to deal with real defocused images that suffer from diffraction noise effects, which result in a textured spot intensity pattern. It is shown that there is a performance tradeoff dependent on the type and size of the defocused image. A novel automated system was developed that automatically selects the appropriate template type and size. Results of this automation for real defocused images are presented.
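
    To make the filter family concrete, the sketch below shows one plausible frequency-domain implementation of amplitude-modulated phase-only matched filtering for locating a spot; the modulation exponent alpha, the zero-padding, and the peak-picking step are assumptions for illustration, not the authors' exact design.

      # Illustrative sketch of frequency-domain template matching with an
      # amplitude-modulated phase-only filter. The exponent `alpha` and the
      # peak-location criterion are assumptions, not the paper's exact filter.
      import numpy as np

      def ampof_correlate(image, template, alpha=0.3):
          """Cross-correlate `image` with a filter built from `template`."""
          F_img = np.fft.fft2(image)
          F_tpl = np.fft.fft2(template, s=image.shape)   # zero-pad to image size
          mag = np.abs(F_tpl)
          # Phase-only filter modulated by a compressed amplitude term.
          filt = (mag ** alpha) * np.exp(-1j * np.angle(F_tpl))
          return np.real(np.fft.ifft2(F_img * filt))

      def locate_beam(image, template):
          corr = ampof_correlate(image, template)
          return np.unravel_index(np.argmax(corr), corr.shape)  # peak (row, col)

      if __name__ == "__main__":
          # Toy usage: a Gaussian "spot" hidden in noise.
          y, x = np.mgrid[0:128, 0:128]
          spot = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / 50.0)
          tpl = np.exp(-((np.arange(32)[None, :] - 16) ** 2
                         + (np.arange(32)[:, None] - 16) ** 2) / 50.0)
          img = np.roll(np.roll(spot, 10, axis=0), -5, axis=1) \
                + 0.05 * np.random.randn(128, 128)
          print(locate_beam(img, tpl))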

  19. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  20. Software Template for Instruction in Mathematics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Moebes, Travis A.; Beall, Anna

    2005-01-01

    Intelligent Math Tutor (IMT) is a software system that serves as a template for creating software for teaching mathematics. IMT can be easily connected to artificial-intelligence software and other analysis software through input and output of files. IMT provides an easy-to-use interface for generating courses that include tests that contain both multiple-choice and fill-in-the-blank questions, and enables tracking of test scores. IMT makes it easy to generate software for Web-based courses or to manufacture compact disks containing executable course software. IMT also can function as a Web-based application program, with features that run quickly on the Web, while retaining the intelligence of a high-level language application program with many graphics. IMT can be used to write application programs in text, graphics, and/or sound, so that the programs can be tailored to the needs of most handicapped persons. The course software generated by IMT follows a "back to basics" approach of teaching mathematics by inducing the student to apply creative mathematical techniques in the process of learning. Students thereby discover mathematical fundamentals and come to understand mathematics more deeply than they could through simple memorization.

  1. New stereo matching algorithm

    NASA Astrophysics Data System (ADS)

    Ahmed, Yasser A.; Afifi, Hossam; Rubino, Gerardo

    1999-05-01

    This paper presents a new algorithm for stereo matching. The main idea is to decompose the original problem into independent, hierarchical, and more elementary problems that can be solved faster without any complicated mathematics, using BBD. To achieve that, we use a new image feature called the 'continuity feature' instead of classical noise. This feature can be extracted from any kind of image by a simple process and without using a searching technique. A new matching technique is proposed to match the continuity feature. The new algorithm resolves the main disadvantages of feature-based stereo matching algorithms.

  2. AN EXPRESSION TEMPLATE AWARE LAMBDA FUNCTION

    SciTech Connect

    S. A. SMITH; J. STRIEGNITZ

    2000-09-19

    The authors show how the paradigms of lambda functions and expression templates fit together in order to provide a means to increase the expressiveness of existing STL algorithms. They demonstrate how the expression templates approach could be extended in order to work with built-in types. To be portable, their solution is based on the Portable Expression Template Engine (PETE), which is a framework that enables the development of expression template aware classes.

  3. Inexpensive casing-supported drilling templates

    SciTech Connect

    Harrington, J.P.; Williams, L.M.

    1986-06-01

    Three types of inexpensive casing-supported templates have been designed and developed for use as spacer templates for tieback operations or for simple subsea completions. Two of the template concepts have been used in three areas of the southern North Sea. The design principles and running procedures are described, and the design of ancillary equipment used in the diverless installation of these templates is also illustrated.

  4. Balancing act of template bank construction: Inspiral waveform template banks for gravitational-wave detectors and optimizations at fixed computational cost

    NASA Astrophysics Data System (ADS)

    Keppel, Drew

    2013-06-01

    Gravitational-wave searches for signals from inspiraling compact binaries have relied on matched filtering banks of waveforms (called template banks) to try to extract the signal waveforms from the detector data. These template banks have been constructed using four main considerations: the region of parameter space of interest, the sensitivity of the detector, the matched filtering bandwidth, and the sensitivity one is willing to lose due to the granularity of template placement, the latter of which is governed by the minimal match. In this work we describe how the choice of the lower frequency cutoff, the lower end of the matched filter frequency band, can be optimized for detection. We also show how the minimal match can be optimally chosen in the case of limited computational resources. These techniques are applied to searches for binary neutron star signals that have been previously performed when analyzing Initial LIGO and Virgo data and will be performed analyzing Advanced LIGO and Advanced Virgo data using the expected detector sensitivity. By following the algorithms put forward here, the volume sensitivity of these searches is predicted to improve without increasing the computational cost of performing the search.
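
    For reference, the match and minimal match referred to above are conventionally defined through a noise-weighted inner product; the expressions below use generic textbook notation and are not necessarily the paper's exact conventions.

      % Noise-weighted inner product and match (generic notation).
      \begin{align}
        \langle a \mid b \rangle &= 4\,\mathrm{Re}\int_{f_{\mathrm{low}}}^{f_{\mathrm{high}}}
            \frac{\tilde{a}(f)\,\tilde{b}^{*}(f)}{S_n(f)}\,\mathrm{d}f, \\
        M(\theta_1,\theta_2) &= \max_{t_0,\phi_0}
            \frac{\langle h(\theta_1) \mid h(\theta_2;t_0,\phi_0) \rangle}
                 {\sqrt{\langle h(\theta_1) \mid h(\theta_1) \rangle\,
                        \langle h(\theta_2) \mid h(\theta_2) \rangle}}.
      \end{align}

    Here S_n(f) is the detector noise power spectral density and f_low is the lower frequency cutoff discussed in the abstract. A bank built with minimal match MM guarantees M >= MM for any in-band signal, so the retained detection volume is roughly at least MM^3 of the ideal value.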

  5. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
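
    A compact statement of the bound discussed above, in generic notation (a sketch of the standard argument rather than a quotation from the paper):

      % Likelihood-ratio bound for a queried contributor with match probability p_M.
      \[
        \mathrm{LR} \;=\; \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} \;\le\; \frac{1}{p_M}
        \qquad\text{since}\qquad
        \Pr(E \mid H_d) \;\ge\; p_M\,\Pr(E \mid H_p).
      \]

    The inequality holds because, with probability p_M, an unknown contributor under H_d carries the same genotype as the queried person, in which case the evidence is exactly as probable as under H_p. Drop-out and drop-in in low-template replicates lower Pr(E | H_p), so the LR sits below the bound and, as the abstract describes, approaches 1/p_M as replicates accumulate.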

  6. High Level Waste System Impacts from Acid Dissolution of Sludge

    SciTech Connect

    KETUSKY, EDWARD

    2006-04-20

    This research evaluates the ability of OLI© equilibrium based software to forecast Savannah River Site High Level Waste system impacts from oxalic acid dissolution of Tank 1-15 sludge heels. Without further laboratory and field testing, only the use of oxalic acid can be considered plausible to support sludge heel dissolution on multiple tanks. Using OLI© and available test results, a dissolution model is constructed and validated. Material and energy balances, coupled with the model, identify potential safety concerns. Overpressurization and overheating are shown to be unlikely. Corrosion induced hydrogen could, however, overwhelm the tank ventilation. While pH adjustment can restore the minimal hydrogen generation, resultant precipitates will notably increase the sludge volume. OLI© is used to develop a flowsheet such that additional sludge vitrification canisters and other negative system impacts are minimized. Sensitivity analyses are used to assess the processability impacts from variations in the sludge/quantities of acids.

  7. High level radioactive waste vitrification process equipment component testing

    NASA Astrophysics Data System (ADS)

    Siemens, D. H.; Heath, W. C.; Larson, D. E.; Craig, S. N.; Berger, D. N.; Goles, R. W.

    1985-04-01

    Remote operability and maintainability of vitrification equipment were assessed under shielded cell conditions. The equipment tested will be applied to immobilize high-level and transuranic liquid waste slurries that resulted from plutonium production for defense weapons. Equipment tested included: a turntable for handling waste canisters under the melter; a removable discharge cone in the melter overflow section; a thermocouple jumper that extends into a shielded cell; remote instrument and electrical connectors; remote, mechanical, and heat transfer aspects of the melter glass overflow section; a reamer to clean out plugged nozzles in the melter top; a closed circuit camera to view the melter interior; and a device to retrieve samples of the glass product. A test was also conducted to evaluate liquid metals for use in a liquid metal sealing system.

  8. Characterization of composite ceramic high level waste forms.

    SciTech Connect

    Frank, S. M.; Bateman, K. J.; DiSanto, T.; Johnson, S. G.; Moschetti, T. L.; Noy, M. H.; O'Holleran, T. P.

    1997-12-05

    Argonne National Laboratory has developed a composite ceramic waste form for the disposition of high level radioactive waste produced during electrometallurgical conditioning of spent nuclear fuel. The electrorefiner LiCl/KCl eutectic salt, containing fission products and transuranics in the chloride form, is contacted with a zeolite material which removes the fission products from the salt. After salt contact, the zeolite is mixed with a glass binder. The zeolite/glass mixture is then hot isostatic pressed (HIPed) to produce the composite ceramic waste form. The ceramic waste form provides a durable medium that is well suited to incorporate fission products and transuranics in the chloride form. Presented are preliminary results of the process qualification and characterization studies, which include chemical and physical measurements and product durability testing, of the ceramic waste form.

  9. Calculates Neutron Production in Canisters of High-level Waste

    1993-01-15

    ALPHN calculates the (alpha,n) neutron production rate of a canister of vitrified high-level waste. The user supplies the chemical composition of the glass or glass-ceramic and the curies of the alpha-emitting actinides present. The output of the program gives the (alpha,n) neutron production of each actinide in neutrons per second and the total for the canister. The (alpha,n) neutron production rates are source terms only; that is, they are production rates within the glass and do not take into account the shielding effect of the glass. For a given glass composition, the user can calculate up to eight cases simultaneously; these cases are based on the same glass composition but contain different quantities of actinides per canister.
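
    The sketch below illustrates the per-actinide bookkeeping implied by this description; the nuclide inventory and the neutrons-per-alpha yields are placeholder values (real yields depend on the glass composition and come from the program's own data), so only the structure of the calculation is meant to be representative.

      # Minimal sketch of the (alpha,n) source-term bookkeeping described above.
      # The per-actinide yields (neutrons per alpha decay in this matrix) are
      # PLACEHOLDER values for illustration only.
      CURIE = 3.7e10  # decays per second per curie (exact by definition)

      def alpha_n_source(actinide_curies, yields_per_alpha):
          """Return per-actinide and total (alpha,n) neutron production [n/s]."""
          per_actinide = {}
          for nuclide, curies in actinide_curies.items():
              y = yields_per_alpha[nuclide]        # neutrons per alpha in this glass
              per_actinide[nuclide] = curies * CURIE * y
          return per_actinide, sum(per_actinide.values())

      if __name__ == "__main__":
          # Hypothetical canister inventory (curies) and placeholder yields.
          inventory = {"Am-241": 120.0, "Pu-238": 45.0, "Cm-244": 8.0}
          yields = {"Am-241": 1.0e-7, "Pu-238": 1.2e-7, "Cm-244": 1.5e-7}
          per, total = alpha_n_source(inventory, yields)
          print(per, total)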

  10. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  11. Review of High Level Waste Tanks Ultrasonic Inspection Data

    SciTech Connect

    Wiersma, B

    2006-03-09

    A review of the data collected during ultrasonic inspection of the Type I high level waste tanks has been completed. The data was analyzed for relevance to the possibility of vapor space corrosion and liquid/air interface corrosion. The review of the Type I tank UT inspection data has confirmed that the vapor space general corrosion is not an unusually aggressive phenomenon and correlates well with predicted corrosion rates for steel exposed to bulk solution. The corrosion rates are seen to decrease with time as expected. The review of the temperature data did not reveal any obvious correlations between high temperatures and the occurrences of leaks. The complex nature of temperature-humidity interaction, particularly with respect to vapor corrosion, requires further understanding to infer any correlation. The review of the waste level data also did not reveal any obvious correlations.

  12. Remote ignitability analysis of high-level radioactive waste

    SciTech Connect

    Lundholm, C.W.; Morgan, J.M.; Shurtliff, R.M.; Trejo, L.E.

    1992-09-01

    The Idaho Chemical Processing Plant (ICPP) was used to reprocess nuclear fuel from government-owned reactors to recover the unused uranium-235. These processes generated highly radioactive liquid wastes, which are stored in large underground tanks prior to being calcined into a granular solid. The Resource Conservation and Recovery Act (RCRA) and state/federal clean air statutes require waste characterization of these high-level radioactive wastes for regulatory permitting and waste treatment purposes. The determination of the characteristic of ignitability is part of the required analyses prior to calcination and waste treatment. To perform this analysis in a radiologically safe manner, a remotely operated instrument was needed. The remote ignitability method and instrument will meet the 60°C requirement prescribed for ignitability in Method 1020 of SW-846, and the method for remote use will be equivalent to Method 1020 of SW-846.

  13. High-Level Language Production in Parkinson's Disease: A Review

    PubMed Central

    Altmann, Lori J. P.; Troche, Michelle S.

    2011-01-01

    This paper discusses impairments of high-level, complex language production in Parkinson's disease (PD), defined as sentence and discourse production, and situates these impairments within the framework of current psycholinguistic theories of language production. The paper comprises three major sections, an overview of the effects of PD on the brain and cognition, a review of the literature on language production in PD, and a discussion of the stages of the language production process that are impaired in PD. Overall, the literature converges on a few common characteristics of language production in PD: reduced information content, impaired grammaticality, disrupted fluency, and reduced syntactic complexity. Many studies also document the strong impact of differences in cognitive ability on language production. Based on the data, PD affects all stages of language production including conceptualization and functional and positional processing. Furthermore, impairments at all stages appear to be exacerbated by impairments in cognitive abilities. PMID:21860777

  14. High level radioactive waste vitrification process equipment component testing

    SciTech Connect

    Siemens, D.H.; Heath, W.O.; Larson, D.E.; Craig, S.N.; Berger, D.N.; Goles, R.W.

    1985-04-01

    Remote operability and maintainability of vitrification equipment were assessed under shielded-cell conditions. The equipment tested will be applied to immobilize high-level and transuranic liquid waste slurries that resulted from plutonium production for defense weapons. Equipment tested included: a turntable for handling waste canisters under the melter; a removable discharge cone in the melter overflow section; a thermocouple jumper that extends into a shielded cell; remote instrument and electrical connectors; remote, mechanical, and heat transfer aspects of the melter glass overflow section; a reamer to clean out plugged nozzles in the melter top; a closed circuit camera to view the melter interior; and a device to retrieve samples of the glass product. A test was also conducted to evaluate liquid metals for use in a liquid metal sealing system.

  15. Linearization of the Fermilab recycler high level RF

    SciTech Connect

    Joseph E Dey; Tom Kubicki; John Reid

    2003-05-28

    In studying the Recycler high-level RF, nonlinearities in magnitude and phase were discovered at 89 kHz, the lowest frequency required by the system. The visible evidence of this was that beam injected into a barrier bucket had a definite slope at the top. Using a network analyzer, the S-parameter S21 was measured for the overall system, and from mathematical modeling a second-order numerator and denominator transfer function was found. The inverse of this transfer function gives the linearization transfer function, which was realized in hardware by summing high-pass, band-pass, and low-pass filters together. The resulting magnitude and phase plots, along with the actual beam response, will be shown.
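
    In generic terms, the linearization described above inverts the fitted second-order transfer function and realizes the inverse as a parallel filter sum; the symbols below are placeholders, not the measured Recycler coefficients.

      % Generic form of the linearization (placeholder coefficients).
      \[
        H(s) = \frac{b_2 s^2 + b_1 s + b_0}{a_2 s^2 + a_1 s + a_0}
        \quad\Longrightarrow\quad
        H_{\mathrm{lin}}(s) = H^{-1}(s)
        = \frac{a_2 s^2 + a_1 s + a_0}{b_2 s^2 + b_1 s + b_0}.
      \]
      % Writing the inverse's denominator (up to a constant factor) as
      % D(s) = s^2 + (\omega_0/Q)s + \omega_0^2, the inverse expands as
      \[
        H_{\mathrm{lin}}(s) =
        k_{\mathrm{LP}}\frac{\omega_0^2}{D(s)}
        + k_{\mathrm{BP}}\frac{(\omega_0/Q)\,s}{D(s)}
        + k_{\mathrm{HP}}\frac{s^2}{D(s)},
      \]

    which is consistent with the hardware realization described in the abstract: a low-pass, band-pass, and high-pass filter summed in parallel.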

  16. 4.5 Meter high level waste canister study

    SciTech Connect

    Calmus, R. B.

    1997-10-01

    The Tank Waste Remediation System (TWRS) Storage and Disposal Project has established the Immobilized High-Level Waste (IHLW) Storage Sub-Project to provide the capability to store Phase I and II HLW products generated by private vendors. A design/construction project, Project W-464, was established under the Sub-Project to provide the Phase I capability. Project W-464 will retrofit the Hanford Site Canister Storage Building (CSB) to accommodate the Phase I IHLW products. Project W-464 conceptual design is currently being performed to interim store 3.0 m-long IHLW stainless steel canisters with a 0.61 m diameter. DOE is considering using a 4.5 m canister of the same diameter to reduce permanent disposal costs. This study was performed to assess the impact of replacing the 3.0 m canister with the 4.5 m canister. The summary cost and schedule impacts are described.

  17. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  18. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    SciTech Connect

    Stone, M.; Eibling, R.; Koopman, D.; Lambert, D.; Burket, P.

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository interment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants are formulated by removing the radioactive species and renormalizing the remaining species. Supernate composition is shown in Table 2.

  19. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are set and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18) which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  20. High levels of subgenomic HCV plasma RNA in immunosilent infections

    PubMed Central

    Bernardin, Flavien; Stramer, Susan; Rehermann, Barbara; Page-Shafer, Kimberly; Cooper, Stewart; Bangsberg, David; Hahn, Judith; Tobler, Leslie; Busch, Michael; Delwart, Eric

    2007-01-01

    A genetic analysis of hepatitis C virus (HCV) in rare blood donors who remained HCV seronegative despite long-term high-level viremia revealed the chronic presence of HCV genomes with large in-frame deletions in their structural genes. Full-length HCV genomes were only detected as minority variants. In one human immunodeficiency virus (HIV) co-infected donor the truncated HCV genome transiently decreased in frequency concomitant with delayed seroconversion and re-emerged following partial seroreversion. The long-term production of heavily truncated HCV genomes in vivo suggests that these viruses retained the necessary elements for RNA replication while the deleted structural functions necessary for their spread in vivo were provided in trans by wild type helper virus in co-infected cells. The absence of immunological pressure and a high viral load may therefore promote the emergence of truncated HCV subgenomic replicons in vivo. PMID:17493654

  1. Exceptionally high levels of multiple mating in an army ant

    NASA Astrophysics Data System (ADS)

    Denny, A. Jay; Franks, Nigel R.; Powell, Scott; Edwards, Keith J.

    Most species of social insects have singly mated queens, although there are notable exceptions. Competing hypotheses have been proposed to explain the evolution of high levels of multiple mating, but this issue is far from resolved. Here we use microsatellites to investigate mating frequency in the army ant Eciton burchellii and show that queens mate with an exceptionally large number of males, eclipsing all but one other social insect species for which data are available. In addition we present evidence that suggests that mating is serial, continuing throughout the lifetime of the queen. This is the first demonstration of serial mating among social hymenoptera. We propose that high paternity within colonies is most likely to have evolved to increase genetic diversity and to counter high pathogen and parasite loads.

  2. Socioeconomic studies of high-level nuclear waste disposal.

    PubMed Central

    White, G F; Bronzini, M S; Colglazier, E W; Dohrenwend, B; Erikson, K; Hansen, R; Kneese, A V; Moore, R; Page, E B; Rappaport, R A

    1994-01-01

    The socioeconomic investigations of possible impacts of the proposed repository for high-level nuclear waste at Yucca Mountain, Nevada, have been unprecedented in several respects. They bear on the public decision that sooner or later will be made as to where and how to dispose permanently of the waste presently at military weapons installations and that continues to accumulate at nuclear power stations. No final decision has yet been made. There is no clear precedent from other countries. The organization of state and federal studies is unique. The state studies involve more disciplines than any previous efforts. They have been carried out in parallel to federal studies and have pioneered in defining some problems and appropriate research methods. A recent annotated bibliography provides interested scientists with a compact guide to the 178 published reports, as well as to relevant journal articles and related documents. PMID:7971963

  3. DOE Matching Grant Program

    SciTech Connect

    Dr Marvin Adams

    2002-03-01

    OAK 270 - The DOE Matching Grant Program provided $50,000.00 to the Dept of N.E. at TAMU, matching a gift of $50,000.00 from TXU Electric. The $100,000.00 total was spent on scholarships, departmental labs, and computing network.

  4. High-level hepatitis B virus replication in transgenic mice.

    PubMed Central

    Guidotti, L G; Matzke, B; Schaller, H; Chisari, F V

    1995-01-01

    Hepatitis B virus (HBV) transgenic mice whose hepatocytes replicate the virus at levels comparable to that in the infected livers of patients with chronic hepatitis have been produced, without any evidence of cytopathology. High-level viral gene expression was obtained in the liver and kidney tissues in three independent lineages. These animals were produced with a terminally redundant viral DNA construct (HBV 1.3) that starts just upstream of HBV enhancer I, extends completely around the circular viral genome, and ends just downstream of the unique polyadenylation site in HBV. In these animals, the viral mRNA is more abundant in centrilobular hepatocytes than elsewhere in the hepatic lobule. High-level viral DNA replication occurs inside viral nucleocapsid particles that preferentially form in the cytoplasm of these centrilobular hepatocytes, suggesting that an expression threshold must be reached for nucleocapsid assembly and viral replication to occur. Despite the restricted distribution of the viral replication machinery in centrilobular cytoplasmic nucleocapsids, nucleocapsid particles are detectable in the vast majority of hepatocyte nuclei throughout the hepatic lobule. The intranuclear nucleocapsid particles are empty, however, suggesting that viral nucleocapsid particle assembly occurs independently in the nucleus and the cytoplasm of the hepatocyte and implying that cytoplasmic nucleocapsid particles do not transport the viral genome across the nuclear membrane into the nucleus during the viral life cycle. This model creates the opportunity to examine the influence of viral and host factors on HBV pathogenesis and replication and to assess the antiviral potential of pharmacological agents and physiological processes, including the immune response. PMID:7666518

  5. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  6. Matched-pair classification

    SciTech Connect

    Theiler, James P

    2009-01-01

    Following an analogous distinction in statistical hypothesis testing, we investigate variants of machine learning where the training set comes in matched pairs. We demonstrate that even conventional classifiers can exhibit improved performance when the input data has a matched-pair structure. Online algorithms, in particular, converge quicker when the data is presented in pairs. In some scenarios (such as the weak signal detection problem), matched pairs can be generated from independent samples, with the effect not only doubling the nominal size of the training set, but of providing the structure that leads to better learning. A family of 'dipole' algorithms is introduced that explicitly takes advantage of matched-pair structure in the input data and leads to further performance gains. Finally, we illustrate the application of matched-pair learning to chemical plume detection in hyperspectral imagery.
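
    As a generic illustration of why matched-pair structure helps (this is not the authors' 'dipole' algorithm), the sketch below scores within-pair difference vectors, in which clutter shared by the two members of a pair cancels; all array and function names are hypothetical.

      # Generic matched-pair illustration: classify on within-pair differences,
      # which removes nuisance variation common to both members of a pair.
      import numpy as np

      def pair_difference_direction(pairs):
          """pairs: array (n, 2, d); member 0 is assumed to carry the target signal.
          Returns a linear scoring direction fit to the within-pair differences."""
          diffs = pairs[:, 0, :] - pairs[:, 1, :]   # shared clutter cancels here
          return diffs.mean(axis=0)

      def score(w, pair):
          """Positive score: the first member of `pair` looks like the target."""
          return float(np.dot(w, pair[0] - pair[1]))

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n, d = 500, 8
          clutter = rng.normal(scale=3.0, size=(n, 1, d))  # strong, shared in a pair
          noise = rng.normal(scale=1.0, size=(n, 2, d))
          pairs = clutter + noise
          pairs[:, 0, :] += 0.5                            # weak signal on member 0
          w = pair_difference_direction(pairs)
          acc = np.mean([score(w, p) > 0 for p in pairs])
          print("fraction of pairs scored correctly:", acc)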

  7. The matching law

    PubMed Central

    Killeen, Peter

    1972-01-01

    The matching law may be viewed either as an empirical generalization, and thereby subject to disproof, or as part of a system of equations used to define the utility (“value”) of a reinforcer. In the latter case it is tautologous, and not subject to disproof within the defining context. A failure to obtain matching will most often be a signal that the independent variables have not been properly scaled. If, however, the proper transformations have been made on the independent variables, and matching is not obtained, the experimental paradigm may be outside the purview of the matching law. At that point, reinterpretations or revisions of the law are called for. The theoretical matching law is but one of many possible ways to define utility, and it may eventually be rejected in favor of a more useful definition. PMID:16811604
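
    For reference, the strict and generalized forms of the matching law in standard textbook notation (the paper's own notation may differ):

      % Strict matching and the generalized matching law (generic notation).
      \[
        \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
        \qquad\text{or equivalently}\qquad
        \frac{B_1}{B_2} = \frac{R_1}{R_2},
        \qquad\qquad
        \frac{B_1}{B_2} = b\left(\frac{R_1}{R_2}\right)^{a},
      \]

    where B_i is the rate of responding on alternative i, R_i is the rate of reinforcement obtained from it, b captures bias, and a captures sensitivity; a = b = 1 recovers strict matching, and systematic departures from a = 1 correspond to the failures of matching that the abstract attributes to improperly scaled independent variables or to cases outside the law's purview.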

  8. Metal nanodisks using bicellar templates

    SciTech Connect

    Song, Yujiang; Shelnutt, John A

    2013-12-03

    Metallic nanodisks and a method of making them. The metallic nanodisks are wheel-shaped structures that provide large surface areas for catalytic applications. The metallic nanodisks are grown within bicelles (disk-like micelles) that template the growth of the metal in the form of approximately circular dendritic sheets. The zero-valent metal forming the nanodisks is formed by reduction of a metal ion using a suitable electron donor species.

  9. Distorted colloidal arrays as designed template

    NASA Astrophysics Data System (ADS)

    Yu, Ye; Zhou, Ziwei; Möhwald, Helmuth; Ai, Bin; Zhao, Zhiyuan; Ye, Shunsheng; Zhang, Gang

    2015-01-01

    In this paper, a novel type of colloidal template with broken symmetry was generated using commercial, inductively coupled plasma reactive ion etching (ICP-RIE). With proper but simple treatment, the traditional symmetric non-close-packed colloidal template evolves into an elliptical profile with high uniformity. This unique feature can add flexibility to colloidal lithography and/or other lithography techniques using colloidal particles as building blocks to fabricate nano-/micro-structures with broken symmetry. Beyond that, the novel colloidal template we developed possesses on-site tunability, i.e., transformability from a symmetric into an asymmetric template. Sandwich-type particles with eccentric features were fabricated utilizing this tunable template. This distinguishing feature will provide the possibility to fabricate structures with unique asymmetric features using one set of colloidal templates, providing flexibility and broad tunability to enable nano-/micro-structure fabrication with colloidal templates.

  10. Titanium template for scaphoid reconstruction.

    PubMed

    Haefeli, M; Schaefer, D J; Schumacher, R; Müller-Gerbl, M; Honigmann, P

    2015-06-01

    Reconstruction of a non-united scaphoid with a humpback deformity involves resection of the non-union followed by bone grafting and fixation of the fragments. Intraoperative control of the reconstruction is difficult owing to the complex three-dimensional shape of the scaphoid and the other carpal bones overlying the scaphoid on lateral radiographs. We developed a titanium template that fits exactly to the surfaces of the proximal and distal scaphoid poles to define their position relative to each other after resection of the non-union. The templates were designed on three-dimensional computed tomography reconstructions and manufactured using selective laser melting technology. Ten conserved human wrists were used to simulate the reconstruction. The achieved precision measured as the deviation of the surface of the reconstructed scaphoid from its virtual counterpart was good in five cases (maximal difference 1.5 mm), moderate in one case (maximal difference 3 mm) and inadequate in four cases (difference more than 3 mm). The main problems were attributed to the template design and can be avoided by improved pre-operative planning, as shown in a clinical case. PMID:25167978

  11. Template synthesis of nanophase mesocarbon.

    PubMed

    Yang, Nancy Y; Jian, Kengqing; Külaots, Indrek; Crawford, Gregory P; Hurt, Robert H

    2003-10-01

    Templating techniques are used increasingly to create carbon materials with precisely engineered pore systems. This article presents a new templating technique that achieves simultaneous control of pore structure and molecular (crystal) structure in a single synthesis step. With the use of discotic liquid crystalline precursors, unique carbon structures can be engineered by selecting the size and geometry of the confining spaces and selecting the template material to induce edge-on or face-on orientation of the discotic precursor. Here mesophase pitch is infiltrated by capillary forces into a nanoporous glass followed by slow carbonization and NaOH etching. The resulting porous carbon material exhibits interconnected solid grains about 100 nm in size, a monodisperse pore size of 60 nm, 42% total porosity, and an abundance of edge-plane inner surfaces that reflect the favored edge-on anchoring of the mesophase precursor on glass. This new carbon form is potentially interesting for a number of important applications in which uniform large pores, active-site-rich surfaces, and easy access to interlayer spaces in nanometric grains are advantageous.

  12. Securing Biometric Templates Where Similarity Is Measured with Set Intersection

    NASA Astrophysics Data System (ADS)

    Socek, Daniel; Božović, Vladimir; Ćulibrk, Dubravko

    A novel scheme for securing biometric templates of variable size and order is proposed. The proposed scheme is based on a new similarity measure approach, namely the set intersection, which strongly resembles the methodology used in most of the current state-of-the-art biometrics matching systems. The applicability of the new scheme is compared with that of the existing principal schemes, and it is shown that the new scheme has definite advantages over the existing approaches. The proposed scheme is analyzed both in terms of security and performance.
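
    To make the similarity measure concrete, here is a minimal sketch of a set-intersection score over unordered, variable-size feature sets; the overlap-coefficient normalization and the acceptance threshold are assumptions, since the abstract states only that similarity is measured with set intersection.

      # Sketch of a set-intersection similarity for variable-size, unordered
      # biometric feature sets. The normalization (overlap coefficient) and the
      # threshold are assumptions made for this illustration.
      def set_similarity(features_a, features_b):
          a, b = set(features_a), set(features_b)
          if not a or not b:
              return 0.0
          return len(a & b) / min(len(a), len(b))   # overlap coefficient in [0, 1]

      def matches(features_a, features_b, threshold=0.6):
          return set_similarity(features_a, features_b) >= threshold

      if __name__ == "__main__":
          enrolled = {"m12", "m47", "m81", "m93", "m105"}  # hypothetical quantized features
          probe    = {"m12", "m47", "m81", "m200"}
          print(set_similarity(enrolled, probe), matches(enrolled, probe))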

  13. Latent fingerprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2011-01-01

    Latent fingerprint identification is of critical importance to law enforcement agencies in identifying suspects: Latent fingerprints are inadvertent impressions left by fingers on surfaces of objects. While tremendous progress has been made in plain and rolled fingerprint matching, latent fingerprint matching continues to be a difficult problem. Poor quality of ridge impressions, small finger area, and large nonlinear distortion are the main difficulties in latent fingerprint matching compared to plain or rolled fingerprint matching. We propose a system for matching latent fingerprints found at crime scenes to rolled fingerprints enrolled in law enforcement databases. In addition to minutiae, we also use extended features, including singularity, ridge quality map, ridge flow map, ridge wavelength map, and skeleton. We tested our system by matching 258 latents in the NIST SD27 database against a background database of 29,257 rolled fingerprints obtained by combining the NIST SD4, SD14, and SD27 databases. The minutiae-based baseline rank-1 identification rate of 34.9 percent was improved to 74 percent when extended features were used. In order to evaluate the relative importance of each extended feature, these features were incrementally used in the order of their cost in marking by latent experts. The experimental results indicate that singularity, ridge quality map, and ridge flow map are the most effective features in improving the matching accuracy.

  14. Spent nuclear fuel project high-level information management plan

    SciTech Connect

    Main, G.C.

    1996-09-13

    This document presents the results of the Spent Nuclear Fuel Project (SNFP) Information Management Planning Project (IMPP), a short-term project that identified information management (IM) issues and opportunities within the SNFP and outlined a high-level plan to address them. This high-level plan for the SNFP IM focuses on specific examples from within the SNFP. The plan's recommendations can be characterized in several ways. Some recommendations address specific challenges that the SNFP faces. Others form the basis for making smooth transitions in several important IM areas. Still others identify areas where further study and planning are indicated. The team's knowledge of developments in the IM industry and at the Hanford Site was crucial in deciding where to recommend that the SNFP act and where they should wait for Site plans to be made. Because of the fast pace of the SNFP and demands on SNFP staff, input and interaction were primarily between the IMPP team and members of the SNFP Information Management Steering Committee (IMSC). Key input to the IMPP came from a workshop where IMSC members and their delegates developed a set of draft IM principles. These principles, described in Section 2, became the foundation for the recommendations found in the transition plan outlined in Section 5. Availability of SNFP staff was limited, so project documents were used as a basis for much of the work. The team, realizing that the status of the project and the environment are continually changing, tried to keep abreast of major developments since those documents were generated. To the extent possible, the information contained in this document is current as of the end of fiscal year (FY) 1995. Programs and organizations on the Hanford Site as a whole are trying to maximize their return on IM investments. They are coordinating IM activities and trying to leverage existing capabilities. However, the SNFP cannot just rely on Sitewide activities to meet its IM requirements

  15. CEMENTITIOUS GROUT FOR CLOSING SRS HIGH LEVEL WASTE TANKS - #12315

    SciTech Connect

    Langton, C.; Burns, H.; Stefanko, D.

    2012-01-10

    In 1997, the first two United States Department of Energy (US DOE) high level waste tanks (Tanks 17-F and 20-F: Type IV, single shell tanks) were taken out of service (permanently closed) at the Savannah River Site (SRS). In 2012, the DOE plans to remove from service two additional Savannah River Site (SRS) Type IV high-level waste tanks, Tanks 18-F and 19-F. These tanks were constructed in the late 1950's and received low-heat waste and do not contain cooling coils. Operational closure of Tanks 18-F and 19-F is intended to be consistent with the applicable requirements of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and will be performed in accordance with South Carolina Department of Health and Environmental Control (SCDHEC) requirements. The closure will physically stabilize two 4.92E+03 cubic meter (1.3E+06 gallon) carbon steel tanks and isolate and stabilize any residual contaminants left in the tanks. The closure will also fill, physically stabilize and isolate ancillary equipment abandoned in the tanks. A Performance Assessment (PA) has been developed to assess the long-term fate and transport of residual contamination in the environment resulting from the operational closure of the F-Area Tank Farm (FTF) waste tanks. Next generation flowable, zero-bleed cementitious grouts were designed, tested, and specified for closing Tanks 18-F and 19-F and for filling the abandoned equipment. Fill requirements were developed for both the tank and equipment grouts. All grout formulations were required to be alkaline with a pH of 12.4 and a chemical reduction potential (Eh) of -200 to -400 mV to stabilize selected potential contaminants of concern. This was achieved by including Portland cement and Grade 100 slag in the mixes, respectively. Ingredients and proportions of cementitious reagents were selected and adjusted, respectively, to support the mass placement strategy developed by closure

  16. Crystalline plutonium hosts derived from high-level waste formulations.

    SciTech Connect

    O'Holleran, T. P.

    1998-04-24

    The Department of Energy has selected immobilization for disposal in a repository as one approach for disposing of excess plutonium (1). Materials for immobilizing weapons-grade plutonium for repository disposal must meet the ''spent fuel standard'' by providing a radiation field similar to spent fuel (2). Such a radiation field can be provided by incorporating fission products from high-level waste into the waste form. Experiments were performed to evaluate the feasibility of incorporating high-level waste (HLW) stored at the Idaho Chemical Processing Plant (ICPP) into plutonium dispositioning materials to meet the spent fuel standard. A variety of materials and preparation techniques were evaluated based on prior experience developing waste forms for immobilizing HLW. These included crystalline ceramic compositions prepared by conventional sintering and hot isostatic pressing (HIP), and glass formulations prepared by conventional melting. Because plutonium solubility in silicate melts is limited, glass formulations were intentionally devitrified to partition plutonium into crystalline host phases, thereby allowing increased overall plutonium loading. Samarium, added as a representative rare earth neutron absorber, also tended to partition into the plutonium host phases. Because the crystalline plutonium host phases are chemically more inert, the plutonium is more effectively isolated from the environment, and its attractiveness for proliferation is reduced. In the initial phase of evaluating each material and preparation method, cerium was used as a surrogate for plutonium. For promising materials, additional preparation experiments were performed using plutonium to verify the behavior of cerium as a surrogate. These experiments demonstrated that cerium performed well as a surrogate for plutonium. For the most part, cerium and plutonium partitioned onto the same crystalline phases, and no anomalous changes in oxidation state were observed. The only observed

  17. Interventions for Individuals With High Levels of Needle Fear

    PubMed Central

    Noel, Melanie; Taddio, Anna; Antony, Martin M.; Asmundson, Gordon J.G.; Riddell, Rebecca Pillai; Chambers, Christine T.; Shah, Vibhuti

    2015-01-01

    Background: This systematic review evaluated the effectiveness of exposure-based psychological and physical interventions for the management of high levels of needle fear and/or phobia and fainting in children and adults. Design/Methods: A systematic review identified relevant randomized and quasi-randomized controlled trials of children, adults, or both with high levels of needle fear, including phobia (if not available, then populations with other specific phobias were included). Critically important outcomes were self-reported fear specific to the feared situation and stimulus (psychological interventions) or fainting (applied muscle tension). Data were pooled using standardized mean difference (SMD) or relative risk with 95% confidence intervals. Results: The systematic review included 11 trials. In vivo exposure-based therapy for children 7 years and above showed benefit on specific fear (n=234; SMD: −1.71 [95% CI: −2.72, −0.7]). In vivo exposure-based therapy with adults reduced fear of needles posttreatment (n=20; SMD: −1.09 [−2.04, −0.14]) but not at 1-year follow-up (n=20; SMD: −0.28 [−1.16, 0.6]). Compared with single session, a benefit was observed for multiple sessions of exposure-based therapy posttreatment (n=93; SMD: −0.66 [−1.08, −0.24]) but not after 1 year (n=83; SMD: −0.37 [−0.87, 0.13]). Non in vivo (e.g., imaginal) exposure-based therapy in children reduced specific fear posttreatment (n=41; SMD: −0.88 [−1.7, −0.05]) and at 3 months (n=24; SMD: −0.89 [−1.73, −0.04]). Non in vivo exposure-based therapy for adults showed benefit on specific fear (n=68; SMD: −0.62 [−1.11, −0.14]) but not procedural fear (n=17; SMD: 0.18 [−0.87, 1.23]). Applied tension showed benefit on fainting posttreatment (n=20; SMD: −1.16 [−2.12, −0.19]) and after 1 year (n=20; SMD: −0.97 [−1.91, −0.03]) compared with exposure alone. Conclusions: Exposure-based psychological interventions and applied muscle tension show
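
    For readers following the statistics quoted above, the sketch below computes a standardized mean difference and an approximate 95% confidence interval with the standard pooled-SD formulas; the review may use a slightly different estimator (e.g., a small-sample correction), so this is illustrative only.

      # Standard (Cohen's-d style) standardized mean difference with an
      # approximate 95% CI; hypothetical inputs for illustration only.
      import math

      def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
          """Return (SMD, lower95, upper95) for two independent groups."""
          sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
          d = (m1 - m2) / sp                                    # pooled-SD SMD
          se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
          return d, d - 1.96 * se, d + 1.96 * se

      if __name__ == "__main__":
          # Hypothetical fear scores: treated group lower than control.
          print(smd_with_ci(m1=3.1, sd1=1.2, n1=40, m2=4.8, sd2=1.4, n2=38))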

  18. Defense High-Level Waste Leaching Mechanisms Program. Final report

    SciTech Connect

    Mendel, J.E.

    1984-08-01

    The Defense High-Level Waste Leaching Mechanisms Program brought six major US laboratories together for three years of cooperative research. The participants reached a consensus that solubility of the leached glass species, particularly solubility in the altered surface layer, is the dominant factor controlling the leaching behavior of defense waste glass in a system in which the flow of leachant is constrained, as it will be in a deep geologic repository. Also, once the surface of waste glass is contacted by ground water, the kinetics of establishing solubility control are relatively rapid. The concentrations of leached species reach saturation, or steady-state concentrations, within a few months to a year at 70 to 90°C. Thus, reaction kinetics, which were the main subject of earlier leaching mechanisms studies, are now shown to assume much less importance. The dominance of solubility means that the leach rate is, in fact, directly proportional to ground water flow rate. Doubling the flow rate doubles the effective leach rate. This relationship is expected to obtain in most, if not all, repository situations.

  19. Attenuation of high-level impulses by earmuffs.

    PubMed

    Zera, Jan; Mlynski, Rafal

    2007-10-01

    Attenuation of high-level acoustic impulses (noise reduction) by various types of earmuffs was measured using a laboratory source of type A impulses and an artificial test fixture compatible with the ISO 4869-3 standard. The measurements were made for impulses of peak sound-pressure levels (SPLs) from 150 to 170 dB. The rise time and A duration of the impulses depended on their SPL and were within a range of 12-400 μs (rise time) and 0.4-1.1 ms (A duration). The results showed that earmuff peak level attenuation increases by about 10 dB when the impulse's rise time and the A duration are reduced. The results also demonstrated that the signals under the earmuff cup have a longer rise and A duration than the original impulses recorded outside the earmuff. Results of the measurements were used to check the validity of various hearing damage risk criteria that specify the maximum permissible exposure to impulse noise. The present data lead to the conclusion that procedures in which hearing damage risk is assessed only from signal attenuation, without taking into consideration changes in the signal waveform under the earmuff, tend to underestimate the risk of hearing damage. PMID:17902846

  20. ATW system impact on high-level waste

    SciTech Connect

    Arthur, E.D.

    1992-12-01

    This report discusses the Accelerator Transmutation of Waste (ATW) concept which aims at destruction of key long-lived radionuclides in high-level nuclear waste (HLW), both fission products and actinides. This focus makes it different from most other transmutation concepts which concentrate primarily on actinide burning. The ATW system uses an accelerator-driven, sub-critical assembly to create an intense thermal neutron environment for radionuclide transmutation. This feature allows rapid transmutation under low-inventory system conditions, which in turn, has a direct impact on the size of chemical separations and materials handling components of the system. Inventories in ATW are factors of eight to thirty times smaller than reactor systems of equivalent thermal power. Chemical separations systems are relatively small in scale and can be optimized to achieve high decontamination factors and minimized waste streams. The low-inventory feature also directly impacts material amounts remaining in the system at its end of life. In addition to its low-inventory operation, the accelerator-driven neutron source features of ATW are key to providing a sufficient level of neutrons to allow transmutation of long-lived fission products.

  1. NOx AND HETEROGENEITY EFFECTS IN HIGH LEVEL WASTE (HLW)

    SciTech Connect

    Meisel, Dan; Camaioni, Donald M.; Orlando, Thom

    2000-06-01

    We summarize contributions from our EMSP supported research to several field operations of the Office of Environmental Management (EM). In particular we emphasize its impact on safety programs at the Hanford and other EM sites where storage, maintenance and handling of HLW is a major mission. In recent years we were engaged in coordinated efforts to understand the chemistry initiated by radiation in HLW. Three projects of the EMSP ("The NOx System in Nuclear Waste," "Mechanisms and Kinetics of Organic Aging in High Level Nuclear Wastes" [D. Camaioni, PI], and "Interfacial Radiolysis Effects in Tanks Waste" [T. Orlando, PI]) were involved in that effort, which included a team at Argonne, later moved to the University of Notre Dame, and two teams at the Pacific Northwest National Laboratory. Much effort was invested in integrating the results of the scientific studies into the engineering operations via coordination meetings and participation in various stages of the resolution of some of the outstanding safety issues at the sites. However, in this Abstract we summarize the effort at Notre Dame.

  2. Cytotoxicity assessment of residual high-level disinfectants.

    PubMed

    Ryu, Mizuyuki; Kobayashi, Toshihiro; Kawamukai, Emiko; Quan, Glenlelyn; Furuta, Taro

    2013-01-01

    Some studies show the uptake of disinfectants on medical devices but no studies on their cytotoxicity have been reported. This study aimed to assess that cytotoxicity in a 3-dimensional culture system using HeLa cells grown in matrices composed of collagen. Plastic materials were soaked in the use solutions of the widely used high-level disinfectants, glutaraldehyde (GA), ortho-phthalaldehyde (OPA) and peracetic acid (PAA). After being rinsed, they were allowed to dry and were embedded into the cell medium to investigate the cytotoxicity of the residual disinfectants. Cytotoxicity was observed with the polyvinyl chloride, polyurethane and silicon tubes soaked in GA and OPA, indicating that both disinfectants were absorbed in the test pieces, whereas for PAA, none was observed. As for the polytetrafluoroethylene (PTFE) tubes, no disinfectant displayed cytotoxicity. GA and OPA are primary irritants, having a potential to cause anaphylaxis and other forms of allergic reactions. There should be consideration not only about the toxicity of the residual disinfectant from poor rinsing, but also about the toxicity that would result from the disinfectants that were absorbed and consequently released from the medical devices or materials.

  3. Wind resource quality affected by high levels of renewables

    DOE PAGES

    Diakov, Victor

    2015-06-17

    For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system is increasing, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailments effectively change the practical measure of resource quality from capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. The Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value for the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.

  4. Wind resource quality affected by high levels of renewables

    SciTech Connect

    Diakov, Victor

    2015-06-17

    For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system increases, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailments effectively change the practical measure of resource quality from capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. The Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value for the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.

  5. Pupil dilation dynamics track attention to high-level information.

    PubMed

    Kang, Olivia E; Huffer, Katherine E; Wheatley, Thalia P

    2014-01-01

    It has long been thought that the eyes index the inner workings of the mind. Consistent with this intuition, empirical research has demonstrated that pupils dilate as a consequence of attentional effort. Recently, Smallwood et al. (2011) demonstrated that pupil dilations not only provide an index of overall attentional effort, but are time-locked to stimulus changes during attention (but not during mind-wandering). This finding suggests that pupil dilations afford a dynamic readout of conscious information processing. However, because stimulus onsets in their study involved shifts in luminance as well as information, they could not determine whether this coupling of stimulus and pupillary dynamics reflected attention to low-level (luminance) or high-level (information) changes. Here, we replicated the methodology and findings of Smallwood et al. (2011) while controlling for luminance changes. When presented with isoluminant digit sequences, participants' pupillary dilations were synchronized with stimulus onsets when attending, but not when mind-wandering. This replicates Smallwood et al. (2011) and clarifies their finding by demonstrating that stimulus-pupil coupling reflects online cognitive processing beyond sensory gain.

  6. THERMAL ANALYSIS OF GEOLOGIC HIGH-LEVEL RADIOACTIVE WASTE PACKAGES

    SciTech Connect

    Hensel, S.; Lee, S.

    2010-04-20

    The engineering design of disposal of the high level waste (HLW) packages in a geologic repository requires a thermal analysis to provide the temperature history of the packages. Calculated temperatures are used to demonstrate compliance with criteria for waste acceptance into the geologic disposal gallery system and as input to assess the transient thermal characteristics of the vitrified HLW Package. The objective of the work was to evaluate the thermal performance of the supercontainer containing the vitrified HLW in a non-backfilled and unventilated underground disposal gallery. In order to achieve the objective, transient computational models for a geologic vitrified HLW package were developed by using a computational fluid dynamics method, and calculations for the HLW disposal gallery of the current Belgian geological repository reference design were performed. An initial two-dimensional model was used to conduct some parametric sensitivity studies to better understand the geologic system's thermal response. The effect of heat decay, number of co-disposed supercontainers, domain size, humidity, thermal conductivity and thermal emissivity were studied. Later, a more accurate three-dimensional model was developed by considering the conduction-convection cooling mechanism coupled with radiation, and the effect of the number of supercontainers (3, 4 and 8) was studied in more detail, as well as a bounding case with zero heat flux at both ends. The modeling methodology and results of the sensitivity studies will be presented.
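
    As a loose illustration of the kind of transient thermal response being modeled, the sketch below solves 1-D conduction away from a decaying heat source with an explicit finite-difference scheme. It is not the reported CFD model; the geometry, material properties, heat flux, and decay half-life are placeholder assumptions.

```python
import math

# Minimal sketch (not the reported CFD model): 1-D transient conduction away
# from a decaying heat source, solved with an explicit finite-difference
# scheme. Geometry, material properties, flux and half-life are placeholders.

L, n = 5.0, 100                          # domain length (m), grid points
dx = L / (n - 1)
alpha = 1.0e-6                           # assumed thermal diffusivity (m^2/s)
k = 1.7                                  # assumed thermal conductivity (W/m/K)
q0 = 25.0                                # assumed initial wall heat flux (W/m^2)
half_life = 30.0 * 365.0 * 86400.0       # assumed decay half-life (s)
dt = 0.4 * dx * dx / alpha               # stable explicit time step

T = [25.0] * n                           # initial temperature (deg C)
t = 0.0
for _ in range(50000):                   # roughly 1.6 years of simulated time
    q = q0 * math.exp(-math.log(2.0) * t / half_life)
    Tn = T[:]
    # prescribed-flux boundary at the package wall (node 0, ghost-node form)
    Tn[0] = T[0] + 2.0 * alpha * dt / dx**2 * (T[1] - T[0] + q * dx / k)
    for i in range(1, n - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i + 1] - 2.0 * T[i] + T[i - 1])
    Tn[-1] = 25.0                        # far field held at ambient
    T, t = Tn, t + dt

print("peak temperature (deg C): %.1f" % max(T))
```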

  7. High levels of molecular chlorine in the Arctic atmosphere

    NASA Astrophysics Data System (ADS)

    Liao, Jin; Huey, L. Gregory; Liu, Zhen; Tanner, David J.; Cantrell, Chris A.; Orlando, John J.; Flocke, Frank M.; Shepson, Paul B.; Weinheimer, Andrew J.; Hall, Samuel R.; Ullmann, Kirk; Beine, Harry J.; Wang, Yuhang; Ingall, Ellery D.; Stephens, Chelsea R.; Hornbrook, Rebecca S.; Apel, Eric C.; Riemer, Daniel; Fried, Alan; Mauldin, Roy L.; Smith, James N.; Staebler, Ralf M.; Neuman, J. Andrew; Nowak, John B.

    2014-02-01

    Chlorine radicals can function as a strong atmospheric oxidant, particularly in polar regions, where levels of hydroxyl radicals are low. In the atmosphere, chlorine radicals expedite the degradation of methane and tropospheric ozone, and the oxidation of mercury to more toxic forms. Here we present direct measurements of molecular chlorine levels in the Arctic marine boundary layer in Barrow, Alaska, collected in the spring of 2009 over a six-week period using chemical ionization mass spectrometry. We report high levels of molecular chlorine, of up to 400 pptv. Concentrations peaked in the early morning and late afternoon, and fell to near-zero levels at night. Average daytime molecular chlorine levels were correlated with ozone concentrations, suggesting that sunlight and ozone are required for molecular chlorine formation. Using a time-dependent box model, we estimate that the chlorine radicals produced from the photolysis of molecular chlorine oxidized more methane than hydroxyl radicals, on average, and enhanced the abundance of short-lived peroxy radicals. Elevated hydroperoxyl radical levels, in turn, promoted the formation of hypobromous acid, which catalyses mercury oxidation and the breakdown of tropospheric ozone. We therefore suggest that molecular chlorine exerts a significant effect on the atmospheric chemistry of the Arctic.
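
    The sketch below shows the structure of a simple time-dependent box model of the kind mentioned, with Cl2 production and photolysis both switched by sunlight. The rates and the assumed air number density are hypothetical placeholders, not the study's values, and the toy model is far too crude to reproduce the observed morning and late-afternoon peaks.

```python
import math

# Toy time-dependent box model (rates are hypothetical, not the study's
# values): d[Cl2]/dt = P(t) - J(t)*[Cl2], with production P and photolysis J
# both switched by a crude half-sine representation of sunlight.

def sunlight(t_hours):
    """Normalized solar intensity: zero at night, half-sine from 06:00 to 18:00."""
    h = t_hours % 24.0
    return max(0.0, math.sin(math.pi * (h - 6.0) / 12.0))

P_max = 1.0e7        # molecules cm^-3 s^-1, hypothetical midday Cl2 production
J_max = 2.0e-3       # s^-1, hypothetical midday Cl2 photolysis frequency
AIR = 2.5e19         # molecules cm^-3, assumed surface air number density

cl2 = 0.0            # molecules cm^-3
dt = 60.0            # s
for step in range(3 * 24 * 60):                  # three days, 1-minute steps
    s = sunlight(step * dt / 3600.0)
    cl2 += dt * (P_max * s - J_max * s * cl2)    # forward-Euler update

print("end-of-run Cl2: %.2e molecules/cm3 (~%.0f pptv)" % (cl2, cl2 / AIR * 1e12))
```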

  8. Why consider subseabed disposal of high-level nuclear waste

    SciTech Connect

    Heath, G. R.; Hollister, C. D.; Anderson, D. R.; Leinen, M.

    1980-01-01

    Large areas of the deep seabed warrant assessment as potential disposal sites for high-level radioactive waste because: (1) they are far from seismically and tectonically active lithospheric plate boundaries; (2) they are far from active or young volcanos; (3) they contain thick layers of very uniform fine-grained clays; (4) they are devoid of natural resources likely to be exploited in the foreseeable future; (5) the geologic and oceanographic processes governing the deposition of sediments in such areas are well understood, and are remarkably insensitive to past oceanographic and climatic changes; and (6) sedimentary records of tens of millions of years of slow, uninterrupted deposition of fine-grained clay support predictions of the future stability of such sites. Data accumulated to date on the permeability, ion-retardation properties, and mechanical strength of pelagic clay sediments indicate that they can act as a primary barrier to the escape of buried nuclides. Work in progress should determine within the current decade whether subseabed disposal is environmentally acceptable and technically feasible, as well as address the legal, political and social issues raised by this new concept.

  9. High-level PC-based laser system modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast executing, PC-based program that can be used to either calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.
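
    The sketch below illustrates the skeleton of such a high-level parametric trade study: a design-point evaluator returning performance, weight, and cost figures, swept over one design parameter. The relations and parameter values are hypothetical placeholders and are not SDS-DT's actual models.

```python
# Skeleton of a high-level parametric trade study (illustrative only; the
# relations below are hypothetical placeholders, not SDS-DT's actual models).

def evaluate_design(aperture_m, power_kw):
    """Return rough performance/weight/cost figures for one design point."""
    brightness = power_kw * aperture_m**2                  # toy performance metric
    weight_kg = 500.0 * aperture_m**2 + 8.0 * power_kw     # toy weight relation
    cost_musd = 2.0 * aperture_m**2 + 0.05 * power_kw      # toy cost relation
    return {"brightness": brightness, "weight_kg": weight_kg, "cost_musd": cost_musd}

# Parametric sweep over aperture diameter at a fixed power level.
for aperture in (1.0, 2.0, 4.0, 8.0):
    print(aperture, evaluate_design(aperture, power_kw=100.0))
```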

  10. Application of SYNROC to high-level defense wastes

    SciTech Connect

    Tewhey, J.D.; Hoenig, C.L.; Newkirk, H.W.; Rozsa, R.B.; Coles, D.G.; Ryerson, F.J.

    1981-01-01

    The SYNROC method for immobilization of high-level nuclear reactor wastes is currently being applied to US defense wastes in tank storage at Savannah River, South Carolina. The minerals zirconolite, perovskite, and hollandite are used in SYNROC D formulations to immobilize fission products and actinides that comprise up to 10% of defense waste sludges and coexisting solutions. Additional phases in SYNROC D are nepheline, the host phase for sodium; and spinel, the host for excess aluminum and iron. Up to 70 wt % of calcined sludge can be incorporated with 30 wt % of SYNROC additives to produce a waste form consisting of 10% nepheline, 30% spinel, and approximately 20% each of the radioactive waste-bearing phases. Urea coprecipitation and spray drying/calcining methods have been used in the laboratory to produce homogeneous, reactive ceramic powders. Hot pressing and sintering at temperatures from 1000 to 1100°C result in waste form products with greater than 97% of theoretical density. Hot isostatic pressing has recently been implemented as a processing alternative. Characterization of waste-form mineralogy has been done by means of XRD, SEM, and electron microprobe. Leaching of SYNROC D samples is currently being carried out. Assessment of radiation damage effects and physical properties of SYNROC D will commence in FY 81.

  11. The GRAVITY instrument software/high-level software

    NASA Astrophysics Data System (ADS)

    Burtscher, Leonard; Wieprecht, Ekkehard; Ott, Thomas; Kok, Yitping; Yazici, Senol; Anugu, Narsireddy; Dembet, Roderick; Fedou, Pierre; Lacour, Sylvestre; Ott, Jürgen; Paumard, Thibaut; Lapeyrere, Vincent; Kervella, Pierre; Abuter, Roberto; Pozna, Eszter; Eisenhauer, Frank; Blind, Nicolas; Genzel, Reinhard; Gillessen, Stefan; Hans, Oliver; Haug, Marcus; Haussmann, Frank; Kellner, Stefan; Lippa, Magdalena; Pfuhl, Oliver; Sturm, Eckhard; Weber, Johannes; Amorim, Antonio; Brandner, Wolfgang; Rousselet-Perraut, Karine; Perrin, Guy S.; Straubmeier, Christian; Schöller, Markus

    2014-07-01

    GRAVITY is the four-beam, near-infrared, AO-assisted, fringe tracking, astrometric and imaging instrument for the Very Large Telescope Interferometer (VLTI). It requires the development of one of the most complex instrument software systems ever built for an ESO instrument. Apart from its many interfaces and interdependencies, one of the most challenging aspects is the overall performance and stability of this complex system. The three infrared detectors and the fast reflective memory network (RMN) recorder contribute a total data rate of up to 20 MiB/s accumulating to a maximum of 250 GiB of data per night. The detectors, the two instrument Local Control Units (LCUs) as well as the five LCUs running applications under TAC (Tools for Advanced Control) architecture, are interconnected with fast Ethernet, RMN fibers and dedicated fiber connections as well as signals for the time synchronization. Here we give a simplified overview of all subsystems of GRAVITY and their interfaces and discuss two examples of high-level applications during observations: the acquisition procedure and the gathering and merging of data to the final FITS file.

  12. High-Level Performance Modeling of SAR Systems

    NASA Technical Reports Server (NTRS)

    Chen, Curtis

    2006-01-01

    SAUSAGE (Still Another Utility for SAR Analysis that's General and Extensible) is a computer program for modeling the performance of synthetic-aperture radar (SAR) or interferometric synthetic-aperture radar (InSAR or IFSAR) systems. The user is assumed to be familiar with the basic principles of SAR imaging and interferometry. Given design parameters (e.g., altitude, power, and bandwidth) that characterize a radar system, the software predicts various performance metrics (e.g., signal-to-noise ratio and resolution). SAUSAGE is intended to be a general software tool for quick, high-level evaluation of radar designs; it is not meant to capture all the subtleties, nuances, and particulars of specific systems. SAUSAGE was written to facilitate the exploration of engineering tradeoffs within the multidimensional space of design parameters. Typically, this space is examined through an iterative process of adjusting the values of the design parameters and examining the effects of the adjustments on the overall performance of the system at each iteration. The software is designed to be modular and extensible to enable consideration of a variety of operating modes and antenna beam patterns, including, for example, strip-map and spotlight SAR acquisitions, polarimetry, burst modes, and squinted geometries.
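
    As a minimal illustration of the kind of high-level prediction such a tool performs, the sketch below evaluates the standard point-target radar equation together with simple slant-range and strip-map azimuth resolution formulas. The parameter values are placeholders, and this is not SAUSAGE's actual model set.

```python
import math

# Minimal sketch of a high-level radar performance estimate (not SAUSAGE's
# actual models): single-pulse point-target SNR from the standard radar
# equation plus simple resolution formulas. All parameter values are
# placeholders; a full SAR budget would add large pulse-compression and
# azimuth-integration processing gains on top of the single-pulse figure.

k_B = 1.380649e-23   # Boltzmann constant (J/K)
c = 3.0e8            # speed of light (m/s)

def point_target_snr(p_tx, gain, wavelength, sigma, rng, bandwidth, t_sys, losses):
    """Single-pulse SNR for a point target with radar cross-section sigma (m^2)."""
    num = p_tx * gain**2 * wavelength**2 * sigma
    den = (4.0 * math.pi)**3 * rng**4 * k_B * t_sys * bandwidth * losses
    return num / den

def slant_range_resolution(bandwidth):
    return c / (2.0 * bandwidth)

def stripmap_azimuth_resolution(antenna_length):
    # Classical strip-map limit: about half the along-track antenna length.
    return antenna_length / 2.0

snr = point_target_snr(p_tx=5e3, gain=10**(35 / 10.0), wavelength=0.24,
                       sigma=1.0, rng=700e3, bandwidth=80e6,
                       t_sys=500.0, losses=10**(3 / 10.0))
print("single-pulse SNR (dB): %.1f" % (10.0 * math.log10(snr)))
print("slant-range resolution (m): %.2f" % slant_range_resolution(80e6))
print("azimuth resolution (m): %.2f" % stripmap_azimuth_resolution(10.0))
```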

  13. Ultrafilter Conditions for High Level Waste Sludge Processing

    SciTech Connect

    Geeting, John GH; Hallen, Richard T.; Peterson, Reid A.

    2006-08-28

    An evaluation of the optimal filtration conditions was performed based on test data obtained from filtration of a High Level Waste Sludge sample from the Hanford tank farms. This evaluation was performed using the anticipated configuration for the Waste Treatment Plant at the Hanford site. Testing was performed to identify the optimal pressure drop and cross flow velocity for filtration at both high and low solids loading. However, this analysis indicates that the actual filtration rate achieved is relatively insensitive to these conditions under anticipated operating conditions. The maximum filter flux was obtained by adjusting the system control valve pressure from 400 to 650 kPa while the filter feed concentration increased from 5 to 20 wt%. However, operating the system with a constant control valve pressure drop of 500 kPa resulted in a less than 1% reduction in the average filter flux. Also note that allowing the control valve pressure to swing as much as +/- 20% resulted in less than a 5% decrease in filter flux.

  14. High-level fluorescence labeling of gram-positive pathogens.

    PubMed

    Aymanns, Simone; Mauerer, Stefanie; van Zandbergen, Ger; Wolz, Christiane; Spellerberg, Barbara

    2011-01-01

    Fluorescence labeling of bacterial pathogens has a broad range of interesting applications including the observation of living bacteria within host cells. We constructed a novel vector based on the E. coli streptococcal shuttle plasmid pAT28 that can propagate in numerous bacterial species from different genera. The plasmid harbors a promoterless copy of the green fluorescent protein variant gene egfp under the control of the CAMP-factor gene (cfb) promoter of Streptococcus agalactiae and was designated pBSU101. Upon transfer of the plasmid into streptococci, the bacteria show a distinct and easily detectable fluorescence using a standard fluorescence microscope and quantification by FACS-analysis demonstrated values that were 10-50 times increased over the respective controls. To assess the suitability of the construct for high efficiency fluorescence labeling in different gram-positive pathogens, numerous species were transformed. We successfully labeled Streptococcus pyogenes, Streptococcus agalactiae, Streptococcus dysgalactiae subsp. equisimilis, Enterococcus faecalis, Enterococcus faecium, Streptococcus mutans, Streptococcus anginosus and Staphylococcus aureus strains utilizing the EGFP reporter plasmid pBSU101. In all of these species the presence of the cfb promoter construct resulted in high-level EGFP expression that could be further increased by growing the streptococcal and enterococcal cultures under high oxygen conditions through continuous aeration.

  15. PLUTONIUM/HIGH-LEVEL VITRIFIED WASTE BDBE DOSE CALCULATION

    SciTech Connect

    J.A. Ziegler

    2000-11-20

    The purpose of this calculation is to provide a dose consequence analysis of high-level waste (HLW) consisting of plutonium immobilized in vitrified HLW to be handled at the proposed Monitored Geologic Repository at Yucca Mountain for a beyond design basis event (BDBE) under expected conditions using best estimate values for each calculation parameter. In addition to the dose calculation, a plutonium respirable particle size for dose calculation use is derived. The current concept for this waste form is plutonium disks enclosed in cans immobilized in canisters of vitrified HLW (i.e., glass). The plutonium inventory at risk used for this calculation is selected from Plutonium Immobilization Project Input for Yucca Mountain Total Systems Performance Assessment (Shaw 1999). The BDBE examined in this calculation is a nonmechanistic initiating event and the sequence of events that follow to cause a radiological release. This analysis will provide the radiological releases and dose consequences for a postulated BDBE. Results may be considered in other analyses to determine or modify the safety classification and quality assurance level of repository structures, systems, and components. This calculation uses best available technical information because the BDBE frequency is very low (i.e., less than 1.0E-6 events/year) and is not required for License Application for the Monitored Geologic Repository. The results of this calculation will not be used as part of a licensing or design basis.

  16. The ALICE High Level Trigger: status and plans

    NASA Astrophysics Data System (ADS)

    Krzewicki, Mikolaj; Rohr, David; Gorbunov, Sergey; Breitner, Timo; Lehrbach, Johannes; Lindenstruth, Volker; Berzano, Dario

    2015-12-01

    The ALICE High Level Trigger (HLT) is an online reconstruction, triggering and data compression system used in the ALICE experiment at CERN. Unique among the LHC experiments, it extensively uses modern coprocessor technologies like general purpose graphic processing units (GPGPU) and field programmable gate arrays (FPGA) in the data flow. Realtime data compression is performed using a cluster finder algorithm implemented on FPGA boards. These data, instead of raw clusters, are used in the subsequent processing and storage, resulting in a compression factor of around 4. Track finding is performed using a cellular automaton and a Kalman filter algorithm on GPGPU hardware, where both CUDA and OpenCL technologies can be used interchangeably. The ALICE upgrade requires further development of online concepts to include detector calibration and stronger data compression. The current HLT farm will be used as a test bed for online calibration and for both synchronous and asynchronous processing frameworks already during Run 2, before the upgrade. For opportunistic use as a Grid computing site during periods of inactivity of the experiment, a virtualisation-based setup is deployed.
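
    As a small illustration of the fitting building block mentioned above, the sketch below implements a generic scalar Kalman filter predict/update step. It is not the ALICE HLT implementation, which fits multi-parameter track states on GPGPU hardware.

```python
# Generic scalar Kalman filter step (illustration of the fitting building
# block mentioned above; this is not the ALICE HLT implementation).

def kalman_step(x, p, z, q, r):
    """One predict/update cycle for a constant-state model.
    x, p : current state estimate and its variance
    z    : new measurement
    q, r : process and measurement noise variances
    """
    # predict
    x_pred, p_pred = x, p + q
    # update
    gain = p_pred / (p_pred + r)
    x_new = x_pred + gain * (z - x_pred)
    p_new = (1.0 - gain) * p_pred
    return x_new, p_new

# Toy usage: smooth a noisy sequence of position measurements.
measurements = [1.1, 0.9, 1.05, 0.98, 1.2, 1.0]
x, p = 0.0, 1.0
for z in measurements:
    x, p = kalman_step(x, p, z, q=1e-4, r=0.05)
print(x, p)
```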

  17. The LHCb Data Acquisition and High Level Trigger Processing Architecture

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaspar, C.; Jost, B.; Neufeld, N.

    2015-12-01

    The LHCb experiment at the LHC accelerator at CERN collects collisions of particle bunches at 40 MHz. After a first level of hardware trigger with an output rate of 1 MHz, the physically interesting collisions are selected by running dedicated trigger algorithms in the High Level Trigger (HLT) computing farm. This farm consists of up to roughly 25000 CPU cores in roughly 1750 physical nodes, each equipped with up to 4 TB of local storage. This work describes the LHCb online system with an emphasis on the developments implemented during the current long shutdown (LS1). We describe the architectural changes that treble the available CPU power of the HLT farm and the technicalities of determining and verifying the precise calibration and alignment constants that are fed to the HLT event selection procedure. We describe how the constants are fed into a two-stage HLT event selection facility that makes extensive use of the local disk buffering capabilities on the worker nodes. With the installed disk buffers, the CPU resources can be used during periods of up to ten days without beams. In the past, these periods accounted for more than 70% of the total time.
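
    A back-of-the-envelope sketch of the disk-buffering headroom follows: the farm's total local disk divided by an assumed average write rate gives the number of days data can be buffered while beams are off. The write rate below is a hypothetical placeholder, not an LHCb figure.

```python
# Back-of-the-envelope sketch of the disk-buffering headroom (the write rate
# below is a hypothetical placeholder, not an LHCb figure).

nodes = 1750
disk_per_node_tb = 4.0
buffer_tb = nodes * disk_per_node_tb          # total farm buffer, ~7 PB

assumed_write_rate_gb_s = 5.0                 # hypothetical average rate to disk
days = buffer_tb * 1e3 / (assumed_write_rate_gb_s * 86400.0)
print("total buffer: %.0f TB, ~%.1f days at %.1f GB/s" %
      (buffer_tb, days, assumed_write_rate_gb_s))
```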

  18. Cytotoxicity assessment of residual high-level disinfectants.

    PubMed

    Ryu, Mizuyuki; Kobayashi, Toshihiro; Kawamukai, Emiko; Quan, Glenlelyn; Furuta, Taro

    2013-01-01

    Some studies show the uptake of disinfectants on medical devices but no studies on their cytotoxicity have been reported. This study aimed to assess that cytotoxicity in a 3-dimensional culture system using HeLa cells grown in matrices composed of collagen. Plastic materials were soaked in the use solutions of the widely used high-level disinfectants, glutaraldehyde (GA), ortho-phthalaldehyde (OPA) and peracetic acid (PAA). After being rinsed, they were allowed to dry and were embedded into the cell medium to investigate the cytotoxicity of the residual disinfectants. Cytotoxicity was observed with the polyvinyl chloride, polyurethane and silicon tubes soaked in GA and OPA, indicating that both disinfectants were absorbed in the test pieces, whereas for PAA, none was observed. As for the polytetrafluoroethylene (PTFE) tubes, no disinfectant displayed cytotoxicity. GA and OPA are primary irritants, having a potential to cause anaphylaxis and other forms of allergic reactions. There should be consideration not only about the toxicity of the residual disinfectant from poor rinsing, but also about the toxicity that would result from the disinfectants that were absorbed and consequently released from the medical devices or materials. PMID:24366628

  19. Anthropometric characteristics of high level European junior basketball players.

    PubMed

    Jelicić, M; Sekulić, D; Marinović, M

    2002-12-01

    The purpose of the research was to assess the anthropometric status of European high-level junior basketball players and to determine anthropometric differences between the players playing in different game positions (guards, forwards, centers). The sample consisted of 132 young basketball players, participants of the European Junior Basketball Championship, Zadar, 2000. Participants were measured with 31 measures (anthropometric variables), on the basis of which two body composition measures (BMI and relative body fat) and somatotype were calculated. The basic statistical parameters were computed. The analysis of variance and discriminant canonical analysis were employed to determine the differences between positions in play. Results indicate that prominent longitudinal and transversal skeletal dimensions as well as circumference measures characterize players in the center position, but centers do not have significantly larger skinfold measures than forwards. Centers are also predominantly ectomorphic compared with other players. Guards achieved significantly lower values in all spaces and are predominantly mesomorphic. Further investigations are necessary to assess potential changes in these parameters as the participants reach senior age and beyond, as well as to determine relations between anthropometric status and skill-related variables.

  20. Gravitational waves from inspiralling compact binaries: Hexagonal template placement and its efficiency in detecting physical signals

    SciTech Connect

    Cokelaer, T.

    2007-11-15

    Matched filtering is used to search for gravitational waves emitted by inspiralling compact binaries in data from the ground-based interferometers. One of the key aspects of the detection process is the design of a template bank that covers the astrophysically pertinent parameter space. In an earlier paper, we described a template bank that is based on a square lattice. Although robust, we showed that the square placement is overefficient, with the implication that it is computationally more demanding than required. In this paper, we present a template bank based on a hexagonal lattice, whose size is reduced by 40% with respect to the proposed square placement. We describe the practical aspects of the hexagonal template bank implementation, its size, and computational cost. We have also performed exhaustive simulations to characterize its efficiency and safety. We show that the bank is adequate to search for a wide variety of binary systems (primordial black holes, neutron stars, and stellar-mass black holes) and in data from both current detectors (initial LIGO, Virgo and GEO600) as well as future detectors (advanced LIGO and EGO). Remarkably, although our template bank placement uses a metric arising from a particular template family, namely the stationary phase approximation, we show that it can be used successfully with other template families (e.g., Pade resummation and effective one-body approximation). This quality of being effective for different template families makes the proposed bank suitable for a search that would use several of them in parallel (e.g., in a binary black hole search). The hexagonal template bank described in this paper is currently used to search for nonspinning inspiralling compact binaries in data from the Laser Interferometer Gravitational-Wave Observatory (LIGO).
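
    As a generic illustration of hexagonal placement (not the bank-generation code used in the search), the sketch below lays templates on a triangular lattice whose covering radius equals a chosen mismatch radius over a flat two-dimensional parameter space; real banks work with the (generally position-dependent) mismatch metric of the template family.

```python
import math

# Generic illustration of hexagonal template placement over a flat 2-D
# parameter space (not the bank-generation code used in the search).
# Templates are placed on a triangular (hexagonal) lattice whose covering
# radius equals the maximum allowed mismatch radius r.

def hexagonal_bank(xmin, xmax, ymin, ymax, r):
    """Return lattice points covering the rectangle with circles of radius r."""
    a = math.sqrt(3.0) * r          # nearest-neighbour spacing for covering radius r
    dy = 1.5 * r                    # row separation of the triangular lattice
    points, row = [], 0
    y = ymin
    while y <= ymax + dy:
        x = xmin + (a / 2.0 if row % 2 else 0.0)   # offset every other row
        while x <= xmax + a:
            points.append((x, y))
            x += a
        y += dy
        row += 1
    return points

bank = hexagonal_bank(0.0, 10.0, 0.0, 10.0, r=0.5)
print(len(bank), "templates")
```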

  1. Template protection and its implementation in 3D face recognition systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuebing

    2007-04-01

    As biometric recognition systems are widely applied in various application areas, security and privacy risks have recently attracted the attention of the biometric community. Template protection techniques prevent stored reference data from revealing private biometric information and enhance the security of biometrics systems against attacks such as identity theft and cross matching. This paper concentrates on a template protection algorithm that merges methods from cryptography, error correction coding and biometrics. The key component of the algorithm is to convert biometric templates into binary vectors. It is shown that the binary vectors should be robust, uniformly distributed, statistically independent and collision-free so that authentication performance can be optimized and information leakage can be avoided. Depending on the statistical character of the biometric template, different approaches for transforming biometric templates into compact binary vectors are presented. The proposed methods are integrated into a 3D face recognition system and tested on the 3D facial images of the FRGC database. It is shown that the resulting binary vectors provide an authentication performance that is similar to the original 3D face templates. A high security level is achieved with reasonable false acceptance and false rejection rates of the system, based on an efficient statistical analysis. The algorithm estimates the statistical character of biometric templates from a number of biometric samples in the enrollment database. For the FRGC 3D face database, our tests show only a small difference in robustness and discriminative power between classification results obtained under the assumption of uniformly distributed templates and those obtained under the assumption of Gaussian-distributed templates.
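
    One common way to obtain such binary vectors (not necessarily the transformation used in this paper) is to threshold each template component at its population median estimated from enrollment data, as in the sketch below; the component values shown are made up for illustration.

```python
# Generic sketch of turning real-valued templates into binary vectors by
# thresholding each component at its population median (one common approach;
# not necessarily the transformation used in the paper).

def component_medians(templates):
    """Estimate a per-component median from a set of enrollment templates."""
    n = len(templates[0])
    meds = []
    for i in range(n):
        col = sorted(t[i] for t in templates)
        meds.append(col[len(col) // 2])
    return meds

def binarize(template, medians):
    return [1 if v > m else 0 for v, m in zip(template, medians)]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

# Toy usage with made-up 4-component templates.
enrolled = [[0.2, 1.1, -0.3, 0.9], [0.4, 0.8, -0.1, 1.2], [0.1, 1.3, -0.4, 0.7]]
meds = component_medians(enrolled)
probe = binarize([0.3, 1.0, -0.2, 1.1], meds)
reference = binarize(enrolled[0], meds)
print(probe, reference, hamming_distance(probe, reference))
```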

  2. The molecular matching problem

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    1993-01-01

    Molecular chemistry contains many difficult optimization problems that have begun to attract the attention of optimizers in the Operations Research community. Problems including protein folding, molecular conformation, molecular similarity, and molecular matching have been addressed. Minimum energy conformations for simple molecular structures such as water clusters, Lennard-Jones microclusters, and short polypeptides have dominated the literature to date. However, a variety of interesting problems exist and we focus here on a molecular structure matching (MSM) problem.

  3. A new algorithm for distorted fingerprints matching based on normalized fuzzy similarity measure.

    PubMed

    Chen, Xinjian; Tian, Jie; Yang, Xin

    2006-03-01

    Coping with nonlinear distortions in fingerprint matching is a challenging task. This paper proposes a novel algorithm, normalized fuzzy similarity measure (NFSM), to deal with the nonlinear distortions. The proposed algorithm has two main steps. First, the template and input fingerprints were aligned. In this process, the local topological structure matching was introduced to improve the robustness of global alignment. Second, the method NFSM was introduced to compute the similarity between the template and input fingerprints. The proposed algorithm was evaluated on the fingerprint databases of FVC2004. Experimental results confirm that NFSM is a reliable and effective algorithm for fingerprint matching with nonlinear distortions. The algorithm gives considerably higher matching scores compared to conventional matching algorithms for the deformed fingerprints. PMID:16519361
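
    The sketch below shows a generic fuzzy, distortion-tolerant similarity between two aligned minutiae sets, in which each greedy pairing contributes a membership value that decays smoothly with positional and angular error. It is not the paper's exact NFSM definition; the decay widths and normalization are placeholder choices.

```python
import math

# Generic sketch of a fuzzy similarity between two aligned minutiae sets:
# each greedy pairing contributes a membership value that decays smoothly
# with positional and angular error. This is NOT the exact NFSM definition
# from the paper; decay widths and normalization are placeholders.

def fuzzy_similarity(minutiae_a, minutiae_b, sigma_d=10.0, sigma_t=0.3):
    """minutiae_*: lists of (x, y, theta) after global alignment."""
    used = set()
    total = 0.0
    for (xa, ya, ta) in minutiae_a:
        best, best_j = 0.0, None
        for j, (xb, yb, tb) in enumerate(minutiae_b):
            if j in used:
                continue
            d = math.hypot(xa - xb, ya - yb)
            dt = abs((ta - tb + math.pi) % (2 * math.pi) - math.pi)
            score = math.exp(-d / sigma_d) * math.exp(-dt / sigma_t)
            if score > best:
                best, best_j = score, j
        if best_j is not None:
            used.add(best_j)
            total += best
    # normalize by the size of the smaller set so the score lies in [0, 1]
    return total / max(1, min(len(minutiae_a), len(minutiae_b)))

a = [(10, 12, 0.5), (40, 45, 1.2), (70, 20, -0.8)]
b = [(11, 13, 0.45), (41, 44, 1.25), (90, 90, 2.0)]
print(round(fuzzy_similarity(a, b), 3))
```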

  4. Stability of High-Level Radioactive Waste Forms

    SciTech Connect

    Besmann, T.M.

    2001-06-22

    High-level waste (HLW) glass compositions, processing schemes, limits on waste content, and corrosion/dissolution release models are dependent on an accurate knowledge of melting temperatures and thermochemical values. Unfortunately, existing models for predicting these temperatures are empirically-based, depending on extrapolations of experimental information. In addition, present models of leaching behavior of glass waste forms use simplistic assumptions or experimentally measured values obtained under non-realistic conditions. There is thus a critical need for both more accurate and more widely applicable models for HLW glass behavior, which this project addressed. Significant progress was made in this project on modeling HLW glass. Borosilicate glass was accurately represented along with the additional important components that contain iron, lithium, potassium, magnesium, and calcium. The formation of crystalline inclusions in the glass, an issue in Hanford HLW formulations, was modeled and shown to be predictive. Thus the results of this work have already demonstrated practical benefits with the ability to map compositional regions where crystalline material forms, and therefore avoid that detrimental effect. With regard to a fundamental understanding, added insights on the behavior of the components of glass have been obtained, including the potential formation of molecular clusters. The EMSP project had very significant effects beyond the confines of Environmental Management. The models developed for glass have been used to solve a very costly problem in the corrosion of refractories for glass production. The effort led another laboratory, Sandia National Laboratories-Livermore, to become conversant in the techniques and to apply them through a DOE Office of Industrial Technologies project joint with PPG Industries. The glass industry as a whole is now cognizant of these capabilities, and there is a Glass Manufacturer's Research Institute proposal

  5. High-level waste issues and resolutions document

    SciTech Connect

    Not Available

    1994-05-01

    The High-Level Waste (HLW) Issues and Resolutions Document recognizes US Department of Energy (DOE) complex-wide HLW issues and offers potential corrective actions for resolving these issues. Westinghouse Management and Operations (M&O) Contractors are effectively managing HLW for the Department of Energy at four sites: Idaho National Engineering Laboratory (INEL), Savannah River Site (SRS), West Valley Demonstration Project (WVDP), and Hanford Reservation. Each site is at varying stages of processing HLW into a more manageable form. This HLW Issues and Resolutions Document identifies five primary issues that must be resolved in order to reach the long-term objective of HLW repository disposal. As the current M&O contractor at DOE's most difficult waste problem sites, Westinghouse recognizes that they have the responsibility to help solve some of the complex's HLW problems in a cost effective manner by encouraging the M&Os to work together by sharing expertise, eliminating duplicate efforts, and sharing best practices. Pending an action plan, Westinghouse M&Os will take the initiative on those corrective actions identified as the responsibility of an M&O. This document captures issues important to the management of HLW. The proposed resolutions contained within this document set the framework for the M&Os and DOE to work cooperatively to develop an action plan to solve some of the major complex-wide problems. Dialogue will continue between the M&Os, DOE, and other regulatory agencies to work jointly toward the goal of storing, treating, and immobilizing HLW for disposal in an environmentally sound, safe, and cost effective manner.

  6. Deep borehole disposal of high-level radioactive waste.

    SciTech Connect

    Stein, Joshua S.; Freeze, Geoffrey A.; Brady, Patrick Vane; Swift, Peter N.; Rechard, Robert Paul; Arnold, Bill Walter; Kanney, Joseph F.; Bauer, Stephen J.

    2009-07-01

    Preliminary evaluation of deep borehole disposal of high-level radioactive waste and spent nuclear fuel indicates the potential for excellent long-term safety performance at costs competitive with mined repositories. Significant fluid flow through basement rock is prevented, in part, by low permeabilities, poorly connected transport pathways, and overburden self-sealing. Deep fluids also resist vertical movement because they are density stratified. Thermal hydrologic calculations estimate the thermal pulse from emplaced waste to be small (less than 20 °C at 10 meters from the borehole, for less than a few hundred years), and to result in maximum total vertical fluid movement of ~100 m. Reducing conditions will sharply limit solubilities of most dose-critical radionuclides at depth, and high ionic strengths of deep fluids will prevent colloidal transport. For the bounding analysis of this report, waste is envisioned to be emplaced as fuel assemblies stacked inside drill casing that are lowered, and emplaced using off-the-shelf oilfield and geothermal drilling techniques, into the lower 1-2 km portion of a vertical borehole ~45 cm in diameter and 3-5 km deep, followed by borehole sealing. Deep borehole disposal of radioactive waste in the United States would require modifications to the Nuclear Waste Policy Act and to applicable regulatory standards for long-term performance set by the US Environmental Protection Agency (40 CFR part 191) and US Nuclear Regulatory Commission (10 CFR part 60). The performance analysis described here is based on the assumption that long-term standards for deep borehole disposal would be identical in the key regards to those prescribed for existing repositories (40 CFR part 197 and 10 CFR part 63).

  7. PLUTONIUM/HIGH-LEVEL VITRIFIED WASTE BDBE DOSE CALCULATION

    SciTech Connect

    D.C. Richardson

    2003-03-19

    In accordance with the Nuclear Waste Policy Amendments Act of 1987, Yucca Mountain was designated as the site to be investigated as a potential repository for the disposal of high-level radioactive waste. The Yucca Mountain site is an undeveloped area located on the southwestern edge of the Nevada Test Site (NTS), about 100 miles northwest of Las Vegas. The site currently lacks rail service or an existing right-of-way. If the Yucca Mountain site is found suitable for the repository, rail service is desirable to the Office of Civilian Radioactive Waste Management (OCRWM) Program because of the potential of rail transportation to reduce costs and to reduce the number of shipments relative to highway transportation. A Preliminary Rail Access Study evaluated 13 potential rail spur options. Alternative routes within the major options were also developed. Each of these options was then evaluated for potential land use conflicts and access to regional rail carriers. Three potential routes having few land use conflicts and having access to regional carriers were recommended for further investigation. Figure 1-1 shows these three routes. The Jean route is estimated to be about 120 miles long, the Carlin route to be about 365 miles long, and the Caliente route to be about 365 miles long. The remaining ten routes continue to be monitored and should any of the present conflicts change, a re-evaluation of that route will be made. Complete details of the evaluation of the 13 routes can be found in the previous study. The DOE has not identified any preferred route and recognizes that the transportation issues need a full and open treatment under the National Environmental Policy Act. The issue of transportation will be included in public hearings to support development of the Environmental Impact Statement (EIS) proceedings for either the Monitored Retrievable Storage Facility or the Yucca Mountain Project or both.

  8. Liver regeneration in rats administered high levels of carbohydrates.

    PubMed

    Gershbein, L L

    1976-01-01

    Partially hepatectomized male rats were administered high levels of carbohydrates by drinker, in a casein-cellulose synthetic medium and in a commercial meal over a period of 10 days after surgery and the amount of liver regenerating or the increment ascertained; representative hepatic glycogen changes were also followed. Of the carbohydrate solutions, 5% levulose, 5% levulose+5% glucose and 10% sucrose increased the extent of liver regeneration as was also the case with the synthetic diet supplemented with 30 and 60% glucose, 30 and 60% levulose, 30% levulose+30% glucose, 30% each of galactose and the arabinogalactan, Stractan, and 60% each of sucrose, honey and unsulphured molasses. The liver increments and glycogen contents were in the control range for animals fed the synthetic diet containing 30% each of lactose, sorbitol, corn starch and raffinose pentahydrate, 5% ascorbic acid and 15% L-arabinose, but the liver glycogen was depressed with 30% xylose, a diet which was poorly tolerated; 15% mannitol caused a decrease in the increment. The incorporation of several sugars into the commercial rat meal, including xylose (11%), raffinose (15%), L-arabinose (8%), D-arabinose (5%), L-sorbose (17%), galactosamine (0.20%) and galactono-gamma-lactone (10%), led to little change over the control increments. In intact rats fed the synthetic diet containing 30% each of glucose, lactose, galactose, sucrose and levulose for an interval of 10 days, the wet and dry liver-to-body weight ratios were significantly elevated only with the last two sugars but liver glycogen was increased in each instance.

  9. Review of high-level waste form properties. [146 bibliographies]

    SciTech Connect

    Rusin, J.M.

    1980-12-01

    This report is a review of waste form options for the immobilization of high-level liquid wastes from the nuclear fuel cycle. This review covers the status of international research and development on waste forms as of May 1979. Although the emphasis in this report is on waste form properties, process parameters are discussed where they may affect final waste form properties. A summary table is provided listing properties of various nuclear waste form options. It is concluded that proposed waste forms have properties falling within a relatively narrow range. In regard to crystalline versus glass waste forms, the conclusion is that either glass or crystalline materials can be shown to have some advantage when a single property is considered; however, at this date no single waste form offers optimum properties over the entire range of characteristics investigated. A long-term effort has been applied to the development of glass and calcine waste forms. Several additional waste forms have enough promise to warrant continued research and development to bring their state of development up to that of glass and calcine. Synthetic minerals, the multibarrier approach with coated particles in a metal matrix, and high pressure-high temperature ceramics offer potential advantages and need further study. Although this report discusses waste form properties, the total waste management system should be considered in the final selection of a waste form option. Canister design, canister materials, overpacks, engineered barriers, and repository characteristics, as well as the waste form, affect the overall performance of a waste management system. These parameters were not considered in this comparison.

  10. High-level disinfection of gastrointestinal endoscope reprocessing

    PubMed Central

    Chiu, King-Wah; Lu, Lung-Sheng; Chiou, Shue-Shian

    2015-01-01

    High level disinfection (HLD) of the gastrointestinal (GI) endoscope is not simply a slogan, but rather is a form of experimental monitoring-based medicine. By definition, GI endoscopy is a semicritical medical device. Hence, such medical devices require major quality assurance for disinfection. And because many of these items are temperature sensitive, low-temperature chemical methods, such as liquid chemical germicide, must be used rather than steam sterilization. In summarizing guidelines for infection prevention and control for GI endoscopy, there are three important steps that must be highlighted: manual washing, HLD with automated endoscope reprocessor, and drying. Strict adherence to current guidelines is required because compared to any other medical device, the GI endoscope is associated with more outbreaks linked to inadequate cleaning or disinfecting during HLD. Both experimental evaluation on the surveillance bacterial cultures and in-use clinical results have shown that, the monitoring of the stringent processes to prevent and control infection is an essential component of the broader strategy to ensure the delivery of safe endoscopy services, because endoscope reprocessing is a multistep procedure involving numerous factors that can interfere with its efficacy. Based on our years of experience in the surveillance of culture monitoring of endoscopic reprocessing, we aim in this study to carefully describe what details require attention in the GI endoscopy disinfection and to share our experience so that patients can be provided with high quality and safe medical practices. Quality management encompasses all aspects of pre- and post-procedural care including the efficiency of the endoscopy unit and reprocessing area, as well as the endoscopic procedure itself. PMID:25699232

  11. High Levels of Molecular Chlorine found in the Arctic Atmosphere

    NASA Astrophysics Data System (ADS)

    Liao, J.; Huey, L. G.; Liu, Z.; Tanner, D.; Cantrell, C. A.; Orlando, J. J.; Flocke, F. M.; Shepson, P. B.; Weinheimer, A. J.; Hall, S. R.; Beine, H.; Wang, Y.; Ingall, E. D.; Thompson, C. R.; Hornbrook, R. S.; Apel, E. C.; Fried, A.; Mauldin, L.; Smith, J. N.; Staebler, R. M.; Neuman, J. A.; Nowak, J. B.

    2014-12-01

    Chlorine radicals are a strong atmospheric oxidant, particularly in polar regions where levels of hydroxyl radicals can be quite low. In the atmosphere, chlorine radicals expedite the degradation of methane and tropospheric ozone and the oxidation of mercury to more toxic forms. Here, we present direct measurements of molecular chlorine levels in the Arctic marine boundary layer in Barrow, Alaska, collected in the spring of 2009 over a six-week period using chemical ionization mass spectrometry. We detected high levels of molecular chlorine of up to 400 pptv. Concentrations peaked in the early morning and late afternoon and fell to near-zero levels at night. Average daytime molecular chlorine levels were correlated with ozone concentrations, suggesting that sunlight and ozone are required for molecular chlorine formation. Using a time-dependent box model, we estimated that the chlorine radicals produced from the photolysis of molecular chlorine on average oxidized more methane than hydroxyl radicals and enhanced the abundance of short-lived peroxy radicals. Elevated hydroperoxyl radical levels, in turn, promoted the formation of hypobromous acid, which catalyzed mercury oxidation and the breakdown of tropospheric ozone. Therefore, we propose that molecular chlorine exerts a significant effect on the atmospheric chemistry in the Arctic. While the formation mechanisms of molecular chlorine are not yet understood, the main potential sources of chlorine include snowpack, sea salt, and sea ice. There is recent evidence of molecular halogen (Br2 and Cl2) formation in the Arctic snowpack. The coverage and composition of the snow may control halogen chemistry in the Arctic. Changes of sea ice and snow cover in the changing climate may affect air-snow-ice interaction and have a significant impact on the levels of radicals, ozone, mercury and methane in the Arctic troposphere.

  12. Radiative Lifetimes for High Levels of Neutral Fe

    NASA Astrophysics Data System (ADS)

    Lawler, James E.; Den Hartog, E.; Guzman, A.

    2013-01-01

    New radiative lifetime measurements for ~ 50 high lying levels of Fe I are reported. Laboratory astrophysics faces a challenge to provide basic spectroscopic data, especially reliable atomic transition probabilities, in the IR region for abundance studies. The availability of HgCdTe (HAWAII) detector arrays has opened IR spectral regions for extensive new spectroscopic studies. The SDSS III APOGEE project in the H-Band is an important example which will penetrate the dust obscuring the Galactic bulge. APOGEE will survey elemental abundances of 100,000 red giant stars in the bulge, bar, disk, and halo of the Milky Way. Many stellar spectra in the H-Band are, as expected, dominated by transitions of Fe I. Most of these IR transitions connect high levels of Fe. Our program has started an effort to meet this challenge with new radiative lifetime measurements on high lying levels of Fe I using time resolved laser induced fluorescence (TRLIF). The TRLIF method is typically accurate to 5% and is efficient. Our goal is to combine these accurate, absolute radiative lifetimes with emission branching fractions [1] to determine log(gf) values of the highest quality for Fe I lines in the UV, visible, and IR. This method was used very successfully by O’Brian et al. [2] on lower levels of Fe I. This method is still the best available for all but very simple spectra for which ab-initio theory is more accurate. Supported by NSF grant AST-0907732. [1] Branching fractions are being measured by M. Ruffoni and J. C. Pickering at Imperial College London. [2] O'Brian, T. R., Wickliffe, M. E., Lawler, J. E., Whaling, W., & Brault, J. W. 1991, J. Opt. Soc. Am. B 8, 1185

  13. High-level disinfection of gastrointestinal endoscope reprocessing.

    PubMed

    Chiu, King-Wah; Lu, Lung-Sheng; Chiou, Shue-Shian

    2015-02-20

    High level disinfection (HLD) of the gastrointestinal (GI) endoscope is not simply a slogan, but rather is a form of experimental monitoring-based medicine. By definition, GI endoscopy is a semicritical medical device. Hence, such medical devices require major quality assurance for disinfection. And because many of these items are temperature sensitive, low-temperature chemical methods, such as liquid chemical germicide, must be used rather than steam sterilization. In summarizing guidelines for infection prevention and control for GI endoscopy, there are three important steps that must be highlighted: manual washing, HLD with automated endoscope reprocessor, and drying. Strict adherence to current guidelines is required because compared to any other medical device, the GI endoscope is associated with more outbreaks linked to inadequate cleaning or disinfecting during HLD. Both experimental evaluation on the surveillance bacterial cultures and in-use clinical results have shown that, the monitoring of the stringent processes to prevent and control infection is an essential component of the broader strategy to ensure the delivery of safe endoscopy services, because endoscope reprocessing is a multistep procedure involving numerous factors that can interfere with its efficacy. Based on our years of experience in the surveillance of culture monitoring of endoscopic reprocessing, we aim in this study to carefully describe what details require attention in the GI endoscopy disinfection and to share our experience so that patients can be provided with high quality and safe medical practices. Quality management encompasses all aspects of pre- and post-procedural care including the efficiency of the endoscopy unit and reprocessing area, as well as the endoscopic procedure itself.

  14. High-level disinfection of gastrointestinal endoscope reprocessing.

    PubMed

    Chiu, King-Wah; Lu, Lung-Sheng; Chiou, Shue-Shian

    2015-02-20

    High level disinfection (HLD) of the gastrointestinal (GI) endoscope is not simply a slogan, but rather is a form of experimental monitoring-based medicine. By definition, GI endoscopy is a semicritical medical device. Hence, such medical devices require major quality assurance for disinfection. And because many of these items are temperature sensitive, low-temperature chemical methods, such as liquid chemical germicide, must be used rather than steam sterilization. In summarizing guidelines for infection prevention and control for GI endoscopy, there are three important steps that must be highlighted: manual washing, HLD with automated endoscope reprocessor, and drying. Strict adherence to current guidelines is required because compared to any other medical device, the GI endoscope is associated with more outbreaks linked to inadequate cleaning or disinfecting during HLD. Both experimental evaluation on the surveillance bacterial cultures and in-use clinical results have shown that, the monitoring of the stringent processes to prevent and control infection is an essential component of the broader strategy to ensure the delivery of safe endoscopy services, because endoscope reprocessing is a multistep procedure involving numerous factors that can interfere with its efficacy. Based on our years of experience in the surveillance of culture monitoring of endoscopic reprocessing, we aim in this study to carefully describe what details require attention in the GI endoscopy disinfection and to share our experience so that patients can be provided with high quality and safe medical practices. Quality management encompasses all aspects of pre- and post-procedural care including the efficiency of the endoscopy unit and reprocessing area, as well as the endoscopic procedure itself. PMID:25699232

  15. Engineering Escherichia coli for high-level production of propionate.

    PubMed

    Akawi, Lamees; Srirangan, Kajan; Liu, Xuejia; Moo-Young, Murray; Perry Chou, C

    2015-07-01

    Mounting environmental concerns associated with the use of petroleum-based chemical manufacturing practices have generated significant interest in the development of biological alternatives for the production of propionate. However, biological platforms for propionate production have been limited to strict anaerobes, such as Propionibacteria and select Clostridia. In this work, we demonstrated high-level heterologous production of propionate under microaerobic conditions in engineered Escherichia coli. Activation of the native Sleeping beauty mutase (Sbm) operon not only transformed E. coli to be propionogenic (i.e., propionate-producing) but also introduced an intracellular "flux competition" between the traditional C2-fermentative pathway and the novel C3-fermentative pathway. Dissimilation of the major carbon source of glycerol was identified to critically affect such "flux competition" and, therefore, propionate synthesis. As a result, the propionogenic E. coli was further engineered by inactivation or overexpression of various genes involved in the glycerol dissimilation pathways and their individual genetic effects on propionate production were investigated. Generally, knocking out genes involved in glycerol dissimilation (except glpA) can minimize levels of solventogenesis and shift more dissimilated carbon flux toward the C3-fermentative pathway. For optimal propionate production with high C3:C2-fermentative product ratios, glycerol dissimilation should be channeled through the respiratory pathway and, upon suppressed solventogenesis with minimal production of highly reduced alcohols, the alternative NADH-consuming route associated with propionate synthesis can be critical for more flexible redox balancing. With the implementation of various biochemical and genetic strategies, high propionate titers of more than 11 g/L with high yields up to 0.4 g-propionate/g-glycerol (accounting for ~50 % of dissimilated glycerol) were achieved, demonstrating the

  16. Scale and Rotation Invariant Matching Using Linearly Augmented Trees.

    PubMed

    Jiang, Hao; Tian, Tai-Peng; Sclaroff, Stan

    2015-12-01

    We propose a novel linearly augmented tree method for efficient scale and rotation invariant object matching. The proposed method enforces pairwise matching consistency defined on trees, and high-order constraints on all the sites of a template. The pairwise constraints admit arbitrary metrics while the high-order constraints use L1 norms and therefore can be linearized. Such a linearly augmented tree formulation introduces hyperedges and loops into the basic tree structure. However, unlike a general loopy graph, its special structure allows us to relax and decompose the optimization into a sequence of tree matching problems that are efficiently solvable by dynamic programming. The proposed method also works on continuous scale and rotation parameters; matching at arbitrarily large scales is handled with the same efficiency. Our experiments on ground truth data and a variety of real images and videos show that the proposed method is efficient, accurate and reliable. PMID:26539858
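
    As a small illustration of the tree-matching building block that the decomposition relies on, the sketch below runs a generic min-sum dynamic program over a tree with unary and pairwise costs. It is not the paper's linearly augmented tree algorithm, which additionally handles the linearized high-order terms.

```python
# Generic min-sum dynamic program on a tree with unary and pairwise costs,
# as a small illustration of the tree-matching building block (this is not
# the paper's linearly augmented tree algorithm).

def tree_dp(children, labels, unary, pairwise, root=0):
    """
    children : dict node -> list of child nodes (tree structure)
    labels   : list of candidate labels (shared by all nodes, for simplicity)
    unary    : dict (node, label) -> cost
    pairwise : dict (parent_label, child_label) -> cost
    Returns the minimum total cost and an optimal label for every node.
    """
    best = {}      # (node, label) -> cost of the best labelling of that subtree
    choice = {}    # (node, label) -> chosen label for each child

    def solve(v):
        for c in children.get(v, []):
            solve(c)
        for lv in labels:
            cost, picks = unary[(v, lv)], {}
            for c in children.get(v, []):
                lc = min(labels, key=lambda l: pairwise[(lv, l)] + best[(c, l)])
                cost += pairwise[(lv, lc)] + best[(c, lc)]
                picks[c] = lc
            best[(v, lv)] = cost
            choice[(v, lv)] = picks

    def backtrack(v, lv, out):
        out[v] = lv
        for c, lc in choice[(v, lv)].items():
            backtrack(c, lc, out)

    solve(root)
    root_label = min(labels, key=lambda l: best[(root, l)])
    assignment = {}
    backtrack(root, root_label, assignment)
    return best[(root, root_label)], assignment

# Toy 3-node chain 0 -> 1 -> 2 with two candidate labels.
children = {0: [1], 1: [2], 2: []}
labels = ["A", "B"]
unary = {(0, "A"): 1, (0, "B"): 3, (1, "A"): 2, (1, "B"): 0, (2, "A"): 1, (2, "B"): 2}
pairwise = {("A", "A"): 0, ("A", "B"): 1, ("B", "A"): 1, ("B", "B"): 0}
print(tree_dp(children, labels, unary, pairwise))
```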

  17. Latent palmprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2009-06-01

    The evidential value of palmprints in forensic applications is clear as about 30 percent of the latents recovered from crime scenes are from palms. While biometric systems for palmprint-based personal authentication in access control type of applications have been developed, they mostly deal with low-resolution (about 100 ppi) palmprints and only perform full-to-full palmprint matching. We propose a latent-to-full palmprint matching system that is needed in forensic applications. Our system deals with palmprints captured at 500 ppi (the current standard in forensic applications) or higher resolution and uses minutiae as features to be compatible with the methodology used by latent experts. Latent palmprint matching is a challenging problem because latent prints lifted at crime scenes are of poor image quality, cover only a small area of the palm, and have a complex background. Other difficulties include a large number of minutiae in full prints (about 10 times as many as fingerprints), and the presence of many creases in latents and full prints. A robust algorithm to reliably estimate the local ridge direction and frequency in palmprints is developed. This facilitates the extraction of ridge and minutiae features even in poor quality palmprints. A fixed-length minutia descriptor, MinutiaCode, is utilized to capture distinctive information around each minutia and an alignment-based minutiae matching algorithm is used to match two palmprints. Two sets of partial palmprints (150 live-scan partial palmprints and 100 latent palmprints) are matched to a background database of 10,200 full palmprints to test the proposed system. Despite the inherent difficulty of latent-to-full palmprint matching, rank-1 recognition rates of 78.7 and 69 percent, respectively, were achieved in searching live-scan partial palmprints and latent palmprints against the background database.

  18. Stereolithographic Surgical Template: A Review

    PubMed Central

    Dandekeri, Shilpa Sudesh; Sowmya, M.K.; Bhandary, Shruthi

    2013-01-01

    Implant placement has become a routine modality of dental care. Improvements in surgical reconstructive methods, as well as increased prosthetic demands, require highly accurate diagnosis, planning and placement. Recently, computer-aided design and manufacturing have made it possible to use data from computerised tomography not only to plan implant rehabilitation, but also to transfer this information to the surgery. A review of one such technique, stereolithography, is presented in this article. It permits graphic and complex 3D implant placement and the fabrication of stereolithographic surgical templates, and it offers many significant benefits over traditional procedures. PMID:24179955

  19. Vertical Carbon Nanotube Device in Nanoporous Templates

    NASA Technical Reports Server (NTRS)

    Maschmann, Matthew Ralph (Inventor); Fisher, Timothy Scott (Inventor); Sands, Timothy (Inventor); Bashir, Rashid (Inventor)

    2014-01-01

    A modified porous anodic alumina template (PAA) containing a thin CNT catalyst layer directly embedded into the pore walls. CNT synthesis using the template selectively catalyzes SWNTs and DWNTs from the embedded catalyst layer to the top PAA surface, creating a vertical CNT channel within the pores. Subsequent processing allows for easy contact metallization and adaptable functionalization of the CNTs and template for a myriad of applications.

  20. High level waste interim storage architecture selection - decision report

    SciTech Connect

    Calmus, R.B.

    1996-09-27

    The U.S. Department of Energy (DOE) has embarked upon a course to acquire Hanford Site tank waste treatment and immobilization services using privatized facilities (RL 1996a). This plan contains a two-phased approach. Phase I is a proof-of-principle/commercial demonstration-scale effort and Phase II is a full-scale production effort. In accordance with the planned approach, interim storage and disposal of various products from privatized facilities are to be DOE furnished. The high-level waste (HLW) interim storage options, or alternative architectures, were identified and evaluated to provide the framework from which to select the most viable method of Phase I HLW interim storage (Calmus 1996). This evaluation, hereafter referred to as the Alternative Architecture Evaluation, was performed against established performance and risk criteria (technical merit, cost, schedule, etc.). Based on evaluation results, preliminary architectures and path-forward recommendations were provided for consideration in the architecture decision-making process. The decision-making process used for selection of a Phase I solidified HLW interim storage architecture was conducted in accordance with an approved Decision Plan (see the attachment). This decision process was based on TSEP-07, Decision Management Procedure (WHC 1995). The established decision process entailed a Decision Board, consisting of Westinghouse Hanford Company (WHC) management staff, and included appointment of a WHC Decision Maker. The Alternative Architecture Evaluation results and preliminary recommendations were presented to the Decision Board members for their consideration in the decision-making process. The Alternative Architecture Evaluation was prepared and issued before issuance of WHC-IP-1231, Alternatives Generation and Analysis Procedure (WHC 1996a), but was deemed by the Board to fully meet the intent of WHC-IP-1231. The Decision Board members concurred with the bulk of the Alternative Architecture

  1. JET MIXING ANALYSIS FOR SRS HIGH-LEVEL WASTE RECOVERY

    SciTech Connect

    Lee, S.

    2011-07-05

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank to ensure uniformity of the discharge stream. Mixing is accomplished with one to four slurry pumps located within the tank liquid. The slurry pumps may be fixed in position or they may rotate, depending on the specific mixing requirements. The high-level waste in Tank 48 contains insoluble solids in the form of potassium tetraphenyl borate compounds (KTPB), monosodium titanate (MST), and sludge. Tank 48 is equipped with 4 slurry pumps, which are intended to suspend the insoluble solids prior to transfer of the waste to the Fluidized Bed Steam Reformer (FBSR) process. The FBSR process is being designed for a normal feed of 3.05 wt% insoluble solids. A chemical characterization study has shown the insoluble solids concentration is approximately 3.05 wt% when well-mixed. The project is requesting a Computational Fluid Dynamics (CFD) mixing study from SRNL to determine the solids behavior with 2, 3, and 4 slurry pumps in operation and an estimate of the insoluble solids concentration at the suction of the transfer pump to the FBSR process. The impact of cooling coils is not considered in the current work. The work addresses two principal objectives using a CFD approach: (1) to estimate the insoluble solids concentration transferred from Tank 48 to the Waste Feed Tank in the FBSR process and (2) to assess the impact of different combinations of four slurry pumps on insoluble solids suspension and mixing in Tank 48. For this work, several different combinations of a maximum of four pumps are considered to determine the resulting flow patterns and local flow velocities, which are thought to be associated with sludge particle mixing. Two different elevations of pump nozzles are used for an assessment of the flow patterns on the tank mixing. Pump design and operating parameters used for the analysis are summarized in Table 1. The baseline

  2. High-level microsatellite instability in appendiceal carcinomas.

    PubMed

    Taggart, Melissa W; Galbincea, John; Mansfield, Paul F; Fournier, Keith F; Royal, Richard E; Overman, Michael J; Rashid, Asif; Abraham, Susan C

    2013-08-01

    High-level microsatellite instability (MSI-high) is found in approximately 15% of all colorectal adenocarcinomas (CRCs) and in at least 20% of right-sided cancers. It is most commonly due to somatic hypermethylation of the MLH1 gene promoter region, with familial cases (Lynch syndrome) representing only 2% to 3% of CRCs overall. In contrast to CRC, MSI-high in appendiceal adenocarcinomas is rare. Only 4 MSI-high appendiceal carcinomas and 1 MSI-high appendiceal serrated adenoma have been previously reported, and the prevalence of MSI in the appendix is unknown. We identified 108 appendiceal carcinomas from MD Anderson Cancer Center in which MSI status had been assessed by immunohistochemistry for the DNA mismatch-repair proteins MLH1, MSH2, MSH6, and PMS2 (n=83), polymerase chain reaction (n=7), or both (n=18). Three cases (2.8%) were MSI-high, and 1 was MSI-low. The 3 MSI-high cases included: (1) a poorly differentiated nonmucinous adenocarcinoma with loss of MLH1/PMS2 expression, lack of MLH1 promoter methylation, and lack of BRAF gene mutation, but no detected germline mutation in MLH1 from a 39-year-old man; (2) an undifferentiated carcinoma with loss of MSH2/MSH6, but no detected germline mutation in MSH2 or TACSTD1, from a 59-year-old woman; and (3) a moderately differentiated mucinous adenocarcinoma arising in a villous adenoma with loss of MSH2/MSH6 expression, in a 38-year-old man with a strong family history of CRC who declined germline testing. When the overall group of appendiceal carcinomas was classified according to histologic features and precursor lesions, the frequencies of MSI-high were: 3 of 108 (2.8%) invasive carcinomas, 3 of 96 (3.1%) invasive carcinomas that did not arise from a background of goblet cell carcinoid tumors, and 0 of 12 (0%) signet ring and mucinous carcinomas arising in goblet cell carcinoid tumors. These findings, in conjunction with the previously reported MSI-high appendiceal carcinomas, highlight the low prevalence of MSI

  3. Reusable, Extensible High-Level Data-Distribution Concept

    NASA Technical Reports Server (NTRS)

    James, Mark; Zima, Hans; Diaconescua, Roxana

    2007-01-01

    A framework for high-level specification of data distributions in data-parallel application programs has been conceived. [As used here, distributions signifies means to express locality (more specifically, locations of specified pieces of data) in a computing system composed of many processor and memory components connected by a network.] Inasmuch as distributions exert a great effect on the performances of application programs, it is important that a distribution strategy be flexible, so that distributions can be adapted to the requirements of those programs. At the same time, for the sake of productivity in programming and execution, it is desirable that users be shielded from such error-prone, tedious details as those of communication and synchronization. As desired, the present framework enables a user to refine a distribution type and adjust it to optimize the performance of an application program and conceals, from the user, the low-level details of communication and synchronization. The framework provides for a reusable, extensible, data-distribution design, denoted the design pattern, that is independent of a concrete implementation. The design pattern abstracts over coding patterns that have been found to be commonly encountered in both manually and automatically generated distributed parallel programs. The following description of the present framework is necessarily oversimplified to fit within the space available for this article. Distributions are among the elements of a conceptual data-distribution machinery, some of the other elements being denoted domains, index sets, and data collections (see figure). Associated with each domain is one index set and one distribution. A distribution class interface (where "class" is used in the object-oriented-programming sense) includes operations that enable specification of the mapping of an index to a unit of locality. Thus, "Map(Index)" specifies a unit, while "LocalLayout(Index)" specifies the local address
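
    A hedged illustration of the interface sketched above: Map(Index) names the unit of locality that owns an index, and LocalLayout(Index) gives the address within that unit. The block distribution below is an assumed, simplified concrete distribution, not the framework's implementation.

    class BlockDistribution:
        def __init__(self, n_indices, n_units):
            self.block = -(-n_indices // n_units)   # ceiling division: block size per unit

        def map(self, index):
            # Unit of locality (processor/memory component) owning `index`.
            return index // self.block

        def local_layout(self, index):
            # Local address of `index` within its owning unit.
            return index % self.block

    dist = BlockDistribution(n_indices=100, n_units=4)
    print(dist.map(37), dist.local_layout(37))   # -> 1 12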

  4. Computer-assisted total knee arthroplasty using patient-specific templating.

    PubMed

    Hafez, M A; Chelule, K L; Seedhom, B B; Sherman, K P

    2006-03-01

    Current techniques used for total knee arthroplasty rely on conventional instrumentation that violates the intramedullary canals. Accuracy of the instrumentation is questionable, and assembly and disposal of the numerous pieces is time consuming. Navigation techniques are more accurate, but their broad application is limited by cost and complexity. We aimed to prove a new concept of computer-assisted preoperative planning to provide patient-specific templates that can replace conventional instruments. Computed tomography-based planning was used to design two virtual templates. Using rapid prototyping technology, virtual templates were transferred into physical templates (cutting blocks) with surfaces that matched the distal femur and proximal tibia. We performed 45 total knee arthroplasties on 16 cadaveric and 29 plastic knees, including a comparative trial against conventional instrumentations. All operations were performed using patient-specific templates with no conventional instrumentations, intramedullary perforation, tracking, or registration. The mean time for bone cutting was 9 minutes with a surgical assistant and 11 minutes without an assistant. Computer-assisted analyses of six random computed tomography scans showed mean errors for alignment and bone resection within 1.7 degrees and 0.8 mm (maximum, 2.3 degrees and 1.2 mm, respectively). Patient-specific templates are a practical alternative to conventional instrumentations, but additional clinical validation is required before clinical use.

  5. Activation of new attentional templates for real-world objects in visual search.

    PubMed

    Nako, Rebecca; Smith, Tim J; Eimer, Martin

    2015-05-01

    Visual search is controlled by representations of target objects (attentional templates). Such templates are often activated in response to verbal descriptions of search targets, but it is unclear whether search can be guided effectively by such verbal cues. We measured ERPs to track the activation of attentional templates for new target objects defined by word cues. On each trial run, a word cue was followed by three search displays that contained the cued target object among three distractors. Targets were detected more slowly in the first display of each trial run, and the N2pc component (an ERP marker of attentional target selection) was attenuated and delayed for the first relative to the two successive presentations of a particular target object, demonstrating limitations in the ability of word cues to activate effective attentional templates. N2pc components to target objects in the first display were strongly affected by differences in object imageability (i.e., the ability of word cues to activate a target-matching visual representation). These differences were no longer present for the second presentation of the same target objects, indicating that a single perceptual encounter is sufficient to activate a precise attentional template. Our results demonstrate the superiority of visual over verbal target specifications in the control of visual search, highlight the fact that verbal descriptions are more effective for some objects than others, and suggest that the attentional templates that guide search for particular real-world target objects are analog visual representations.

  6. Nonlinear matching measure for the analysis of on-off type DNA microarray images

    NASA Astrophysics Data System (ADS)

    Kim, Jong D.; Park, Misun; Kim, Jongwon

    2003-07-01

    In this paper, we propose a new nonlinear matching measure for automatic analysis of the on-off type DNA microarray images in which the hybridized spots are detected by the template matching method. The targeting spots of HPV DNA chips are designed for genotyping the human papilloma virus (HPV). The proposed measure is obtained by binary thresholding over the whole template region and taking the number of white pixels inside the spotted area. This measure is evaluated in terms of the accuracy of the estimated marker location to show better performance than the normalized covariance.
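
    A minimal sketch of the measure as described: binary-threshold the template region and count the white pixels that fall inside the expected spot area. The threshold rule and the spot mask geometry are assumptions for illustration.

    import numpy as np

    def spot_measure(region, spot_mask, threshold=None):
        # region: 2D grayscale array covering the template window;
        # spot_mask: boolean array of the same shape, True inside the expected spot.
        if threshold is None:
            threshold = region.mean()        # assumed simple global threshold
        binary = region > threshold          # white pixels after binarization
        return int(np.count_nonzero(binary & spot_mask))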

  7. Influence of template fill in graphoepitaxy DSA

    NASA Astrophysics Data System (ADS)

    Doise, Jan; Bekaert, Joost; Chan, Boon Teik; Hong, SungEun; Lin, Guanyang; Gronheid, Roel

    2016-03-01

    Directed self-assembly (DSA) of block copolymers (BCP) is considered a promising patterning approach for the 7 nm node and beyond. Specifically, a grapho-epitaxy process using a cylindrical phase BCP may offer an efficient solution for patterning randomly distributed contact holes with sub-resolution pitches, such as found in via and cut mask levels. In any grapho-epitaxy process, the pattern density impacts the template fill (local BCP thickness inside the template) and may cause defects due to respectively over- or underfilling of the template. In order to tackle this issue thoroughly, the parameters that determine template fill and the influence of template fill on the resulting pattern should be investigated. In this work, using three process flow variations (with different template surface energy), template fill is experimentally characterized as a function of pattern density and film thickness. The impact of these parameters on template fill is highly dependent on the process flow, and thus pre-pattern surface energy. Template fill has a considerable effect on the pattern transfer of the DSA contact holes into the underlying layer. Higher fill levels give rise to smaller contact holes and worse critical dimension uniformity. These results are important towards DSA-aware design and show that fill is a crucial parameter in grapho-epitaxy DSA.

  8. Method of installing sub-sea templates

    SciTech Connect

    Hampton, J.E.

    1984-03-06

    A subsea template is installed by a method which includes the steps of securing the template in a position beneath the deck of a semi-submersible drilling vessel, moving the semi-submersible drilling vessel to an appropriate offshore site and subsequently lowering the template from the semi-submersible to the sea bed. In addition, at least three anchorage templates may be loaded onto one or both of the pontoons of the semi-submersible drilling vessel at its original position and are subsequently lowered from the pontoons to their respective locations on the sea bed after the semi-submersible has moved to the offshore site.

  9. Cementitious Grout for Closing SRS High Level Waste Tanks - 12315

    SciTech Connect

    Langton, C.A.; Stefanko, D.B.; Burns, H.H.; Waymer, J.; Mhyre, W.B.; Herbert, J.E.; Jolly, J.C. Jr.

    2012-07-01

    In 1997, the first two United States Department of Energy (US DOE) high level waste tanks (Tanks 17-F and 20-F: Type IV, single shell tanks) were taken out of service (permanently closed) at the Savannah River Site (SRS). In 2012, the DOE plans to remove from service two additional Savannah River Site (SRS) Type IV high-level waste tanks, Tanks 18-F and 19-F. These tanks were constructed in the late 1950's and received low-heat waste and do not contain cooling coils. Operational closure of Tanks 18-F and 19-F is intended to be consistent with the applicable requirements of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and will be performed in accordance with South Carolina Department of Health and Environmental Control (SCDHEC) requirements. The closure will physically stabilize two 4.92E+03 cubic meter (1.3E+06 gallon) carbon steel tanks and isolate and stabilize any residual contaminants left in the tanks. Ancillary equipment abandoned in the tanks will also be filled to the extent practical. A Performance Assessment (PA) has been developed to assess the long-term fate and transport of residual contamination in the environment resulting from the operational closure of the F-Area Tank Farm (FTF) waste tanks. Next generation flowable, zero-bleed cementitious grouts were designed, tested, and specified for closing Tanks 18-F and 19-F and for filling the abandoned equipment. Fill requirements were developed for both the tank and equipment grouts. All grout formulations were required to be alkaline with a pH of 12.4 and to be chemically reducing with a reduction potential (Eh) of -200 to -400. Grouts with this chemistry stabilize potential contaminants of concern. This was achieved by including Portland cement and Grade 100 slag in the mixes, respectively. Ingredients and proportions of cementitious reagents were selected and adjusted to support the mass placement strategy developed by

  10. Colloidal assembly by ice templating.

    PubMed

    Kumaraswamy, Guruswamy; Biswas, Bipul; Choudhury, Chandan Kumar

    2016-01-01

    We investigate ice templating of aqueous dispersions of polymer-coated colloids and crosslinkers, at particle concentrations far below those required to form percolated monoliths. Freezing the aqueous dispersions forces the particles into close proximity to form clusters that are held together as the polymer chains coating the particles are crosslinked. We observe that, with an increase in the particle concentration from about 10^6 to 10^8 particles per ml, there is a transition from isolated single particles to increasingly larger clusters. In this concentration range, most of the colloidal clusters formed are linear or sheet-like particle aggregates. Remarkably, the cluster size distribution for clusters smaller than about 30 particles, as well as the size distribution of linear clusters, is only weakly dependent on the dispersion concentration in the range that we investigate. We demonstrate that the main features of cluster formation are captured by kinetic simulations that do not consider hydrodynamics or instabilities at the growing ice front due to particle concentration gradients. Thus, clustering of colloidal particles by ice templating dilute dispersions appears to be governed only by particle exclusion by the growing ice crystals that leads to their accumulation at ice crystal boundaries.

  11. Advanced High-Level Waste Glass Research and Development Plan

    SciTech Connect

    Peeler, David K.; Vienna, John D.; Schweiger, Michael J.; Fox, Kevin M.

    2015-07-01

    The U.S. Department of Energy Office of River Protection (ORP) has implemented an integrated program to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product quality requirements. The integrated ORP program is focused on providing a technical, science-based foundation from which key decisions can be made regarding the successful operation of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) facilities. The fundamental data stemming from this program will support development of advanced glass formulations, key process control models, and tactical processing strategies to ensure safe and successful operations for both the low-activity waste (LAW) and high-level waste (HLW) vitrification facilities with an appreciation toward reducing overall mission life. The purpose of this advanced HLW glass research and development plan is to identify the near-, mid-, and longer-term research and development activities required to develop and validate advanced HLW glasses and their associated models to support facility operations at WTP, including both direct feed and full pretreatment flowsheets. This plan also integrates technical support of facility operations and waste qualification activities to show the interdependence of these activities with the advanced waste glass (AWG) program to support the full WTP mission. Figure ES-1 shows these key ORP programmatic activities and their interfaces with both WTP facility operations and qualification needs. The plan is a living document that will be updated to reflect key advancements and mission strategy changes. The research outlined here is motivated by the potential for substantial economic benefits (e.g., significant increases in waste throughput and reductions in glass volumes) that will be realized when advancements in glass formulation continue and models supporting facility operations are implemented. Developing and applying advanced

  12. Partial hue-matching.

    PubMed

    Logvinenko, Alexander D; Beattie, Lesley L

    2011-01-01

    It is widely believed that color can be decomposed into a small number of component colors. Particularly, each hue can be described as a combination of a restricted set of component hues. Methods, such as color naming and hue scaling, aim at describing color in terms of the relative amount of the component hues. However, there is no consensus on the nomenclature of component hues. Moreover, the very notion of hue (not to mention component hue) is usually defined verbally rather than perceptually. In this paper, we make an attempt to operationalize such a fundamental attribute of color as hue without the use of verbal terms. Specifically, we put forth a new method--partial hue-matching--that is based on judgments of whether two colors have some hue in common. It allows a set of component hues to be established objectively, without resorting to verbal definitions. Specifically, the largest sets of color stimuli, all of which partially match each other (referred to as chromaticity classes), can be derived from the observer's partial hue-matches. A chromaticity class proves to consist of all color stimuli that contain a particular component hue. Thus, the chromaticity classes fully define the set of component hues. Using samples of Munsell papers, a few experiments on partial hue-matching were carried out with twelve inexperienced normal trichromatic observers. The results reinforce the classical notion of four component hues (yellow, blue, red, and green). Black and white (but not gray) were also found to be component colors. PMID:21742961
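
    One way to read the construction of chromaticity classes is graph-theoretic: treat each reported partial hue-match as an edge between stimuli, so the "largest sets of color stimuli, all of which partially match each other" are the maximal cliques. The graph encoding and the use of networkx below are illustrative assumptions, not the authors' procedure.

    import networkx as nx

    def chromaticity_classes(stimuli, partially_match):
        # stimuli: list of stimulus ids; partially_match(a, b) -> bool from observer judgments.
        G = nx.Graph()
        G.add_nodes_from(stimuli)
        G.add_edges_from((a, b) for i, a in enumerate(stimuli)
                         for b in stimuli[i + 1:] if partially_match(a, b))
        return [set(c) for c in nx.find_cliques(G)]   # maximal cliques = candidate chromaticity classes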

  13. Inter-image matching

    NASA Technical Reports Server (NTRS)

    Wolfe, R. H., Jr.; Juday, R. D.

    1982-01-01

    Interimage matching is the process of determining the geometric transformation required to conform spatially one image to another. In principle, the parameters of that transformation are varied until some measure of the difference between the two images is minimized or some measure of sameness (e.g., cross-correlation) is maximized. The number of such parameters to vary is fairly large (six for merely an affine transformation), and it is customary to attempt an a priori transformation reducing the complexity of the residual transformation, or to subdivide the image into match zones (control points or patches) small enough that a simple transformation (e.g., pure translation) is applicable, yet large enough to facilitate matching. In the latter case, a complex mapping function is fit to the results (e.g., translation offsets) in all the patches. The methods reviewed have all chosen one or both of the above options, ranging from a priori along-line correction for line-dependent effects (the high-frequency correction) to a full sensor-to-geobase transformation with subsequent subdivision into a grid of match points.
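
    A minimal sketch of the translation-only patch matching described for the latter case: slide a reference patch over a search window and keep the offset that maximizes a normalized cross-correlation score. Patch and window sizes, and the scoring choice, are assumptions.

    import numpy as np

    def best_offset(patch, search, max_shift=8):
        # `search` is assumed to be the patch-sized neighborhood padded by max_shift pixels on every side.
        ph, pw = patch.shape
        p = (patch - patch.mean()) / (patch.std() + 1e-9)
        best, best_dxdy = -np.inf, (0, 0)
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                y0, x0 = max_shift + dy, max_shift + dx
                window = search[y0:y0 + ph, x0:x0 + pw]
                w = (window - window.mean()) / (window.std() + 1e-9)
                score = float((p * w).mean())   # normalized cross-correlation
                if score > best:
                    best, best_dxdy = score, (dx, dy)
        return best_dxdy, best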

  14. MATCH PLAY, SOAP HOPE.

    PubMed

    Rigby, Perry G; Gururaja, Ramnarayan Paragi; Hilton, Charles

    2015-01-01

    The Medical Education Commission (MEC) has published Graduate Medical Education (GME) data since 1997, including the National Residency Matching Program (NRMP) and the Supplemental Offer and Acceptance Program (SOAP), and totals all GME in Louisiana for annual publication. The NRMP provides the quotas and filled positions by institution. Following the NRMP, SOAP attempts to place unmatched candidates with slots that are unfilled. The NRMP Fellowship match also comes close to filling quotas and has a significant SOAP. Thus, an accurate number of total filled positions is best obtained in July of the same match year. All GME programs in Louisiana are represented for 2014, and the number trend 2005 to 2014 shows that the only dip was post-Katrina in 2005-2006. The March match after SOAP 2014 is at the peak for both senior medical students and post graduate year one (PGY-1) residents. A significant and similar number stay in Louisiana GME institutions after graduation. Also noteworthy is that a lower percentage are staying in state, due to increased enrollment in all Louisiana medical schools. PMID:27159458

  15. Derivatives of Matching.

    ERIC Educational Resources Information Center

    Herrnstein, R. J.

    1979-01-01

    The matching law for reinforced behavior solves a differential equation relating infinitesimal changes in behavior to infinitesimal changes in reinforcement. The equation expresses plausible conceptions of behavior and reinforcement, yields a simple nonlinear operator model for acquisition, and suggests an alternative to the economic law of…
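
    For reference, one standard statement of the matching law in LaTeX (the ERIC abstract is truncated, and the paper's own differential formulation is not reproduced here):

    \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2},
    \qquad
    B = \frac{kR}{R + R_e},

    where B denotes response rate, R reinforcement rate, and k and R_e are fitted constants in Herrnstein's hyperbolic single-alternative form.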

  16. Is Matching Innate?

    ERIC Educational Resources Information Center

    Gallistel, C. R.; King, Adam Philip; Gottlieb, Daniel; Balci, Fuat; Papachristos, Efstathios B.; Szalecki, Matthew; Carbone, Kimberly S.

    2007-01-01

    Experimentally naive mice matched the proportions of their temporal investments (visit durations) in two feeding hoppers to the proportions of the food income (pellets per unit session time) derived from them in three experiments that varied the coupling between the behavioral investment and food income, from no coupling to strict coupling.…

  17. Visual Templates in Pattern Generalization Activity

    ERIC Educational Resources Information Center

    Rivera, F. D.

    2010-01-01

    In this research article, I present evidence of the existence of visual templates in pattern generalization activity. Such templates initially emerged from a 3-week design-driven classroom teaching experiment on pattern generalization involving linear figural patterns and were assessed for existence in a clinical interview that was conducted four…

  18. Indexing Images: Testing an Image Description Template.

    ERIC Educational Resources Information Center

    Jorgensen, Corinne

    1996-01-01

    A template for pictorial image description to be used by novice image searchers in recording their descriptions of images was tested; image attribute classes derived in previous research were used to model the template. Results indicated that users may need training and/or more guidance to correctly assign descriptors to higher-level classes.…

  19. Air Sampling System Evaluation Template

    2000-05-09

    The ASSET1.0 software provides a template with which a user can evaluate an Air Sampling System against the latest version of ANSI N13.1 "Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities". The software uses the ANSI N13.1 PIC levels to establish basic design criteria for the existing or proposed sampling system. The software looks at such criteria as PIC level, type of radionuclide emissions, physical state of the radionuclide, nozzle entrance effects, particulate transmission effects, system and component accuracy and precision evaluations, and basic system operations to provide a detailed look at the subsystems of a monitoring and sampling system/program. A GAP evaluation can then be completed which leads to identification of design and operational flaws in the proposed systems. Corrective measures can then be limited to the GAPs.

  20. Solvable model for template coexistence in protocells

    NASA Astrophysics Data System (ADS)

    Fontanari, J. F.; Serva, M.

    2013-02-01

    Compartmentalization of self-replicating molecules (templates) in protocells is a necessary step towards the evolution of modern cells. However, coexistence between distinct template types inside a protocell can be achieved only if there is a selective pressure favoring protocells with a mixed template composition. Here we study analytically a group selection model for the coexistence between two template types using the diffusion approximation of population genetics. The model combines competition at the template and protocell levels as well as genetic drift inside protocells. At the steady state, we find a continuous phase transition separating the coexistence and segregation regimes, with the order parameter vanishing linearly with the distance to the critical point. In addition, we derive explicit analytical expressions for the critical steady-state probability density of protocell compositions.
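
    The reported critical behavior can be written schematically in LaTeX (notation assumed, since the abstract does not name the control parameter):

    m \propto (\lambda - \lambda_c) \quad \text{as } \lambda \to \lambda_c^{+},

    with m the order parameter for template coexistence and \lambda_c the critical value of the group-selection control parameter.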

  1. Nanoimprint lithography using disposable biomass template

    NASA Astrophysics Data System (ADS)

    Hanabata, Makoto; Takei, Satoshi; Sugahara, Kigen; Nakajima, Shinya; Sugino, Naoto; Kameda, Takao; Fukushima, Jiro; Matsumoto, Yoko; Sekiguchi, Atsushi

    2016-04-01

    A novel nanoimprint lithography process using a disposable, gas-permeable biomass template was investigated. It was found that a disposable biomass template derived from cellulose materials shows excellent gas permeability and decreases the transcription defects seen with conventional templates such as quartz, PDMS, and DLC, which have no gas permeability. We believe that outgassing from the imprinted materials is easily removed through the template. The approach of using cellulose as the template material is suitable as a next-generation clean separation technology and is expected to be one of the defect-less thermal nanoimprint lithographic technologies. It is also expected that volatile and solvent-containing materials, which often create defects and peeling in conventional templates without gas permeability, become usable.

  2. High-level waste program progress report, April 1, 1980-June 30, 1980

    SciTech Connect

    1980-08-01

    The highlights of this report are on: waste management analysis for nuclear fuel cycles; fixation of waste in concrete; study of ceramic and cermet waste forms; alternative high-level waste forms development; and high-level waste container development.

  3. Tunable compression of template banks for fast gravitational-wave detection and localization

    NASA Astrophysics Data System (ADS)

    Chua, Alvin J. K.; Gair, Jonathan R.

    2016-06-01

    One strategy for reducing the online computational cost of matched-filter searches for gravitational waves is to introduce a compressed basis for the waveform template bank in a grid-based search. In this paper, we propose and investigate several tunable compression schemes for a general template bank. Through offline compression, such schemes are shown to yield faster detection and localization of signals, along with moderately improved sensitivity and accuracy over coarsened banks at the same level of computational cost. This is potentially useful for any search involving template banks, and especially in the analysis of data from future space-based detectors such as eLISA, for which online grid searches are difficult due to the long-duration waveforms and large parameter spaces.
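
    A hedged sketch of one generic offline compression strategy for a template bank: stack the whitened templates, take a truncated SVD, and filter the data against the reduced basis instead of against every template. This standard reduced-basis idea is used here only for illustration; the paper's specific tunable schemes are not reproduced.

    import numpy as np

    def compress_bank(templates, n_keep):
        # templates: (n_templates, n_samples) array of whitened waveforms.
        U, s, Vt = np.linalg.svd(templates, full_matrices=False)
        basis = Vt[:n_keep]              # reduced orthonormal basis vectors
        coeffs = templates @ basis.T     # each template expressed in that basis
        return basis, coeffs

    def overlaps(data, basis, coeffs):
        # Approximate all template-data inner products through the compressed basis.
        proj = basis @ data              # project the data segment once
        return coeffs @ proj             # recover every template overlap cheaply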

  4. Analysis of tumor template from multiple compartments in a blood sample provides complementary access to peripheral tumor biomarkers

    PubMed Central

    Strauss, William M.; Carter, Chris; Simmons, Jill; Klem, Erich; Goodman, Nathan; Vahidi, Behrad; Romero, Juan; Masterman-Smith, Michael; O'Regan, Ruth; Gogineni, Keerthi; Schwartzberg, Lee; Austin, Laura K.; Dempsey, Paul W.; Cristofanilli, Massimo

    2016-01-01

    Targeted cancer therapeutics are promised to have a major impact on cancer treatment and survival. Successful application of these novel treatments requires a molecular definition of a patient's disease typically achieved through the use of tissue biopsies. Alternatively, allowing longitudinal monitoring, biomarkers derived from blood, isolated either from circulating tumor cell derived DNA (ctcDNA) or circulating cell-free tumor DNA (ccfDNA) may be evaluated. In order to use blood derived templates for mutational profiling in clinical decisions, it is essential to understand the different template qualities and how they compare to biopsy derived template DNA as both blood-based templates are rare and distinct from the gold-standard. Using a next generation re-sequencing strategy, concordance of the mutational spectrum was evaluated in 32 patient-matched ctcDNA and ccfDNA templates with comparison to tissue biopsy derived DNA template. Different CTC antibody capture systems for DNA isolation from patient blood samples were also compared. Significant overlap was observed between ctcDNA, ccfDNA and tissue derived templates. Interestingly, if the results of ctcDNA and ccfDNA template sequencing were combined, productive samples showed similar detection frequency (56% vs 58%), were temporally flexible, and were complementary both to each other and the gold standard. These observations justify the use of a multiple template approach to the liquid biopsy, where germline, ctcDNA, and ccfDNA templates are employed for clinical diagnostic purposes and open a path to comprehensive blood derived biomarker access. PMID:27049831

  5. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  6. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  7. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  8. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  9. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  10. Multiple template-based fluoroscopic tracking of lung tumor mass without implanted fiducial markers

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Dy, Jennifer G.; Sharp, Gregory C.; Alexander, Brian; Jiang, Steve B.

    2007-10-01

    Precise lung tumor localization in real time is particularly important for some motion management techniques, such as respiratory gating or beam tracking with a dynamic multi-leaf collimator, due to the reduced clinical tumor volume (CTV) to planning target volume (PTV) margin and/or the escalated dose. There might be large uncertainties in deriving tumor position from external respiratory surrogates. While tracking implanted fiducial markers has sufficient accuracy, this procedure may not be widely accepted due to the risk of pneumothorax. Previously, we have developed a technique to generate gating signals from fluoroscopic images without implanted fiducial markers using a template matching method (Berbeco et al 2005 Phys. Med. Biol. 50 4481-90, Cui et al 2007 Phys. Med. Biol. 52 741-55). In this paper, we present an extension of this method to multiple-template matching for directly tracking the lung tumor mass in fluoroscopy video. The basic idea is as follows: (i) during the patient setup session, a pair of orthogonal fluoroscopic image sequences are taken and processed off-line to generate a set of reference templates that correspond to different breathing phases and tumor positions; (ii) during treatment delivery, fluoroscopic images are continuously acquired and processed; (iii) the similarity between each reference template and the processed incoming image is calculated; (iv) the tumor position in the incoming image is then estimated by combining the tumor centroid coordinates in reference templates with proper weights based on the measured similarities. With different handling of image processing and similarity calculation, two such multiple-template tracking techniques have been developed: one based on motion-enhanced templates and Pearson's correlation score while the other based on eigen templates and mean-squared error. The developed techniques have been tested on six sequences of fluoroscopic images from six lung cancer patients against the reference

  11. Multiple template-based fluoroscopic tracking of lung tumor mass without implanted fiducial markers.

    PubMed

    Cui, Ying; Dy, Jennifer G; Sharp, Gregory C; Alexander, Brian; Jiang, Steve B

    2007-10-21

    Precise lung tumor localization in real time is particularly important for some motion management techniques, such as respiratory gating or beam tracking with a dynamic multi-leaf collimator, due to the reduced clinical tumor volume (CTV) to planning target volume (PTV) margin and/or the escalated dose. There might be large uncertainties in deriving tumor position from external respiratory surrogates. While tracking implanted fiducial markers has sufficient accuracy, this procedure may not be widely accepted due to the risk of pneumothorax. Previously, we have developed a technique to generate gating signals from fluoroscopic images without implanted fiducial markers using a template matching method (Berbeco et al 2005 Phys. Med. Biol. 50 4481-90, Cui et al 2007 Phys. Med. Biol. 52 741-55). In this paper, we present an extension of this method to multiple-template matching for directly tracking the lung tumor mass in fluoroscopy video. The basic idea is as follows: (i) during the patient setup session, a pair of orthogonal fluoroscopic image sequences are taken and processed off-line to generate a set of reference templates that correspond to different breathing phases and tumor positions; (ii) during treatment delivery, fluoroscopic images are continuously acquired and processed; (iii) the similarity between each reference template and the processed incoming image is calculated; (iv) the tumor position in the incoming image is then estimated by combining the tumor centroid coordinates in reference templates with proper weights based on the measured similarities. With different handling of image processing and similarity calculation, two such multiple-template tracking techniques have been developed: one based on motion-enhanced templates and Pearson's correlation score while the other based on eigen templates and mean-squared error. The developed techniques have been tested on six sequences of fluoroscopic images from six lung cancer patients against the reference
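
    A minimal sketch of step (iv) as described in this record and the preceding one: estimate the tumor position as a weighted combination of the reference-template centroids, with weights derived from the measured similarities. The softmax-style weighting below is an assumption; the paper's scores are based on Pearson's correlation or mean-squared error.

    import numpy as np

    def estimate_position(similarities, centroids, temperature=0.05):
        # similarities: (n_templates,) scores; centroids: (n_templates, 2) template (x, y) positions.
        s = np.asarray(similarities, dtype=float)
        w = np.exp((s - s.max()) / temperature)   # assumed softmax-style weighting
        w /= w.sum()
        return w @ np.asarray(centroids, dtype=float)   # weighted centroid estimate (x, y)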

  12. Template optimization and transfer in perceptual learning.

    PubMed

    Kurki, Ilmari; Hyvärinen, Aapo; Saarinen, Jussi

    2016-08-01

    We studied how learning changes the processing of a low-level Gabor stimulus, using a classification-image method (psychophysical reverse correlation) and a task where observers discriminated between slight differences in the phase (relative alignment) of a target Gabor in visual noise. The method estimates the internal "template" that describes how the visual system weights the input information for decisions. One popular idea has been that learning makes the template more like an ideal Bayesian weighting; however, the evidence has been indirect. We used a new regression technique to directly estimate the template weight change and to test whether the direction of reweighting is significantly different from an optimal learning strategy. The subjects trained the task for six daily sessions, and we tested the transfer of training to a target in an orthogonal orientation. Strong learning and partial transfer were observed. We tested whether task precision (difficulty) had an effect on template change and transfer: Observers trained in either a high-precision (small, 60° phase difference) or a low-precision task (180°). Task precision did not have an effect on the amount of template change or transfer, suggesting that task precision per se does not determine whether learning generalizes. Classification images show that training made observers use more task-relevant features and unlearn some irrelevant features. The transfer templates resembled partially optimized versions of templates in training sessions. The template change direction resembles ideal learning significantly but not completely. The amount of template change was highly correlated with the amount of learning. PMID:27559720
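
    For orientation, a minimal classification-image (reverse-correlation) estimate: approximate the template as the difference between the average noise fields on trials sorted by the observer's response. The weighted-regression refinement used in the study is not reproduced; names below are illustrative.

    import numpy as np

    def classification_image(noise_fields, responses):
        # noise_fields: (n_trials, h, w) noise added to the stimulus on each trial;
        # responses: (n_trials,) boolean, True for one of the two response categories.
        noise = np.asarray(noise_fields, dtype=float)
        r = np.asarray(responses, dtype=bool)
        return noise[r].mean(axis=0) - noise[~r].mean(axis=0)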

  13. Templated Growth of Magnetic Recording Media

    NASA Astrophysics Data System (ADS)

    Sundar, Vignesh

    Current and potential next-generation magnetic recording technologies are based on the writing and reading of bits on a magnetic thin film with a granular microstructure, with grains of the magnetic material surrounded by an amorphous segregant. In order to realize the highest achievable data storage capabilities, there is a need for better control of the magnetic media microstructure, particularly in terms of minimizing grain size and grain boundary thickness distributions. In this work, a guided magnetic media growth is attempted by creating a pre-fabricated template with a specific material and morphology. The template is designed in such a way that, when magnetic media consisting of the magnetic alloy and segregant are sputtered, the sites on the template result in a controlled two-phase growth of magnetic media. The template is fabricated using self-assembling block copolymers, which can be used to fabricate nanostructures with a regular hexagonal lattice of spheres of one block in the other's matrix. These are then used as etch-masks to fabricate the template. In this thesis, we describe the approach used to fabricate these templates and demonstrate the two-phase growth of magnetic recording media. In such an approach, the magnetic grain size is defined by the uniform pitch of the block copolymer pattern, resulting in a uniform microstructure with much better grain size distribution than can be obtained with conventional un-templated media growth. The templated growth technique is also a suitable additive technique for the fabrication of Bit Patterned Media, another potential next-generation technology wherein the magnetic bits are isolated patterned islands. Combining nanoimprint lithography with templated growth, we can generate a long range spatially ordered array of magnetic islands with no etching of the magnetic material.

  14. Beer promotes high levels of alcohol intake in adolescent and adult alcohol-preferring rats.

    PubMed

    Hargreaves, Garth A; Wang, Emyo Y J; Lawrence, Andrew J; McGregor, Iain S

    2011-08-01

    Previous studies suggest that high levels of alcohol consumption can be obtained in laboratory rats by using beer as a test solution. The present study extended these observations to examine the intake of beer and equivalent dilute ethanol solutions with an inbred line of alcohol-preferring P rats. In Experiment 1, male adolescent P rats and age-matched Wistar rats had access to either beer or equivalent ethanol solutions for 1h daily in a custom-built lickometer apparatus. In subsequent experiments, adolescent (Experiment 2) and adult (Experiment 3) male P rats were given continuous 24-h home cage access to beer or dilute ethanol solutions, with concomitant access to lab chow and water. In each experiment, the alcohol content of the beer and dilute ethanol solutions was gradually increased from 0.4, 1.4, 2.4, 3.4, 4.4, 5 to 10% EtOH (vol/vol). All three experiments showed a major augmentation of alcohol intake when rats were given beer compared with equivalent ethanol solutions. In Experiment 1, the overall intake of beer was higher in P rats compared with Wistar rats, but no strain difference was found during the 1-h sessions with plain ethanol consumption. Experiment 1 also showed that an alcohol deprivation effect was more readily obtained in rats with a history of consuming beer rather than plain ethanol solutions. In Experiments 2 and 3, voluntary beer intake in P rats represented ethanol intake of 10-15 g/kg/day, among the highest reported in any study with rats. This excessive consumption was most apparent in adolescent rats. Beer consumption markedly exceeded plain ethanol intake in these experiments except at the highest alcohol concentration (10%) tested. The advantage of using beer rather than dilute ethanol solutions in both selected and nonselected rat strains is therefore confirmed. Our findings encourage the use of beer with alcohol-preferring rats in future research that seeks to obtain high levels of alcohol self-administration.

  15. Feasibility study of patient-specific surgical templates for the fixation of pedicle screws.

    PubMed

    Salako, F; Aubin, C-E; Fortin, C; Labelle, H

    2002-01-01

    Surgery for scoliosis, as well as other posterior spinal surgeries, frequently uses pedicle screws to fix instrumentation to the spine. Misplacement of a screw can lead to intra- and post-operative complications. The objective of this study is to design patient-specific surgical templates to guide the drilling operation. From the CT-scan of a vertebra, the optimal drilling direction and limit angles are computed from an inverse projection of the pedicle limits. The first template design uses a surface-to-surface registration method and was constructed in a CAD system by subtracting the vertebra from a rectangular prism and a cylinder with the optimal orientation. This template and the vertebra were built using rapid prototyping. The second design uses a point-to-surface registration method and has 6 adjustable screws to adjust the orientation and length of the drilling support device. A mechanism was designed to hold it in place on the spinous process. A virtual prototype was built with CATIA software. During the operation, the surgeon places either template on the patient's vertebra until a perfect match is obtained before drilling. The second design seems better than the first because it can be reused on different vertebrae and is less sensitive to registration errors. The next step is to build the second design and perform experimental and simulation tests to evaluate the benefits of this template during a scoliosis operation.

  16. Quantum Matching Pennies Game

    NASA Astrophysics Data System (ADS)

    Iqbal, Azhar; Abbott, Derek

    2009-01-01

    A quantum version of the matching pennies (MP) game is proposed that is played using an Einstein-Podolsky-Rosen-Bohm (EPR-Bohm) setting. We construct the quantum game without using state vectors, while considering only the quantum mechanical joint probabilities relevant to the EPR-Bohm setting. We embed the classical game within the quantum game such that the classical MP game results when the quantum mechanical joint probabilities become factorizable. We report new Nash equilibria in the quantum MP game that emerge when the quantum mechanical joint probabilities maximally violate the Clauser-Horne-Shimony-Holt form of Bell’s inequality.
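
    For reference, the Clauser-Horne-Shimony-Holt form of Bell's inequality mentioned above, written in LaTeX for the correlators E over the two measurement settings per player:

    |E(a, b) + E(a, b') + E(a', b) - E(a', b')| \le 2,

    which holds whenever the joint probabilities are factorizable; quantum mechanics permits violations up to the Tsirelson bound of 2\sqrt{2}.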

  17. Apfel's excellent match

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Apfel's excellent match: This series of photos shows a water drop containing a surfactant (Triton-100) as it experiences a complete cycle of superoscillation on U.S. Microgravity Lab-2 (USML-2; October 1995). The time in seconds appears under the photos. The figures above the photos are the oscillation shapes predicted by a numerical model. The time shown with the predictions is nondimensional. Robert Apfel (Yale University) used the Drop Physics Module on USML-2 to explore the effect of surfactants on liquid drops. Apfel's research of surfactants may contribute to improvements in a variety of industrial processes, including oil recovery and environmental cleanup.

  18. Templated Native Silk Smectic Gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2016-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  19. Templated native silk smectic gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2009-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  20. Templated Native Silk Smectic Gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2013-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  1. Biomineralization Guided by Paper Templates

    PubMed Central

    Camci-Unal, Gulden; Laromaine, Anna; Hong, Estrella; Derda, Ratmir; Whitesides, George M.

    2016-01-01

    This work demonstrates the fabrication of partially mineralized scaffolds fabricated in 3D shapes using paper by folding, and by supporting deposition of calcium phosphate by osteoblasts cultured in these scaffolds. This process generates centimeter-scale free-standing structures composed of paper supporting regions of calcium phosphate deposited by osteoblasts. This work is the first demonstration that paper can be used as a scaffold to induce template-guided mineralization by osteoblasts. Because paper has a porous structure, it allows transport of O2 and nutrients across its entire thickness. Paper supports a uniform distribution of cells upon seeding in hydrogel matrices, and allows growth, remodelling, and proliferation of cells. Scaffolds made of paper make it possible to construct 3D tissue models easily by tuning material properties such as thickness, porosity, and density of chemical functional groups. Paper offers a new approach to study mechanisms of biomineralization, and perhaps ultimately new techniques to guide or accelerate the repair of bone. PMID:27277575

  2. Recycling nanowire templates for multiplex templating synthesis: a green and sustainable strategy.

    PubMed

    Wang, Jin-Long; Liu, Jian-Wei; Lu, Bing-Zhang; Lu, Yi-Ruo; Ge, Jin; Wu, Zhen-Yu; Wang, Zhi-Hua; Arshad, Muhammad Nadeem; Yu, Shu-Hong

    2015-03-23

    Template-directed synthesis of nanostructures has emerged as one of the most important synthetic methodologies. A pristine nanotemplate is usually chemically transformed into other compounds and sacrificed after templating, or acts only as an inert physical template to support the new components. If a nanotemplate is costly, or toxic as waste, recycling it becomes highly desirable. Recently, ultrathin tellurium nanowires (TeNWs) have been demonstrated as versatile chemical or physical templates for the synthesis of a diverse family of uniform 1D nanostructures. However, ultrathin TeNWs used as templates are costly and are discarded as toxic ionic waste after chemical reaction or erosion. To solve this problem, we conceptually demonstrate that such a nanotemplate can be economically recycled from waste solutions and used repeatedly as a template.

  3. A color based face detection system using multiple templates.

    PubMed

    Wang, Tao; Bu, Jia-Jun; Chen, Chun

    2003-01-01

    A color-based system using multiple templates was developed and implemented for detecting human faces in color images. The algorithm consists of three image processing steps: it first gathers human skin color statistics, then separates skin regions from non-skin regions, and finally locates the frontal human face(s) within the skin regions. In the first step, 250 skin samples from persons of different ethnicities are used to determine the color distribution of human skin in chromatic color space, yielding a chroma chart of skin-color likelihoods. This chroma chart is used to generate, from the original color image, a gray scale image whose gray value at each pixel gives the likelihood that the pixel represents skin. The algorithm uses an adaptive thresholding process to find the optimal threshold value for separating skin regions from non-skin regions in the gray scale image. Finally, matching against multiple face templates determines whether a given skin region represents a frontal human face. Tests of the system with more than 400 color images showed a detection rate of 83%, which is better than most color-based face detection systems. The average speed for face detection is 0.8 second/image (400 x 300 pixels) on a Pentium 3 (800 MHz) PC.
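
    A minimal sketch of this kind of pipeline, assuming OpenCV and a precomputed 2-D chroma histogram (here taken over the Cr/Cb plane as a stand-in for the paper's chromatic color space; all names are illustrative, not from the paper):

        import cv2
        import numpy as np

        def skin_likelihood(image_bgr, skin_chroma_hist):
            # Back-project the chroma histogram to get a gray-scale map whose value
            # at each pixel is the likelihood that the pixel represents skin.
            ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
            return cv2.calcBackProject([ycrcb], [1, 2], skin_chroma_hist, [0, 256, 0, 256], 1)

        def looks_like_face(gray_region, face_templates, threshold=0.6):
            # Score a candidate skin region against several face templates with
            # normalized cross-correlation; accept it if any template matches well.
            scores = []
            for tpl in face_templates:
                tpl = cv2.resize(tpl, (gray_region.shape[1], gray_region.shape[0]))
                scores.append(cv2.matchTemplate(gray_region, tpl, cv2.TM_CCOEFF_NORMED).max())
            return max(scores) >= threshold

    An adaptive threshold on the likelihood map (not shown) would separate candidate skin regions before the template-matching step.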

  4. Amesos2 Templated Direct Sparse Solver Package

    2011-05-24

    Amesos2 is a templated direct sparse solver package. Amesos2 provides interfaces to direct sparse solvers, rather than providing native solver capabilities. Amesos2 is a derivative work of the Trilinos package Amesos.

  5. Template-based prediction of protein function.

    PubMed

    Petrey, Donald; Chen, T Scott; Deng, Lei; Garzon, Jose Ignacio; Hwang, Howook; Lasso, Gorka; Lee, Hunjoong; Silkov, Antonina; Honig, Barry

    2015-06-01

    We discuss recent approaches for structure-based protein function annotation. We focus on template-based methods where the function of a query protein is deduced from that of a template for which both the structure and function are known. We describe the different ways of identifying a template. These are typically based on sequence analysis but new methods based on purely structural similarity are also being developed that allow function annotation based on structural relationships that cannot be recognized by sequence. The growing number of available structures of known function, improved homology modeling techniques and new developments in the use of structure allow template-based methods to be applied on a proteome-wide scale and in many different biological contexts. This progress significantly expands the range of applicability of structural information in function annotation to a level that previously was only achievable by sequence comparison.

  6. Compiling high-level languages for configurable computers: applying lessons from heterogeneous processing

    NASA Astrophysics Data System (ADS)

    Weaver, Glen E.; Weems, Charles C.; McKinley, Kathryn S.

    1996-10-01

    Configurable systems offer increased performance by providing hardware that matches the computational structure of a problem. This hardware is currently programmed with CAD tools and explicit library calls. To attain widespread acceptance, configurable computing must become transparently accessible from high-level programming languages, but the changeable nature of the target hardware presents a major challenge to traditional compiler technology. A compiler for a configurable computer should optimize the use of functions embedded in hardware and schedule hardware reconfigurations. The hurdles to be overcome in achieving this capability are similar in some ways to those facing compilation for heterogeneous systems. For example, current traditional compilers have neither an interface to accept new primitive operators, nor a mechanism for applying optimizations to new operators. We are building a compiler for heterogeneous computing, called Scale, which replaces the traditional monolithic compiler architecture with a flexible framework. Scale has three main parts: translation director, compilation library, and a persistent store which holds our intermediate representation as well as other data structures. The translation director exploits the framework's flexibility by using architectural information to build a plan to direct each compilation. The translation library serves as a toolkit for use by the translation director. Our compiler intermediate representation, Score, facilitates the addition of new IR nodes by distinguishing features used in defining nodes from properties on which transformations depend. In this paper, we present an overview of the Scale architecture and its capabilities for dealing with heterogeneity, followed by a discussion of how those capabilities apply to problems in configurable computing. We then address aspects of configurable computing that are likely to require extensions to our approach and propose some extensions.

  7. The Template: A Way To Control

    ERIC Educational Resources Information Center

    Schueneman, Margot

    1977-01-01

    When beginning students first attempt coil pots, there is a tendency to rely on the design of the coil to cover up any irregularities in form. One of the ways to help students see whether or not a form is getting away from them is to use a template. Explains and demonstrates how the contour of the template helps to guide the placement of the…

  8. Lipid bilayers on nano-templates

    DOEpatents

    Noy, Aleksandr; Artyukhin, Alexander B.; Bakajin, Olgica; Stoeve, Pieter

    2009-08-04

    A lipid bilayer on a nano-template comprising a nanotube or nanowire and a lipid bilayer around the nanotube or nanowire. One embodiment provides a method of fabricating a lipid bilayer on a nano-template comprising the steps of providing a nanotube or nanowire and forming a lipid bilayer around the polymer cushion. One embodiment provides a protein pore in the lipid bilayer. In one embodiment the protein pore is sensitive to specific agents.

  9. The Earliest Matches

    PubMed Central

    Goren-Inbar, Naama; Freikman, Michael; Garfinkel, Yosef; Goring-Morris, Nigel A.; Grosman, Leore

    2012-01-01

    Cylindrical objects made usually of fired clay but sometimes of stone were found at the Yarmukian Pottery Neolithic sites of Sha‘ar HaGolan and Munhata (first half of the 8th millennium BP) in the Jordan Valley. Similar objects have been reported from other Near Eastern Pottery Neolithic sites. Most scholars have interpreted them as cultic objects in the shape of phalli, while others have referred to them in more general terms as “clay pestles,” “clay rods,” and “cylindrical clay objects.” Re-examination of these artifacts leads us to present a new interpretation of their function and to suggest a reconstruction of their technology and mode of use. We suggest that these objects were components of fire drills and consider them the earliest evidence of a complex technology of fire ignition, which incorporates the cylindrical objects in the role of matches. PMID:22870306

  10. A Template-Matching Method For Measuring Energy Depositions In TES Films

    NASA Astrophysics Data System (ADS)

    Shank, Benjamin; Yen, Jeffrey; Cabrera, Blas; Kreikebaum, John Mark; Moffatt, Robert; Redl, Peter; Young, Betty; Brink, Paul; Cherry, Matthew; Tomada, Astrid

    2014-03-01

    Transition edge sensors (TES) have a wide variety of applications in particle astrophysics for detecting incoming particles with high energy resolution. In TES design, the need for sufficient heat capacity to avoid saturation limits the ultimate energy resolution. Building on the TES model developed for SuperCDMS by Yen et al. for tungsten TESs deposited next to aluminum collection fins, we outline a time-domain non-linear optimal filter method for reconstructing energy depositions in TES films. This allows us to operate devices into their saturation region while taking into account changing noise performance and loss of energy collection. We show how this method has improved our understanding of quasiparticle diffusion and energy collection in our superconducting sensors.
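
    As a rough illustration of the amplitude-fitting idea behind such a filter (a linear, white-noise-by-default stand-in, not the authors' non-linear time-domain implementation; names are assumptions):

        import numpy as np

        def template_amplitude(trace, pulse_template, inv_noise_cov=None):
            # Least-squares amplitude of a pulse template in a measured TES trace,
            # weighted by the inverse noise covariance (identity = white noise).
            if inv_noise_cov is None:
                inv_noise_cov = np.eye(len(trace))
            numerator = pulse_template @ inv_noise_cov @ trace
            denominator = pulse_template @ inv_noise_cov @ pulse_template
            return numerator / denominator  # amplitude proportional to deposited energy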

  11. Synergy characterization for Enterococcus faecalis strains displaying moderately high-level gentamicin and streptomycin resistance.

    PubMed Central

    Bantar, C E; Micucci, M; Fernandez Canigia, L; Smayevsky, J; Bianchini, H M

    1993-01-01

    Synergy of 14 Enterococcus faecalis strains displaying moderately high-level aminoglycoside resistance (MICs, 500 and 256 to 1,000 micrograms/ml for gentamicin and streptomycin, respectively) was characterized by time-kill studies. All strains proved resistant to penicillin plus the respective aminoglycoside. Strains with moderately high-level aminoglycoside resistance should be considered to exhibit high-level resistance in severe infections. PMID:8349776

  13. Implementation of Accelerated Beam-Specific Matched-Filter-Based Optical Alignment

    SciTech Connect

    Awwal, A S; Rice, K L; Taha, T M

    2009-01-29

    Accurate automated alignment of laser beams in the National Ignition Facility (NIF) is essential for achieving the extreme temperatures and pressures required for inertial confinement fusion. The alignment achieved by the integrated control systems relies on algorithms processing video images to determine the position of the laser beam images in real-time. Alignment images that exhibit wide variations in beam quality require a matched-filter algorithm for position detection. One challenge in designing a matched-filter based algorithm is to construct a filter template that is resilient to variations in imaging conditions while guaranteeing accurate position determination. A second challenge is to process the image as fast as possible. This paper describes the development of a new analytical template that captures key recurring features present in the beam image to accurately estimate the beam position under good image quality conditions. Depending on the features present in a particular beam, the analytical template allows us to create a highly tailored template containing only those selected features. The second objective is achieved by exploiting the parallelism inherent in the algorithm to accelerate processing using parallel hardware that provides significant performance improvement over conventional processors. In particular, a Xilinx Virtex II Pro FPGA hardware implementation processing 32 templates provided a speed increase of about 253 times over an optimized software implementation running on a 2.0 GHz AMD Opteron core.
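
    A stripped-down version of matched-filter position detection, reduced to correlation peak finding (it omits the analytical template construction and the FPGA acceleration described above; names are illustrative):

        import numpy as np
        from scipy.signal import fftconvolve

        def beam_position(image, template):
            # Estimate the beam center as the peak of the cross-correlation between
            # an alignment image and a mean-subtracted template.
            img = image - image.mean()
            tpl = template - template.mean()
            corr = fftconvolve(img, tpl[::-1, ::-1], mode="same")  # correlation via flipped convolution
            row, col = np.unravel_index(np.argmax(corr), corr.shape)
            return col, row  # (x, y) pixel coordinates of the best match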

  14. Gravitational wave chirp search: Economization of post-Newtonian matched filter bank via cardinal interpolation

    NASA Astrophysics Data System (ADS)

    Croce, R. P.; Demma, Th.; Pierro, V.; Pinto, I. M.; Churches, D.; Sathyaprakash, B. S.

    2000-12-01

    The final inspiral phase in the evolution of a compact binary consisting of black holes and/or neutron stars is among the most probable events that a network of ground-based interferometric gravitational wave detectors is likely to observe. Gravitational radiation emitted during this phase will have to be dug out of noise by matched-filtering (correlating) the detector output with a bank of several 10^5 templates, making the computational resources required quite demanding, though not formidable. We propose an interpolation method for evaluating the correlation between template waveforms and the detector output, and show that the method is effective in substantially reducing the number of templates required. Indeed, the number of templates needed could be a factor of ~4 smaller than required by the usual approach, when the minimal overlap between the template bank and an arbitrary signal (the so-called minimal match) is 0.97. The method is amenable to easy implementation, and the various detector projects might benefit by adopting it to reduce the computational costs of inspiraling neutron star and black hole binary searches.
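
    The "minimal match" quoted above is the worst-case noise-weighted overlap between an arbitrary signal and its nearest template. A sketch of how that overlap is computed for two frequency-domain waveforms (uniform frequency grid and one-sided PSD assumed; a full match would also maximize over time and phase shifts):

        import numpy as np

        def overlap(h1, h2, psd, df):
            # Normalized noise-weighted inner product between two frequency-domain waveforms.
            def inner(a, b):
                return 4.0 * np.real(np.sum(a * np.conj(b) / psd)) * df
            return inner(h1, h2) / np.sqrt(inner(h1, h1) * inner(h2, h2))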

  15. Improvement of the performance of a classical matched filter by an independent component analysis preprocessing

    NASA Astrophysics Data System (ADS)

    de Rosa, R.; Forte, L. A.; Garufi, F.; Milano, L.

    2012-02-01

    Current gravitational wave searches for compact binary coalescences are done using a bank of templates (matched filters) on each running detector. Given a network of interferometers, we propose to use a denoising strategy based on an independent component analysis which considers two interferometers at a time, and then to apply a standard matched filter to the processed data. We show that this method lowers the noise level and increases the signal-to-noise ratio at the output of the matched filter.
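
    A minimal sketch of the two-detector ICA preprocessing step, using scikit-learn's FastICA as a stand-in for whichever ICA variant the authors used (array names are assumptions):

        import numpy as np
        from sklearn.decomposition import FastICA

        def separate_streams(strain_1, strain_2):
            # Unmix two interferometer time series into two independent components;
            # a standard matched filter would then be run on the component that best
            # preserves the astrophysical signal common to both detectors.
            mixed = np.column_stack([strain_1, strain_2])
            ica = FastICA(n_components=2, random_state=0)
            return ica.fit_transform(mixed).T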

  16. Multifractal signatures of complexity matching.

    PubMed

    Delignières, Didier; Almurad, Zainy M H; Roume, Clément; Marmelat, Vivien

    2016-10-01

    The complexity matching effect supposes that synchronization between complex systems could emerge from multiple interactions across multiple scales and has been hypothesized to underlie a number of daily-life situations. Complexity matching suggests that coupled systems tend to share similar scaling properties, and this phenomenon is revealed by a statistical matching between the scaling exponents that characterize the respective behaviors of both systems. However, some recent papers suggested that this statistical matching could originate from local adjustments or corrections, rather than from a genuine complexity matching between systems. In the present paper, we propose an analysis method based on correlation between multifractal spectra, considering different ranges of time scales. We analyze several datasets collected in various situations (bimanual coordination, interpersonal coordination, and walking in synchrony with a fractal metronome). Our results show that this method is able to distinguish between situations underlain by genuine statistical matching and situations where statistical matching results from local adjustments. PMID:27225255

  17. Conversion of Radiology Reporting Templates to the MRRT Standard.

    PubMed

    Kahn, Charles E; Genereaux, Brad; Langlotz, Curtis P

    2015-10-01

    In 2013, the Integrating the Healthcare Enterprise (IHE) Radiology workgroup developed the Management of Radiology Report Templates (MRRT) profile, which defines both the format of radiology reporting templates using an extension of Hypertext Markup Language version 5 (HTML5), and the transportation mechanism to query, retrieve, and store these templates. Of 200 English-language report templates published by the Radiological Society of North America (RSNA), initially encoded as text and in an XML schema language, 168 have been converted successfully into MRRT using a combination of automated processes and manual editing; conversion of the remaining 32 templates is in progress. The automated conversion process applied Extensible Stylesheet Language Transformation (XSLT) scripts, an XML parsing engine, and a Java servlet. The templates were validated for proper HTML5 and MRRT syntax using web-based services. The MRRT templates allow radiologists to share best-practice templates across organizations and have been uploaded to the template library to supersede the prior XML-format templates. By using MRRT transactions and MRRT-format templates, radiologists will be able to directly import and apply templates from the RSNA Report Template Library in their own MRRT-compatible vendor systems. The availability of MRRT-format reporting templates will stimulate adoption of the MRRT standard and is expected to advance the sharing and use of templates to improve the quality of radiology reports. PMID:25776768
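
    The automated conversion step relies on standard XSLT tooling; a minimal Python equivalent using lxml (the file paths and single-stylesheet assumption are illustrative, not the RSNA pipeline itself):

        from lxml import etree

        def convert_report_template(xml_template_path, xslt_path):
            # Apply an XSLT stylesheet to an XML-format report template and return
            # the transformed (HTML5/MRRT-style) document as a string.
            source = etree.parse(xml_template_path)
            transform = etree.XSLT(etree.parse(xslt_path))
            return str(transform(source))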

  19. Assessment, evaluation, and testing of technologies for environmental restoration, decontamination, and decommissioning and high level waste management. Progress report

    SciTech Connect

    Uzochukwu, G.A.

    1997-12-31

    Nuclear and commercial non-nuclear technologies that have the potential of meeting the environmental restoration, decontamination and decommissioning, and high-level waste management objectives are being assessed and evaluated. A detailed comparison of innovative technologies available will be performed to determine the safest and most economical technology for meeting these objectives. Information derived from this effort will be matched with the multi-objectives of the environmental restoration, decontamination and decommissioning, and high-level waste management effort to ensure that the best, most economical, and safest technologies are used in decision making at USDOE-SRS. Technology-related variables will be developed and the resulting data formatted and computerized for multimedia systems. The multimedia system will be made available to technology developers and evaluators to ensure that the safest and most economical technologies are developed for use at SRS and other DOE sites.

  20. Quantum image matching

    NASA Astrophysics Data System (ADS)

    Jiang, Nan; Dang, Yijie; Wang, Jian

    2016-09-01

    Quantum image processing (QIP) refers to quantum-based methods for speeding up image processing algorithms. Many QIP schemes claim that their efficiency is theoretically higher than that of the corresponding classical schemes. However, most of them do not consider the problem of measurement: measurement causes the quantum state to collapse, so after executing an algorithm once, users can measure the final state only once. If they want to recover the full result (the processed image), they must execute the algorithm and measure the final state many times to obtain all the pixel values. Once the measurement process is taken into account, whether such algorithms are really efficient needs to be reconsidered. In this paper, we address the measurement problem and give a quantum image matching algorithm. Unlike most QIP algorithms, our scheme concerns only one pixel (the target pixel) rather than the whole image. It modifies the pixel probabilities based on Grover's algorithm so that the target pixel is measured with higher probability, and the measurement step is executed only once. An example is given to explain the algorithm more vividly. Complexity analysis indicates that the quantum scheme's complexity is O(2^n), in contrast to the classical scheme's complexity of O(2^{2n+2m}), where m and n are integers related to the size of the images.

  1. Image Segmentation, Registration, Compression, and Matching

    NASA Technical Reports Server (NTRS)

    Yadegar, Jacob; Wei, Hai; Yadegar, Joseph; Ray, Nilanjan; Zabuawala, Sakina

    2011-01-01

    A novel computational framework was developed for 2D affine-invariant matching exploiting a parameter space. Named the affine invariant parameter space (AIPS), the technique can be applied to many image-processing and computer-vision problems, including image registration, template matching, and object tracking from image sequences. The AIPS is formed by the parameters in an affine combination of a set of feature points in the image plane. In cases where the entire image can be assumed to have undergone a single affine transformation, the new AIPS match metric and matching framework becomes very effective (compared with the state-of-the-art methods at the time of this reporting). No knowledge about scaling or any other transformation parameters needs to be known a priori to apply the AIPS framework. An automated suite of software tools has been created to provide accurate image segmentation (for data cleaning) and high-quality 2D image and 3D surface registration (for fusing multi-resolution terrain, image, and map data). These tools are capable of supporting existing GIS toolkits already in the marketplace, and will also be usable in a stand-alone fashion. The toolkit applies novel algorithmic approaches for image segmentation, feature extraction, and registration of 2D imagery and 3D surface data, which supports first-pass, batched, fully automatic feature extraction (for segmentation) and registration. A hierarchical and adaptive approach is taken for achieving automatic feature extraction, segmentation, and registration. Surface registration is the process of aligning two (or more) data sets to a common coordinate system, during which the transformation between their different coordinate systems is determined. Also developed here is a novel volumetric surface modeling and compression technique that provides both quality-guaranteed mesh surface approximations and compaction of the model sizes by efficiently coding the geometry and connectivity
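
    The AIPS idea rests on the fact that the coefficients of an affine combination of feature points are unchanged by an affine transform of the image. A minimal illustration of computing such coefficients for one point relative to three feature points (a sketch of the invariance property only, not the AIPS match metric):

        import numpy as np

        def affine_coefficients(point, basis_points):
            # Coefficients (a, b, c) with a + b + c = 1 expressing `point` as an affine
            # combination of three feature points; applying any affine transform to all
            # four points leaves these coefficients unchanged.
            basis = np.asarray(basis_points, dtype=float)      # shape (3, 2)
            system = np.vstack([basis.T, np.ones(3)])          # 3x3 linear system
            rhs = np.append(np.asarray(point, dtype=float), 1.0)
            return np.linalg.solve(system, rhs)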

  2. Toward automated forensic fracture matching of snap-off blade knives

    NASA Astrophysics Data System (ADS)

    Hollevoet, Davy; De Smet, Patrick; De Bock, Johan; Philips, Wilfried

    2008-08-01

    An interesting problem that has concerned forensic scientists for many years is their need for accurate, reliable and objective methods for performing fracture matching examinations. The aim of these fracture matching methods is to determine if two broken object halves can be matched together, e.g., when one half is recovered at a crime scene, while the other half is found in the possession of a suspect. In this paper we discuss the use of a commercial white-light profilometer system for obtaining 2D/3D image surface scans of multiple fractured objects. More specifically, we explain the use of this system for digitizing the fracture surface of multiple facing halves of several snap-off blade knives. Next, we discuss the realization and evaluation of several image processing methods for trying to match the obtained image scans corresponding to each of the broken-off blade elements used in our experiments. The algorithms that were tested and evaluated include: global template matching based on image correlation and multiple template matching based on local image correlation, using so-called "vote-map" computation. Although many avenues for further research still remain possible, we show that the second method yields very good results for allowing automated searching and matching of the imaged fracture surfaces for each of the examined blade elements.
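
    A much-simplified sketch of the second ("vote-map") approach, assuming OpenCV and two 2-D scans of the facing fracture surfaces; the window size, step, and names are arbitrary choices rather than the authors' parameters:

        import cv2
        import numpy as np
        from collections import Counter

        def dominant_displacement(surface_a, surface_b, win=32, step=16):
            # For many local windows of surface A, find where each correlates best in
            # surface B and vote for that relative displacement; a sharp peak in the
            # votes suggests a consistent fracture match.
            votes = Counter()
            for y in range(0, surface_a.shape[0] - win, step):
                for x in range(0, surface_a.shape[1] - win, step):
                    patch = surface_a[y:y + win, x:x + win]
                    corr = cv2.matchTemplate(surface_b, patch, cv2.TM_CCOEFF_NORMED)
                    by, bx = np.unravel_index(np.argmax(corr), corr.shape)
                    votes[(by - y, bx - x)] += 1
            return votes.most_common(1)[0]  # ((dy, dx), vote count)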

  3. Synthesis of RNA oligomers on heterogeneous templates

    NASA Technical Reports Server (NTRS)

    Ertem, G.; Ferris, J. P.

    1996-01-01

    The concept of an RNA world in the chemical origin of life is appealing, as nucleic acids are capable of both information storage and acting as templates that catalyse the synthesis of complementary molecules. Template-directed synthesis has been demonstrated for homogeneous oligonucleotides that, like natural nucleic acids, have 3',5' linkages between the nucleotide monomers. But it seems likely that prebiotic routes to RNA-like molecules would have produced heterogeneous molecules with various kinds of phosphodiester linkages and both linear and cyclic nucleotide chains. Here we show that such heterogeneity need be no obstacle to the templating of complementary molecules. Specifically, we show that heterogeneous oligocytidylates, formed by the montmorillonite clay-catalysed condensation of activated monomers, can serve as templates for the synthesis of oligoguanylates. Furthermore, we show that oligocytidylates that are exclusively 2',5'-linked can also direct synthesis of oligoguanylates. Such heterogeneous templating reactions could have increased the diversity of the pool of protonucleic acids from which life ultimately emerged.

  4. Automated Template Quantification for DNA Sequencing Facilities

    PubMed Central

    Ivanetich, Kathryn M.; Yan, Wilson; Wunderlich, Kathleen M.; Weston, Jennifer; Walkup, Ward G.; Simeon, Christian

    2005-01-01

    The quantification of plasmid DNA by the PicoGreen dye binding assay has been automated, and the effect of quantification of user-submitted templates on DNA sequence quality in a core laboratory has been assessed. The protocol pipets, mixes and reads standards, blanks and up to 88 unknowns, generates a standard curve, and calculates template concentrations. For pUC19 replicates at five concentrations, coefficients of variance were 0.1, and percent errors were from 1% to 7% (n = 198). Standard curves with pUC19 DNA were nonlinear over the 1 to 1733 ng/μL concentration range required to assay the majority (98.7%) of user-submitted templates. Over 35,000 templates have been quantified using the protocol. For 1350 user-submitted plasmids, 87% deviated by ≥ 20% from the requested concentration (500 ng/μL). Based on data from 418 sequencing reactions, quantification of user-submitted templates was shown to significantly improve DNA sequence quality. The protocol is applicable to all types of double-stranded DNA, is unaffected by primer (1 pmol/μL), and is user modifiable. The protocol takes 30 min, saves 1 h of technical time, and costs approximately $0.20 per unknown. PMID:16461949
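
    A compact sketch of the quantification arithmetic (fit a standard curve on the standards, then invert it for the unknowns); the low-order polynomial is a placeholder for whatever curve model the automated protocol actually applies:

        import numpy as np

        def template_concentrations(std_conc, std_fluorescence, unknown_fluorescence, degree=2):
            # Fit concentration as a function of PicoGreen fluorescence on the standards,
            # then evaluate the fit at each unknown well's signal (result in ng/uL).
            fit = np.polyfit(std_fluorescence, std_conc, degree)
            return np.polyval(fit, np.asarray(unknown_fluorescence))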

  5. Manually segmented template library for 8-year-old pediatric brain MRI data with 16 subcortical structures

    PubMed Central

    Garg, Amanmeet; Wong, Darren; Popuri, Karteek; Poskitt, Kenneth J.; Fitzpatrick, Kevin; Bjornson, Bruce; Grunau, Ruth E.; Beg, Mirza Faisal

    2014-01-01

    Abstract. Manual segmentation of anatomy in brain MRI data taken to be the closest to the “gold standard” in quality is often used in automated registration-based segmentation paradigms for transfer of template labels onto the unlabeled MRI images. This study presents a library of template data with 16 subcortical structures in the central brain area which were manually labeled for MRI data from 22 children (8 male, mean age = 8±0.6 years). The lateral ventricle, thalamus, caudate, putamen, hippocampus, cerebellum, third ventricle, fourth ventricle, brainstem, and corpus callosum were segmented by two expert raters. Cross-validation experiments with randomized template subset selection were conducted to test for their ability to accurately segment MRI data under an automated segmentation pipeline. A high value of the dice similarity coefficient (0.86±0.06, min=0.74, max=0.96) and small Hausdorff distance (3.33±4.24, min=0.63, max=25.24) of the automated segmentation against the manual labels was obtained on this template library data. Additionally, comparison with segmentation obtained from adult templates showed significant improvement in accuracy with the use of an age-matched library in this cohort. A manually delineated pediatric template library such as the one described here could provide a useful benchmark for testing segmentation algorithms. PMID:26158067
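
    The two accuracy measures quoted above are standard; for reference, a minimal implementation assuming binary label volumes and surface point sets (names are illustrative):

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def dice_coefficient(label_a, label_b):
            # Dice similarity coefficient between two binary segmentations.
            a, b = label_a.astype(bool), label_b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def hausdorff_distance(points_a, points_b):
            # Symmetric Hausdorff distance between two point sets (e.g. surface voxels).
            return max(directed_hausdorff(points_a, points_b)[0],
                       directed_hausdorff(points_b, points_a)[0])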

  7. Hi-LAB: A New Measure of Aptitude for High-Level Language Proficiency

    ERIC Educational Resources Information Center

    Linck, Jared A.; Hughes, Meredith M.; Campbell, Susan G.; Silbert, Noah H.; Tare, Medha; Jackson, Scott R.; Smith, Benjamin K.; Bunting, Michael F.; Doughty, Catherine J.

    2013-01-01

    Few adult second language (L2) learners successfully attain high-level proficiency. Although decades of research on beginning to intermediate stages of L2 learning have identified a number of predictors of the rate of acquisition, little research has examined factors relevant to predicting very high levels of L2 proficiency. The current study,…

  8. Alternatives Generation and Analysis for Heat Removal from High Level Waste Tanks

    SciTech Connect

    WILLIS, W.L.

    2000-06-15

    This document addresses the preferred combination of design and operational configurations to provide heat removal from high-level waste tanks during Phase 1 waste feed delivery to prevent the waste temperature from exceeding tank safety requirement limits. An interim decision for the preferred method to remove the heat from the high-level waste tanks during waste feed delivery operations is presented herein.

  9. Characteristics Data Base: Programmer's guide to the High-Level Waste Data Base

    SciTech Connect

    Jones, K.E. ); Salmon, R. )

    1990-08-01

    The High-Level Waste Data Base is a menu-driven PC data base developed as part of OCRWM's technical data base on the characteristics of potential repository wastes, which also includes spent fuel and other materials. This programmer's guide completes the documentation for the High-Level Waste Data Base, the user's guide having been published previously. 3 figs.

  10. 40 CFR 1065.725 - High-level ethanol-gasoline blends.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ethanol used for blending must be either denatured ethanol meeting the specifications in 40 CFR 80.1610... 40 Protection of Environment 33 2014-07-01 2014-07-01 false High-level ethanol-gasoline blends... Calibration Standards § 1065.725 High-level ethanol-gasoline blends. For testing vehicles capable of...

  11. 78 FR 70281 - United States-Mexico High Level Economic Dialogue

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... United States-Mexico High Level Economic Dialogue AGENCY: International Trade Administration, Commerce... Register notice on the United States-Mexico High Level Economic Dialogue. DATES: The agency must receive... largest export market and third largest overall trading partner. The United States, in turn, is...

  12. DOE Matching Grant Program

    SciTech Connect

    Tsoukalas, L.

    2002-12-31

    Funding was used to support a portion of the School's nuclear engineering educational activities: upgrades of teaching labs, student support to attend professional conferences, and salary support for graduate students. The US Department of Energy (DOE) funded the Purdue University School of Nuclear Engineering during the five academic years covered in this report, starting in 1996-97 and ending in 2000-2001. The total funding received from DOE under the grant is $416K. In the 1990s, nuclear engineering education in the US experienced a significant slowdown: student enrollment, research support, the number of degrees at all levels (BS, MS, and PhD), the number of accredited programs, and university research and training reactors all declined to alarmingly low levels. Several departments closed, while others were amalgamated with other academic units (Mechanical Engineering, Chemical Engineering, etc.). The School of Nuclear Engineering at Purdue University faced a major challenge when, in the mid-1990s, total undergraduate enrollment for the sophomore, junior, and senior years dropped into the low 30s. The DOE Matching Grant program greatly strengthened Purdue's commitment to the nuclear engineering discipline and helped to dramatically improve undergraduate and graduate enrollment, attract new faculty, and raise the School of Nuclear Engineering's standing within the University and on the national scene (undergraduate enrollment has tripled and stands at an all-time high of over 90 students; total enrollment currently exceeds 110 students). In this final technical report we outline and summarize how the grant was expended at Purdue University.

  13. A multiple-template approach to protein threading.

    PubMed

    Peng, Jian; Xu, Jinbo

    2011-06-01

    Most threading methods predict the structure of a protein using only a single template. Due to the increasing number of solved structures, a protein without a solved structure is very likely to have more than one similar template structure. Therefore, a natural question to ask is whether we can improve modeling accuracy using multiple templates. This article describes a new multiple-template threading method to answer this question. At the heart of this multiple-template threading method is a novel probabilistic-consistency algorithm that can accurately align a single protein sequence simultaneously to multiple templates. Experimental results indicate that our multiple-template method can improve pairwise sequence-template alignment accuracy and generate models with better quality than single-template models even if they are built from the best single templates (P-value <10(-6)), while many popular multiple sequence/structure alignment tools fail to do so. The underlying reason is that our probabilistic-consistency algorithm can generate accurate multiple sequence/template alignments. In other words, without an accurate multiple sequence/template alignment, the modeling accuracy cannot be improved by simply using multiple templates to increase alignment coverage. Blindly tested on the CASP9 targets with more than one good template structure, our method outperforms all other CASP9 servers except two (Zhang-Server and QUARK of the same group). Our probabilistic-consistency algorithm can possibly be extended to align multiple protein/RNA sequences and structures.

  14. Transcriptional template activity of covalently modified DNA.

    PubMed

    Tolwińska-Stańczyk, Z; Wilmańska, D; Studzian, K; Gniazdowski, M

    1997-03-01

    The transcriptional template activities of covalently modified DNAs are compared. 8-Methoxypsoralen (MOP), 3,4'-dimethyl-8-methoxypsoralen (DMMOP) and benzopsoralen (BP), which form covalent complexes with DNA upon UV irradiation and show a preference for pyrimidines, mostly thymines, differ in their cross-linking potency: MOP and DMMOP form both monoadducts and diadducts, while BP forms no cross-links. Nitracrine (NC) forms covalent complexes with DNA upon reductive activation with dithiothreitol, with a preference for purines and low cross-linking potency. Semilogarithmic plots of the relative template activity against the number of drug molecules covalently bound per 10(3) DNA nucleotides fit regression lines corresponding to one-hit inactivation characteristics. The number of drug molecules that decreases RNA synthesis to 37% ranges from 0.25 to 1.26 depending on the template used and the base preference, but no dependence on cross-linking potency was found. PMID:9067423
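
    The one-hit characteristic means the relative template activity decays as exp(-r/r37), so the semilogarithmic plot is a straight line and r37 (the binding level that reduces activity to 37%, i.e. 1/e) is minus the reciprocal of its slope. A small illustrative fit (variable names are assumptions):

        import numpy as np

        def r37_from_one_hit(bound_per_1000_nt, relative_activity):
            # Fit ln(activity) = -r / r37 and return r37, the number of bound drug
            # molecules per 10^3 nucleotides that reduces template activity to 1/e.
            slope, _ = np.polyfit(bound_per_1000_nt, np.log(relative_activity), 1)
            return -1.0 / slope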

  15. Polyaniline nanowire synthesis templated by DNA

    NASA Astrophysics Data System (ADS)

    Nickels, Patrick; Dittmer, Wendy U.; Beyer, Stefan; Kotthaus, Jörg P.; Simmel, Friedrich C.

    2004-11-01

    DNA-templated polyaniline nanowires and networks are synthesized using three different methods. The resulting DNA/polyaniline hybrids are fully characterized using atomic force microscopy, UV-vis spectroscopy and current-voltage measurements. Oxidative polymerization of polyaniline at moderate pH values is accomplished using ammonium persulfate as an oxidant, or alternatively in an enzymatic oxidation by hydrogen peroxide using horseradish peroxidase, or by photo-oxidation using a ruthenium complex as photo-oxidant. Atomic force microscopy shows that all three methods lead to the preferential growth of polyaniline along DNA templates. With ammonium persulfate, polyaniline can be grown on DNA templates already immobilized on a surface. Current-voltage measurements are successfully conducted on DNA/polyaniline networks synthesized by the enzymatic method and the photo-oxidation method. The conductance is found to be consistent with values measured for undoped polyaniline films.

  16. Pattern matching is assessed in retinotopic coordinates.

    PubMed

    McKyton, Ayelet; Pertzov, Yoni; Zohary, Ehud

    2009-01-01

    We typically examine scenes by performing multiple saccades to different objects of interest within the image. Therefore, an extra-retinotopic representation, invariant to the changes in the retinal image caused by eye movements, might be useful for high-level visual processing. We investigate here, using a matching task, whether the representation of complex natural images is retinotopic or screen-based. Subjects observed two simultaneously presented images, made a saccadic eye movement to a new fixation point, and viewed a third image. Their task was to judge whether the third image was identical to one of the two earlier images or different. Identical images could appear either in the same retinotopic position, in the same screen position, or in totally different locations. Performance was best when the identical images appeared in the same retinotopic position and worst when they appeared in the opposite hemifield. Counter to commonplace intuition, no advantage was conferred by presenting the identical images in the same screen position. This, together with the sensitivity of performance to image translations of a few degrees, suggests that image matching, which can often be judged without overall recognition of the scene, is mostly determined by neuronal activity in earlier brain areas containing a strictly retinotopic representation and small receptive fields. PMID:20055552

  17. Exposure to Organic Solvents Used in Dry Cleaning Reduces Low and High Level Visual Function

    PubMed Central

    Jiménez Barbosa, Ingrid Astrid

    2015-01-01

    were also significantly higher and almost double that obtained from non-dry-cleaners. However, reaction time performance on both parallel and serial visual search was not different between dry cleaners and non-dry-cleaners. Conclusions Exposure to occupational levels of organic solvents is associated with neurotoxicity which is in turn associated with both low level deficits (such as the perception of contrast and discrimination of colour) and high level visual deficits such as the perception of global form and motion, but not visual search performance. The latter finding indicates that the deficits in visual function are unlikely to be due to changes in general cognitive performance. PMID:25933026

  18. Matching the Spectrometers on board ISO

    NASA Astrophysics Data System (ADS)

    Burgdorf, M.; Feuchtgruber, H.; Salama, A.; García-Lario, P.; Müller, T.; Lord, S.

    We report on the findings of the Spectral Matching Working Group, the main aim of which was to investigate discontinuities between SWS and LWS in complete ISO spectra from 2 - 200 μm. In order to check in a quantitative way the agreement between the two spectrometers, a software tool was developed which automatically selected observations made with SWS and LWS on the same coordinates and which calculated the ratio of the fluxes in the overlap region from the browser products. In this way all observations suitable for this cross-calibration exercise could be selected, provided that they were performed with standard Astronomical Observing Templates and covered the wavelength range that SWS and LWS have in common. 95% of those targets which were neither extended nor variable showed an agreement better than 20% between the two spectrometers. Several problems with the data from the instruments, like saturation effects, detector transients and discontinuities between the sub-spectra from different detectors, affect both spectrometers in a similar way and require special processing steps. We show, for some solar system objects, to which extent the spectra taken with ISO from the mid- to the far-infrared agree with theoretical models. Furthermore, we discuss for the example of Neptune how the combined information from both spectrometers can be used to put new constraints on models of objects that are possible calibration standards for future missions.
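
    A minimal version of the cross-calibration check described above: take the SWS and LWS spectra of the same target and compute the ratio of their fluxes in the wavelength band the two instruments share (the band edges below are only indicative placeholders):

        import numpy as np

        def overlap_flux_ratio(wave_sws, flux_sws, wave_lws, flux_lws, lo_um=43.0, hi_um=45.0):
            # Median SWS/LWS flux ratio in the common wavelength band; values near 1
            # indicate good agreement between the two spectrometers.
            sws = np.median(flux_sws[(wave_sws >= lo_um) & (wave_sws <= hi_um)])
            lws = np.median(flux_lws[(wave_lws >= lo_um) & (wave_lws <= hi_um)])
            return sws / lws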

  19. Biological doses with template distribution patterns

    SciTech Connect

    Harrop, R.; Haymond, H.R.; Nisar, A.; Syed, A.N.M.; Feder, B.H.; Neblett, D.L.

    1981-02-01

    Consideration of radiation dose rate effects emphasizes advantages of the template method for lateral distribution of multiple sources in treatment of laterally infiltrating gynecologic cancer, when compared to a conventional technique with colpostats. Biological doses in time dose fractionation (TDF), ret and reu units are calculated for the two treatment methods. With the template method the lateral dose (point B) is raised without significantly increasing the doses to the rectum and bladder; that is, the calculated biological doses at points A and B are more nearly equivalent, and the doses to the rectum and bladder are significantly lower than the dose to point B.

  20. Affordance Templates for Shared Robot Control

    NASA Technical Reports Server (NTRS)

    Hart, Stephen; Dinh, Paul; Hambuchen, Kim

    2014-01-01

    This paper introduces the Affordance Template framework used to supervise task behaviors on the NASA-JSC Valkyrie robot at the 2013 DARPA Robotics Challenge (DRC) Trials. This framework provides graphical interfaces to human supervisors that are adjustable based on the run-time environmental context (e.g., size, location, and shape of objects that the robot must interact with, etc.). Additional improvements, described below, inject degrees of autonomy into instantiations of affordance templates at run-time in order to enable efficient human supervision of the robot for accomplishing tasks.

  1. Preparation of porous lanthanum phosphate with templates

    SciTech Connect

    Onoda, Hiroaki; Ishima, Yuya; Takenaka, Atsushi; Tanaka, Isao

    2009-08-05

    Malonic acid, propionic acid, glycine, n-butylamine, and urea were added during the preparation of lanthanum phosphate from lanthanum nitrate and phosphoric acid solutions. All additives were incorporated into the lanthanum phosphate particles; additives with a basic site were incorporated into the precipitates more readily. The addition of templates improved the specific surface area of the lanthanum phosphate, and the amount of pores with radii smaller than 4 nm increased with the addition of templates. The remaining additives influenced the acidic properties of the lanthanum phosphate.

  2. Templates for Deposition of Microscopic Pointed Structures

    NASA Technical Reports Server (NTRS)

    Pugel, Diane E.

    2008-01-01

    Templates for fabricating sharply pointed microscopic peaks arranged in nearly regular planar arrays can be fabricated by a relatively inexpensive technique that has recently been demonstrated. Depending on the intended application, a semiconducting, insulating, or metallic film could be deposited on such a template by sputtering, thermal evaporation, pulsed laser deposition, or any other suitable conventional deposition technique. Pointed structures fabricated by use of these techniques may prove useful as photocathodes or field emitters in plasma television screens. Selected peaks could be removed from such structures and used individually as scanning tips in atomic force microscopy or mechanical surface profiling.

  3. Vertically aligned zinc oxide nanowires electrodeposited within porous polycarbonate templates for vibrational energy harvesting.

    PubMed

    Boughey, Francesca L; Davies, Timothy; Datta, Anuja; Whiter, Richard A; Sahonta, Suman-Lata; Kar-Narayan, Sohini

    2016-07-15

    A piezoelectric nanogenerator has been fabricated using a simple, fast and scalable template-assisted electrodeposition process, by which vertically aligned zinc oxide (ZnO) nanowires were directly grown within a nanoporous polycarbonate (PC) template. The nanowires, having average diameter 184 nm and length 12 μm, are polycrystalline and have a preferred orientation of the [100] axis parallel to the long axis. The output power density of a nanogenerator fabricated from the as-grown ZnO nanowires still embedded within the PC template was found to be 151 ± 25 mW m(-3) at an impedance-matched load, when subjected to a low-level periodic (5 Hz) impacting force akin to gentle finger tapping. An energy conversion efficiency of ∼4.2% was evaluated for the electrodeposited ZnO nanowires, and the ZnO-PC composite nanogenerator was found to maintain good energy harvesting performance through 24 h of continuous fatigue testing. This is particularly significant given that ZnO-based nanostructures typically suffer from mechanical and/or environmental degradation that otherwise limits their applicability in vibrational energy harvesting. Our template-assisted synthesis of ZnO nanowires embedded within a protective polymer matrix through a single growth process is thus attractive for the fabrication of low-cost, robust and stable nanogenerators.

  5. [Exploration of research approaches of Chinese medicine's pharmacology based on "imprinting templates" (medical element) of supramolecules].

    PubMed

    He, Fu-yuan; He, Hong; Deng, Kai-wen; Zhou, Yi-qun; Shi, Ji-lian; Liu, Wen-long; Yang, Yan-tao; Tang, Yu

    2015-11-01

    Building on a previous publication concerning the impact of Chinese medicine (CM) theories on supramolecular chemistry, this paper analyzes the natural origins of Chinese medicine and explains the effect of the "Qi chromatography" reaction on "imprinting templates" in the supramolecular host of the human body treated with Chinese medicine. The aim is to show that CM's "medical element" properties arise from the "imprinting template" behavior that occurs generally in natural supramolecules, and that CM pharmacology therefore follows its own approaches, different from those of Western pharmacology. Guided by CM theories, CM pharmacology addresses the "Qi chromatography" relations between CM ingredient groups and the meridian zang-fu viscera. Supramolecular chemistry plays a pervasive role in generating the macro-regularities and the characteristic behavior of the "Qi chromatography" impulse, owing to the matching action of the various ingredients on meridian zang-fu viscera with similar "imprinting templates". Whereas Western pharmacology is constructed entirely on classical single-molecule chemical bonds, CM pharmacology is built up through "imprinting templates" acting as multiple weak bonds within a "supramolecular society". CM pharmacology is thus a supramolecular pharmacology of the "molecular society", resting on the basis of Western pharmacology and employing a dual research approach: mathematical-physical quantitative representation at the macroscopic level and qualitative analysis at the microscopic level.

  7. Emulating the Visual Receptive Field Properties of MST Neurons with a Template Model of Heading Estimation

    NASA Technical Reports Server (NTRS)

    Perrone, John A.; Stone, Leland S.

    1997-01-01

    We have previously proposed a computational neural-network model by which the complex patterns of retinal image motion generated during locomotion (optic flow) can be processed by specialized detectors acting as templates for specific instances of self-motion. The detectors in this template model respond to global optic flow by sampling image motion over a large portion of the visual field through networks of local motion sensors with properties similar to neurons found in the middle temporal (MT) area of primate extrastriate visual cortex. The model detectors were designed to extract self-translation (heading), self-rotation, as well as the scene layout (relative distances) ahead of a moving observer, and are arranged in cortical-like heading maps to perform this function. Heading estimation from optic flow has been postulated by some to be implemented within the medial superior temporal (MST) area. Others have questioned whether MST neurons can fulfill this role because some of their receptive-field properties appear inconsistent with a role in heading estimation. To resolve this issue, we systematically compared MST single-unit responses with the outputs of model detectors under matched stimulus conditions. We found that the basic physiological properties of MST neurons can be explained by the template model. We conclude that MST neurons are well suited to support heading estimation and that the template model provides an explicit set of testable hypotheses which can guide future exploration of MST and adjacent areas within the primate superior temporal sulcus.
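
    A drastically reduced illustration of a single heading-template detector: each local motion sample is compared with the radial expansion pattern expected for a candidate heading (focus of expansion), and the template's response is the average agreement. This is a sketch of the idea only, not the published model; all names are assumptions:

        import numpy as np

        def heading_template_response(flow_vectors, sample_positions, candidate_heading):
            # Mean cosine similarity between measured local flow vectors and the radial
            # flow field expected for pure translation toward `candidate_heading`.
            expected = sample_positions - candidate_heading
            expected /= np.linalg.norm(expected, axis=1, keepdims=True) + 1e-9
            measured = flow_vectors / (np.linalg.norm(flow_vectors, axis=1, keepdims=True) + 1e-9)
            return float(np.mean(np.sum(expected * measured, axis=1)))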

  8. A template bank to search for gravitational waves from inspiralling compact binaries: I. Physical models

    NASA Astrophysics Data System (ADS)

    Babak, S.; Balasubramanian, R.; Churches, D.; Cokelaer, T.; Sathyaprakash, B. S.

    2006-09-01

    Gravitational waves from coalescing compact binaries are searched for using the matched filtering technique. As the model waveform depends on a number of parameters, it is necessary to filter the data through a template bank covering the astrophysically interesting region of the parameter space. The choice of templates is defined by the maximum allowed drop in signal-to-noise ratio due to the discreteness of the template bank. In this paper we describe the template-bank algorithm that was used in the analysis of data from the Laser Interferometer Gravitational Wave Observatory (LIGO) and GEO 600 detectors to search for signals from binaries consisting of non-spinning compact objects. Using Monte Carlo simulations, we study the efficiency of the bank and show that its performance is satisfactory for the design sensitivity curves of ground-based interferometric gravitational wave detectors GEO 600, initial LIGO, advanced LIGO and Virgo. The bank is efficient in searching for various compact binaries such as binary primordial black holes, binary neutron stars, binary black holes, as well as a mixed binary consisting of a non-spinning black hole and a neutron star.
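
    The matched-filtering step mentioned above can be sketched with a toy time-domain example under a white-noise assumption; the actual LIGO/GEO 600 analysis works in the frequency domain with the detector noise power spectral density, so the function below is only an illustration and all names are hypothetical.

    import numpy as np

    def matched_filter_snr(data, template):
        """Slide a unit-norm template across the data and return a crude
        signal-to-noise ratio time series (white-noise assumption)."""
        template = template / np.linalg.norm(template)
        correlation = np.correlate(data, template, mode="valid")
        return correlation / np.std(data)

    # Toy example: a short chirp-like template injected into Gaussian noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 1024)
    template = np.sin(2 * np.pi * (20 + 30 * t) * t) * np.hanning(t.size)
    data = rng.normal(0.0, 1.0, 8192)
    data[3000:3000 + template.size] += 4 * template   # injected signal
    snr = matched_filter_snr(data, template)
    print(int(np.argmax(snr)))   # peak near sample 3000, where the signal was injected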

  9. The biogeochemical cycle of the adsorbed template. II - Selective adsorption of mononucleotides on adsorbed polynucleotide templates

    NASA Technical Reports Server (NTRS)

    Lazard, Daniel; Lahav, Noam; Orenberg, James B.

    1988-01-01

    Experimental results are presented for the verification of the specific interaction step of the 'adsorbed template' biogeochemical cycle, a simple model for a primitive prebiotic replication system. The experimental system consisted of gypsum as the mineral to which an oligonucleotide template attaches (Poly-C or Poly-U) and 5′-AMP, 5′-GMP, 5′-CMP and 5′-UMP as the interacting biomonomers. When Poly-C or Poly-U were used as adsorbed templates, 5′-GMP and 5′-AMP, respectively, were observed to be the most strongly adsorbed species.

  10. Rapid muscle force capacity changes after soccer match play.

    PubMed

    Thorlund, J B; Aagaard, P; Madsen, K

    2009-04-01

    The present study examined the fatigue development in muscle mechanical properties with emphasis on rapid force characteristics and neuromuscular activity in response to high-level soccer match play. Young elite soccer players (n=9) were tested before (CON) and after (POST) a soccer match for maximal knee extensor and flexor isometric strength (MVC) and contractile rate of force development (RFD) with synchronous surface electromyography (EMG) recording. Furthermore, maximal vertical jump power and related parameters were assessed. Isometric knee extensor and flexor MVC decreased approximately 10% (p< or =0.01) along with a right-shift in the moment-time curve. RFD decreased approximately 9% (0-200 ms) for the knee flexors while there was a tendency towards reduced RFD during knee extension following soccer match play. Similar reductions were observed for some but not all selected EMG parameters during the MVC and RFD tests. Mechanical jump parameters generally remained unchanged post match play. This study is the first to examine the fatigue-induced changes in rapid muscle force production (RFD) induced by soccer match play. The observed decrement in rapid muscle force capacity is likely to have a negative impact on performance in explosive playing actions (i.e. accelerations, kicking, sprinting) that are typically involved in soccer match play.

  11. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  12. Intensity of tennis match play

    PubMed Central

    Fernandez, J; Mendez‐Villanueva, A; Pluim, B M

    2006-01-01

    This review focuses on the characteristics of tennis players during match play and provides a greater insight into the energy demands of tennis. A tennis match often lasts longer than an hour and in some cases more than five hours. During a match there is a combination of periods of maximal or near maximal work and longer periods of moderate and low intensity activity. Match intensity varies considerably depending on the players' level, style, and sex. It is also influenced by factors such as court surface and ball type. This has important implications for the training of tennis players, which should resemble match intensity and include interval training with appropriate work to rest ratios. PMID:16632566

  13. Organic or organometallic template mediated clay synthesis

    DOEpatents

    Gregar, Kathleen C.; Winans, Randall E.; Botto, Robert E.

    1994-01-01

    A method for incorporating diverse varieties of intercalants or templates directly during hydrothermal synthesis of clays such as hectorite or montmorillonite-type layer-silicate clays. For a hectorite layer-silicate clay, refluxing a gel of silica sol, magnesium hydroxide sol and lithium fluoride for two days in the presence of an organic or organometallic intercalant or template results in crystalline products containing either (a) organic dye molecules such as ethyl violet and methyl green, (b) dye molecules such as alcian blue that are based on a Cu(II)-phthalocyanine complex, or (c) transition metal complexes such as Ru(II)phenanthroline and Co(III)sepulchrate or (d) water-soluble porphyrins and metalloporphyrins. Montmorillonite-type clays are made by the method taught by U.S. Pat. No. 3,887,454 issued to Hickson, Jun. 13, 1975; however, a variety of intercalants or templates may be introduced. The intercalants or templates should have (i) water-solubility, (ii) positive charge, and (iii) thermal stability under moderately basic (pH 9-10) aqueous reflux conditions or hydrothermal pressurized conditions for the montmorillonite-type clays.

  14. Organic or organometallic template mediated clay synthesis

    DOEpatents

    Gregar, K.C.; Winans, R.E.; Botto, R.E.

    1994-05-03

    A method is described for incorporating diverse varieties of intercalates or templates directly during hydrothermal synthesis of clays such as hectorite or montmorillonite-type layer-silicate clays. For a hectorite layer-silicate clay, refluxing a gel of silica sol, magnesium hydroxide sol and lithium fluoride for two days in the presence of an organic or organometallic intercalate or template results in crystalline products containing either (a) organic dye molecules such as ethyl violet and methyl green, (b) dye molecules such as alcian blue that are based on a Cu(II)-phthalocyanine complex, or (c) transition metal complexes such as Ru(II)phenanthroline and Co(III)sepulchrate or (d) water-soluble porphyrins and metalloporphyrins. Montmorillonite-type clays are made by the method taught by U.S. Pat. No. 3,887,454 issued to Hickson, Jun. 13, 1975; however, a variety of intercalates or templates may be introduced. The intercalates or templates should have (i) water-solubility, (ii) positive charge, and (iii) thermal stability under moderately basic (pH 9-10) aqueous reflux conditions or hydrothermal pressurized conditions for the montmorillonite-type clays. 22 figures.

  15. A lightweight approach for biometric template protection

    NASA Astrophysics Data System (ADS)

    Al-Assam, Hisham; Sellahewa, Harin; Jassim, Sabah

    2009-05-01

    Privacy and security are vital concerns for practical biometric systems. The concept of cancelable or revocable biometrics has been proposed as a solution for biometric template security. Revocable biometric means that biometric templates are no longer fixed over time and could be revoked in the same way as lost or stolen credit cards are. In this paper, we describe a novel and an efficient approach to biometric template protection that meets the revocability property. This scheme can be incorporated into any biometric verification scheme while maintaining, if not improving, the accuracy of the original biometric system. However, we shall demonstrate the result of applying such transforms on face biometric templates and compare the efficiency of our approach with that of the well-known random projection techniques. We shall also present the results of experimental work on recognition accuracy before and after applying the proposed transform on feature vectors that are generated by wavelet transforms. These results are based on experiments conducted on a number of well-known face image databases, e.g. Yale and ORL databases.
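
    The random projection technique that the paper compares against can be sketched as follows; this is the generic textbook construction, not the authors' proposed transform, and the key handling, dimensions, and threshold below are illustrative assumptions.

    import numpy as np

    def make_projection(dim_in, dim_out, user_key):
        """Build a user-specific random projection matrix from a secret key.
        Re-issuing a new key revokes the old (cancelable) template."""
        rng = np.random.default_rng(user_key)
        return rng.normal(0.0, 1.0 / np.sqrt(dim_out), size=(dim_out, dim_in))

    def protect(feature_vec, user_key, dim_out=64):
        """Map a biometric feature vector to a protected template."""
        return make_projection(feature_vec.size, dim_out, user_key) @ feature_vec

    def verify(probe_vec, stored_template, user_key, threshold=0.9):
        """Compare a freshly captured probe against the stored template."""
        probe = protect(probe_vec, user_key, stored_template.size)
        cosine = probe @ stored_template / (np.linalg.norm(probe) * np.linalg.norm(stored_template))
        return cosine >= threshold

    rng = np.random.default_rng(1)
    enrolled = rng.normal(size=128)                    # e.g. a wavelet feature vector
    stored = protect(enrolled, user_key=42)
    probe = enrolled + 0.05 * rng.normal(size=128)     # slightly noisy re-capture
    print(verify(probe, stored, user_key=42))          # True for a genuine user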

  16. Organic or organometallic template mediated clay synthesis

    SciTech Connect

    Gregar, K.C.; Winans, R.E.; Botto, R.E.

    1992-12-31

    A method is given for incorporating diverse varieties of intercalants or templates directly during hydrothermal synthesis of clays such as hectorite or montmorillonite-type layer-silicate clays. For a hectorite layer-silicate clay, refluxing a gel of silica sol, magnesium hydroxide sol and LiF for 2 days with an organic or organometallic intercalant or template results in crystalline products containing either (a) organic dye molecules such as ethyl violet and methyl green, (b) dye molecules such as alcian blue based on a Cu(II)-phthalocyanine complex, or (c) transition metal complexes such as Ru(II)phenanthroline and Co(III)sepulchrate or (d) water-soluble porphyrins and metalloporphyrins. Montmorillonite-type clays are made by the method taught by US patent No. 3,887,454 issued to Hickson, June 13, 1975; however, a variety of intercalants or templates may be introduced. The intercalants or templates should have water-solubility, positive charge, and thermal stability under moderately basic (pH 9-10) aqueous reflux conditions or hydrothermal pressurized conditions for the montmorillonite-type clays.

  17. Electrochemical synthesis on single cells as templates.

    PubMed

    Tam, Jasper; Salgado, Shehan; Miltenburg, Mark; Maheshwari, Vivek

    2013-10-01

    The cell surface is made electrochemically active by interfacing with graphene sheets. The electrical and thermal properties of graphene allow the control of cell surface potential for electrochemical synthesis. Using this approach, radially projecting ZnO nanorods are templated on the surface of single cells. The reported single-cell photosensor shows superior performance compared with similar devices made on planar surfaces.

  18. Stacked subsea templates accelerate deepwater development

    SciTech Connect

    Ramsey, J.F.; Blincow, R.M.; Pickard, R.D. )

    1991-10-21

    This paper reports on a deepwater project that can be brought on-line more quickly because of stackable drilling and production templates. Historically, one of the primary barriers to the economic development of deepwater reserves has been the long lead time from discovery to first production. Typically, production facilities must be built and often installed before development wells are drilled. The use of three-slot drilling templates allows development drilling to proceed while the production templates, Christmas trees, flow lines, and production platform are constructed. Thus, the time from initial investment to first revenue is reduced. Enserch Exploration Inc., along with partners Petrofina Delaware Inc. and AGIP Petroleum, is using a piggy-back or transportable stacked template system to develop deepwater gas reserves in Mississippi Canyon Block 441, approximately 50 miles south of Grand Isle, La. The discovery is located in 1,410-1,520 ft of water. The Louisiana Offshore Oil Port (LOOP) safety fairway running north to south covers the eastern three fourths of Mississippi Canyon Block 441 and rules out surface production facilities over the well locations.

  19. Engineering topochemical polymerizations using block copolymer templates.

    PubMed

    Zhu, Liangliang; Tran, Helen; Beyer, Frederick L; Walck, Scott D; Li, Xin; Agren, Hans; Killops, Kato L; Campos, Luis M

    2014-09-24

    With the aim to achieve rapid and efficient topochemical polymerizations in the solid state, via solution-based processing of thin films, we report the integration of a diphenyldiacetylene monomer and a poly(styrene-b-acrylic acid) block copolymer template for the generation of supramolecular architectural photopolymerizable materials. This strategy takes advantage of non-covalent interactions to template a topochemical photopolymerization that yields a polydiphenyldiacetylene (PDPDA) derivative. In thin films, it was found that hierarchical self-assembly of the diacetylene monomers by microphase segregation of the block copolymer template enhances the topochemical photopolymerization, which is complete within a 20 s exposure to UV light. Moreover, UV-active cross-linkable groups were incorporated within the block copolymer template to create micropatterns of PDPDA by photolithography, in the same step as the polymerization reaction. The materials design and processing may find potential uses in the microfabrication of sensors and other important areas that benefit from solution-based processing of flexible conjugated materials. PMID:25208609

  20. A Method for Non-Rigid Face Alignment via Combining Local and Holistic Matching

    PubMed Central

    Yang, Yang; Chen, Zhuo

    2016-01-01

    We propose a method for non-rigid face alignment which needs only a single template, such as using a person’s smile face to match his surprise face. First, in order to be robust to outliers caused by complex geometric deformations, a new local feature matching method called K Patch Pairs (K-PP) is proposed. Specifically, inspired by the state-of-the-art similarity measure used in template matching, K-PP finds the mutual K nearest neighbors between two images. A weight matrix is then presented to balance the similarity and the number of local matches. Second, we propose a modified Lucas-Kanade algorithm combined with a local matching constraint to solve the non-rigid face alignment, so that a holistic face representation and local features can be jointly modeled in the objective function. Both the flexibility of local matching and the robustness of holistic fitting are included in our method. Furthermore, we show that the optimization problem can be efficiently solved by the inverse compositional algorithm. Comparison results with conventional methods demonstrate our superiority in terms of both accuracy and robustness. PMID:27494319
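
    The mutual-nearest-neighbour idea behind K-PP can be sketched as below; the actual K-PP operates on image patches with a particular similarity measure and a weight matrix, whereas this toy version uses plain Euclidean distances between descriptor vectors, and all names are hypothetical.

    import numpy as np

    def mutual_knn_matches(desc_a, desc_b, k=5):
        """Return index pairs (i, j) where descriptor i of image A and descriptor j
        of image B are each within the other's k nearest neighbours.

        desc_a: (n, d) patch descriptors of the template image
        desc_b: (m, d) patch descriptors of the target image
        """
        # pairwise Euclidean distances between all descriptors
        d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
        knn_a = np.argsort(d, axis=1)[:, :k]      # for each row i: k closest j's
        knn_b = np.argsort(d, axis=0)[:k, :].T    # for each column j: k closest i's
        matches = []
        for i in range(d.shape[0]):
            for j in knn_a[i]:
                if i in knn_b[j]:                 # mutual agreement
                    matches.append((i, int(j)))
        return matches

    # Toy example with 3-d "descriptors".
    a = np.array([[0., 0, 0], [1, 1, 1], [5, 5, 5]])
    b = np.array([[0.1, 0, 0], [5.2, 5, 5], [9, 9, 9]])
    print(mutual_knn_matches(a, b, k=1))          # expected: [(0, 0), (2, 1)]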

  1. Object matching using a locally affine invariant and linear programming techniques.

    PubMed

    Li, Hongsheng; Huang, Xiaolei; He, Lei

    2013-02-01

    In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints should be able to be exactly or approximately reformulated into a linear form. This is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires far fewer auxiliary variables than other linear programming-based methods do. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be solved easily by least squares. Errors of reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
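
    The core computation described in the abstract, solving for the affine-combination weights of a template point with respect to its neighbours and measuring the reconstruction error, can be sketched as follows; the linear-programming formulation built on top of these weights is not shown, and the helper names are hypothetical.

    import numpy as np

    def affine_weights(point, neighbors):
        """Solve for weights w such that point ≈ sum_k w_k * neighbors[k]
        with sum(w) = 1 (an affine combination), in the least-squares sense."""
        n = neighbors.shape[0]
        # Stack the coordinate equations with the affine constraint sum(w) = 1.
        A = np.vstack([neighbors.T, np.ones((1, n))])       # (d+1, n)
        b = np.concatenate([point, [1.0]])                  # (d+1,)
        w, *_ = np.linalg.lstsq(A, b, rcond=None)
        return w

    def reconstruction_error(point, neighbors, w):
        """Error of reproducing 'point' from the (possibly deformed) neighbours."""
        return np.linalg.norm(neighbors.T @ w - point)

    # A template point that lies exactly in the affine hull of its neighbours.
    neighbors = np.array([[0., 0], [2, 0], [0, 2]])
    point = np.array([1.0, 1.0])
    w = affine_weights(point, neighbors)
    print(np.round(w, 3))                                   # e.g. [0. 0.5 0.5]
    print(reconstruction_error(point, neighbors, w))        # ~0 for an exact fit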

  2. OCCURRENCE OF HIGH-LEVEL AMINOGLYCOSIDE RESISTANCE IN ENVIRONMENTAL ISOLATES OF ENTEROCOCCI

    EPA Science Inventory

    High-level resistance to aminoglycosides was observed in environmental isolates of enterococci. Various aquatic habitats, including agricultural runoff, creeks, rivers, wastewater, and wells, were analyzed. Strains of Enterococcus faecalis, E. faecium, E. gallinarum, and other Ent...

  3. Assessment of high-level waste form conformance with proposed regulatory and repository criteria

    SciTech Connect

    Gordon, D E; Gray, P L; Jennings, A S; Permar, P H

    1982-04-01

    Federal regulatory criteria for geologic disposal of high-level waste are under development. Also, interim performance specifications for high-level waste forms in geologic isolation are being developed within the Federal program responsible for repository selection and operation. Two high-level waste forms, borosilicate glass and crystalline ceramic, have been selected as candidate immobilization forms for the Defense Waste Processing Facility (DWPF) which is to immobilize high-level wastes at the Savannah River Plant (SRP). An assessment of how these two waste forms conform with the proposed regulatory criteria and repository specifications was performed. Both forms were determined to be in conformance with postulated rules for radionuclide releases and radiation exposures throughout the entire waste disposal system, as well as with proposed repository operation requirements.

  4. Demonstration of Small Tank Tetraphenylborate Precipitation Process Using Savannah River Site High Level Waste

    SciTech Connect

    Peters, T.B.

    2001-09-10

    This report details the experimental effort to demonstrate the continuous precipitation of cesium from Savannah River Site High Level Waste using sodium tetraphenylborate. In addition, the experiments examined the removal of strontium and various actinides through addition of monosodium titanate.

  5. Visual cluster analysis and pattern recognition template and methods

    SciTech Connect

    Osbourn, G.C.; Martinez, R.F.

    1993-12-31

    This invention comprises a method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to and improve pattern recognition techniques.

  6. Computational templates for introductory nuclear science using mathcad

    NASA Astrophysics Data System (ADS)

    Sarantites, D. G.; Sobotka, L. G.

    2013-01-01

    Computational templates used to teach an introductory course in nuclear chemistry and physics at Washington University in St. Louis are presented in brief. The templates cover both basic and applied topics.

  7. Visual cluster analysis and pattern recognition template and methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    1999-01-01

    A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable and improve pattern recognition techniques.

  8. Visual cluster analysis and pattern recognition template and methods

    DOEpatents

    Osbourn, G.C.; Martinez, R.F.

    1999-05-04

    A method of clustering using a novel template to define a region of influence is disclosed. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable and improve pattern recognition techniques. 30 figs.

  9. The Demographics of High-level and Recreational Athletes With Intra-articular Hip Injury

    PubMed Central

    Tibor, Lisa M.; Bedi, Asheesh; Oltean, Hanna N.; Gagnier, Joel Joseph; Kelly, Bryan T.

    2013-01-01

    Objectives: The pathoanatomy that causes femoroacetabular impingement (FAI) is common, but not everyone develops hip pain or arthrosis. Symptomatic FAI is likely due to a combination of anatomy and biomechanical demands, including sports participation. The primary purpose of this study was to determine demographic differences between high-level and recreational athletes undergoing hip arthroscopy. The secondary purpose of this study was to look at the demographics of high-level athletes grouped by sports with similar mechanical demands on the hip. We hypothesize that high-level and recreational athletes will differ by age, gender, and need for bilateral surgery. We also predict that demographics for high-level athletes will differ for sports with unique demands for hip kinematics. Methods: Using our hip preservation center registry, a retrospective review of prospectively collected data from patients undergoing hip arthroscopy between March 2010 and April 2012 was performed. Athletes were categorized as high-level (high school, collegiate, Olympic/international, or professional) or recreational. Subgroup analysis was performed for high-level athletes, looking at differences between contact, rotational running, impingement, overhead/asymmetric, endurance, and flexibility sports. Results: 288 high-level athletes and 334 recreational athletes were included. Being a high-level athlete was associated with younger age (average age 20.2 vs 33.0, OR=0.69, P<0.001) and male gender (61.5% vs 53.6%, OR=1.75, P=0.03). The percentage of high-level athletes undergoing bilateral surgery was higher than for recreational athletes (28.4% vs 15.9%); however, this association was found to be confounded by age in multivariate analysis. The most common sports for high-level athletes were soccer, hockey, and football. Athletes participating in rotational running sports were significantly younger than flexibility, contact, or impingement athletes. Similarly, endurance athletes were

  10. Immobilized high-level waste interim storage alternatives generation and analysis and decision report

    SciTech Connect

    CALMUS, R.B.

    1999-05-18

    This report presents a study of alternative system architectures to provide onsite interim storage for the immobilized high-level waste produced by the Tank Waste Remediation System (TWRS) privatization vendor. It examines the contract and program changes that have occurred and evaluates their impacts on the baseline immobilized high-level waste (IHLW) interim storage strategy. In addition, this report documents the recommended initial interim storage architecture and implementation path forward.

  11. Evaluation of high-level clouds in cloud resolving model simulations with ARM and KWAJEX observations

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Muhlbauer, Andreas; Ackerman, Thomas

    2015-12-01

    In this study, we evaluate high-level clouds in a cloud resolving model during two convective cases, ARM9707 and KWAJEX. The simulated joint histograms of cloud occurrence and radar reflectivity compare well with cloud radar and satellite observations when using a two-moment microphysics scheme. However, simulations performed with a single-moment microphysical scheme exhibit low biases of approximately 20 dB. During convective events, the two-moment microphysics overestimates the amount of high-level cloud, while the one-moment microphysics precipitates too readily and underestimates the amount and height of high-level cloud. For ARM9707, persistent large positive biases in high-level cloud are found, which are not sensitive to changes in ice particle fall velocity and ice nuclei number concentration in the two-moment microphysics. These biases are caused by biases in large-scale forcing and maintained by the periodic lateral boundary conditions. The combined effects include significant biases in high-level cloud amount and radiation, and high sensitivity of cloud amount to the nudging time scale in both convective cases. The high sensitivity of high-level cloud amount to the thermodynamic nudging time scale suggests that thermodynamic nudging can be a powerful "tuning" parameter for the simulated cloud and radiation but should be applied with caution. The role of the periodic lateral boundary conditions in reinforcing the biases in cloud and radiation suggests that reducing the uncertainty in the large-scale forcing at high levels is important for similar convective cases and has far-reaching implications for simulating high-level clouds in super-parameterized global climate models such as the multiscale modeling framework.

  12. Templated Self-Assembly of Nano-Structures

    SciTech Connect

    Suo, Zhigang

    2013-04-29

    This project will identify and model mechanisms that template the self-assembly of nanostructures. We focus on a class of systems involving a two-phase monolayer of molecules adsorbed on a solid surface. At a suitably elevated temperature, the molecules diffuse on the surface to reduce the combined free energy of mixing, phase boundary, elastic field, and electrostatic field. With no template, the phases may form a pattern of stripes or disks. The feature size is on the order of 1-100 nm, selected to compromise the phase boundary energy and the long-range elastic or electrostatic interaction. Both experimental observations and our theoretical simulations have shown that the pattern resembles a periodic lattice, but has abundant imperfections. To form a perfect periodic pattern, or a designed aperiodic pattern, one must introduce a template to guide the assembly. For example, a coarse-scale pattern, lithographically defined on the substrate, will guide the assembly of the nanoscale pattern. As another example, if the molecules on the substrate surface carry strong electric dipoles, a charged object, placed in the space above the monolayer, will guide the assembly of the molecular dipoles. In particular, the charged object can be a mask with a designed nanoscale topographic pattern. A serial process (e.g., e-beam lithography) is necessary to make the mask, but the pattern transfer to the molecules on the substrate is a parallel process. The technique is potentially a high throughput, low cost process to pattern a monolayer. The monolayer pattern itself may serve as a template to fabricate a functional structure. This project will model fundamental aspects of these processes, including thermodynamics and kinetics of self-assembly, templated self-assembly, and self-assembly on unconventional substrates. It is envisioned that the theory will not only explain the available experimental observations, but also motivate new experiments.

  13. Optically Transparent Wood from a Nanoporous Cellulosic Template: Combining Functional and Structural Performance.

    PubMed

    Li, Yuanyuan; Fu, Qiliang; Yu, Shun; Yan, Min; Berglund, Lars

    2016-04-11

    Optically transparent wood (TW) with transmittance as high as 85% and haze of 71% was obtained using a delignified nanoporous wood template. The template was prepared by removing the light-absorbing lignin component, creating nanoporosity in the wood cell wall. Transparent wood was prepared by successful impregnation of lumen and the nanoscale cellulose fiber network in the cell wall with refractive-index-matched prepolymerized methyl methacrylate (MMA). During the process, the hierarchical wood structure was preserved. Optical properties of TW are tunable by changing the cellulose volume fraction. The synergy between wood and PMMA was observed for mechanical properties. Lightweight and strong transparent wood is a potential candidate for lightweight low-cost, light-transmitting buildings and transparent solar cell windows.

  14. The shape of the face template: geometric distortions of faces and their detection in natural scenes.

    PubMed

    Pongakkasira, Kaewmart; Bindemann, Markus

    2015-04-01

    Human face detection might be driven by skin-coloured face-shaped templates. To explore this idea, this study compared the detection of faces for which the natural height-to-width ratios were preserved with distorted faces that were stretched vertically or horizontally. The impact of stretching on detection performance was not obvious when faces were equated to their unstretched counterparts in terms of their height or width dimension (Experiment 1). However, stretching impaired detection when the original and distorted faces were matched for their surface area (Experiment 2), and this was found with both vertically and horizontally stretched faces (Experiment 3). This effect was evident in accuracy, response times, and also observers' eye movements to faces. These findings demonstrate that height-to-width ratios are an important component of the cognitive template for face detection. The results also highlight important differences between face detection and face recognition.

  15. [Field matching in breast irradiation].

    PubMed

    Varga, Sz; Takácsi Nagy, L; Pesznyák, Cs; Lövey, K; Polgár, I

    2001-01-01

    INTRODUCTION: In this paper the authors have combined different irradiation techniques for the breast and adjacent supraclavicular lymph nodes. The aim was to reduce inhomogeneity in the match-line. METHODS: The CadPlan 6.1.5 three-dimensional treatment planning system was applied in this study for a CT-based plan using standard medial and lateral wedged tangential breast portals with the adjacent supraclavicular field. The isocenter is placed at depth on the match-line, where asymmetric jaws are used to produce non-divergent field edges. The tangential fields are shaped using a multi-leaf collimator (MLC), by following the curvature of the thorax. In this way the cranial vertical match plane is maintained without using the breast board. The prescribed dose was 50 Gy at the isocentre. RESULTS: The calculated dose distributions were evaluated in three dimensions in the match region of the supraclavicular field and the two opposing tangential fields. This method produces a more uniform dose distribution in the target volume and in the match-line. Set-up is fast and is done without the need for table rotation or vertical cephalad blocks. The average dose to the ipsilateral lung is reduced by approximately 10% using the IMRT (intensity modulated radiotherapy) technique compared with the conventional technique. Furthermore, this new technique offers the possibility of improving the field match between the tangential fields and the parasternal field, while maintaining the field match between the tangential fields and the axillary and supraclavicular fields.

  16. Lead-free electric matches.

    SciTech Connect

    Son, S. F.; Hiskey, M. A.; Naud, D.; Busse, J. R.; Asay, B. W.

    2002-01-01

    Electric matches are used in pyrotechnics to initiate devices electrically rather than by burning fuses. Fuses have the disadvantage of burning with a long delay before igniting a pyrotechnic device, while electric matches can instantaneously fire a device at a user's command. In addition, electric matches can be fired remotely at a safe distance. Unfortunately, most current commercial electric match compositions contain lead in the form of thiocyanate, nitroresorcinate or tetroxide, which, when burned, produces lead-containing smoke. This lead pollutant presents environmental exposure problems to cast, crew, and audience. The reason that these lead-containing compounds are used in electric match compositions is that these mixtures have the required thermal stability, yet are simultaneously able to be initiated reliably by a very small thermal stimulus. A possible alternative to lead-containing compounds is nanoscale thermite materials (metastable intermolecular composites or MIC). These superthermite materials can be formulated to be extremely spark sensitive, with a tunable reaction rate, and yield high-temperature products. We have formulated and manufactured lead-free electric matches based on nanoscale Al/MoO3 mixtures. We have determined that these matches fire reliably and consistently ignite a sample of black powder. Initial safety, ageing and performance results are presented in this paper.

  17. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    highly sheared regions and in the 3D turbulent regions. The high level of correlation between the estimated error and the actual error indicates that this new approach can be utilized to directly infer the measurement uncertainty from PIV data. A procedure is shown where the results of the error estimation are employed to minimize the measurement uncertainty by selecting the optimal interrogation window size.

  18. MATCHING IN INFORMAL FINANCIAL INSTITUTIONS

    PubMed Central

    Eeckhout, Jan; Munshi, Kaivan

    2013-01-01

    This paper analyzes an informal financial institution that brings heterogeneous agents together in groups. We analyze decentralized matching into these groups, and the equilibrium composition of participants that consequently arises. We find that participants sort remarkably well across the competing groups, and that they re-sort immediately following an unexpected exogenous regulatory change. These findings suggest that the competitive matching model might have applicability and bite in other settings where matching is an important equilibrium phenomenon. (JEL: O12, O17, G20, D40) PMID:24027491

  19. Immunophenotype Discovery, Hierarchical Organization, and Template-Based Classification of Flow Cytometry Samples

    PubMed Central

    Azad, Ariful; Rajwa, Bartek; Pothen, Alex

    2016-01-01

    We describe algorithms for discovering immunophenotypes from large collections of flow cytometry samples and using them to organize the samples into a hierarchy based on phenotypic similarity. The hierarchical organization is helpful for effective and robust cytometry data mining, including the creation of collections of cell populations’ characteristic of different classes of samples, robust classification, and anomaly detection. We summarize a set of samples belonging to a biological class or category with a statistically derived template for the class. Whereas individual samples are represented in terms of their cell populations (clusters), a template consists of generic meta-populations (a group of homogeneous cell populations obtained from the samples in a class) that describe key phenotypes shared among all those samples. We organize an FC data collection in a hierarchical data structure that supports the identification of immunophenotypes relevant to clinical diagnosis. A robust template-based classification scheme is also developed, but our primary focus is in the discovery of phenotypic signatures and inter-sample relationships in an FC data collection. This collective analysis approach is more efficient and robust since templates describe phenotypic signatures common to cell populations in several samples while ignoring noise and small sample-specific variations. We have applied the template-based scheme to analyze several datasets, including one representing a healthy immune system and one of acute myeloid leukemia (AML) samples. The last task is challenging due to the phenotypic heterogeneity of the several subtypes of AML. However, we identified thirteen immunophenotypes corresponding to subtypes of AML and were able to distinguish acute promyelocytic leukemia (APL) samples with the markers provided. Clinically, this is helpful since APL has a different treatment regimen from other subtypes of AML. Core algorithms used in our data analysis are

  1. Immunophenotype Discovery, Hierarchical Organization, and Template-Based Classification of Flow Cytometry Samples.

    PubMed

    Azad, Ariful; Rajwa, Bartek; Pothen, Alex

    2016-01-01

    We describe algorithms for discovering immunophenotypes from large collections of flow cytometry samples and using them to organize the samples into a hierarchy based on phenotypic similarity. The hierarchical organization is helpful for effective and robust cytometry data mining, including the creation of collections of cell populations' characteristic of different classes of samples, robust classification, and anomaly detection. We summarize a set of samples belonging to a biological class or category with a statistically derived template for the class. Whereas individual samples are represented in terms of their cell populations (clusters), a template consists of generic meta-populations (a group of homogeneous cell populations obtained from the samples in a class) that describe key phenotypes shared among all those samples. We organize an FC data collection in a hierarchical data structure that supports the identification of immunophenotypes relevant to clinical diagnosis. A robust template-based classification scheme is also developed, but our primary focus is in the discovery of phenotypic signatures and inter-sample relationships in an FC data collection. This collective analysis approach is more efficient and robust since templates describe phenotypic signatures common to cell populations in several samples while ignoring noise and small sample-specific variations. We have applied the template-based scheme to analyze several datasets, including one representing a healthy immune system and one of acute myeloid leukemia (AML) samples. The last task is challenging due to the phenotypic heterogeneity of the several subtypes of AML. However, we identified thirteen immunophenotypes corresponding to subtypes of AML and were able to distinguish acute promyelocytic leukemia (APL) samples with the markers provided. Clinically, this is helpful since APL has a different treatment regimen from other subtypes of AML. Core algorithms used in our data analysis are

  3. New Effective Multithreaded Matching Algorithms

    SciTech Connect

    Manne, Fredrik; Halappanavar, Mahantesh

    2014-05-19

    Matching is an important combinatorial problem with a number of applications in areas such as community detection, sparse linear algebra, and network alignment. Since computing optimal matchings can be very time consuming, several fast approximation algorithms, both sequential and parallel, have been suggested. Common to the algorithms giving the best solutions is that they tend to be sequential by nature, while algorithms more suitable for parallel computation give solutions of lower quality. We present a new simple 1/2-approximation algorithm for the weighted matching problem. This algorithm is both faster than any other suggested sequential 1/2-approximation algorithm on almost all inputs and also scales better than previous multithreaded algorithms. We further extend this to a general scalable multithreaded algorithm that computes matchings of weight comparable with the best sequential algorithms. The performance of the suggested algorithms is documented through extensive experiments on different multithreaded architectures.
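
    For reference, the classical sequential 1/2-approximation for weighted matching (sort edges by weight, greedily keep an edge if both endpoints are still free) can be written in a few lines; the algorithm in the paper is a faster, parallelizable variant, so this sketch is only the standard baseline it improves on.

    def greedy_matching(edges):
        """Classic 1/2-approximation for maximum weight matching.

        edges: iterable of (u, v, weight) tuples.
        Picks edges in order of decreasing weight, skipping any edge whose
        endpoint is already matched. The result has at least half the weight
        of an optimal matching.
        """
        matched = set()
        matching = []
        for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
            if u not in matched and v not in matched:
                matching.append((u, v, w))
                matched.update((u, v))
        return matching

    edges = [("a", "b", 5), ("b", "c", 4), ("c", "d", 3), ("a", "d", 1)]
    print(greedy_matching(edges))   # [('a', 'b', 5), ('c', 'd', 3)]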

  4. Matching Faces Against the Clock

    PubMed Central

    Fysh, Matthew; Cross, Katie; Watts, Rebecca

    2016-01-01

    This study examined the effect of time pressure on face-matching accuracy. Across two experiments, observers decided whether pairs of faces depict one person or different people. Time pressure was exerted via two additional displays, which were constantly updated to inform observers on whether they were on track to meet or miss a time target. In this paradigm, faces were matched under increasing or decreasing (Experiment 1) and constant time pressure (Experiment 2), which varied from 10 to 2 seconds. In both experiments, time pressure reduced accuracy, but the point at which this declined varied from 8 to 2 seconds. A separate match response bias was found, which developed over the course of the experiments. These results indicate that both time pressure and the repetitive nature of face matching are detrimental to performance. PMID:27757219

  5. An Efficient Pattern Matching Algorithm

    NASA Astrophysics Data System (ADS)

    Sleit, Azzam; Almobaideen, Wesam; Baarah, Aladdin H.; Abusitta, Adel H.

    In this study, we present an efficient algorithm for pattern matching based on the combination of hashing and search trees. The proposed solution is classified as an offline algorithm. Although this study demonstrates the merits of the technique for text matching, it can be utilized for various forms of digital data including images, audio and video. The performance superiority of the proposed solution is validated analytically and experimentally.
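
    The abstract does not spell out how hashing and search trees are combined, so the sketch below shows only the familiar rolling-hash (Rabin-Karp style) ingredient: candidate positions are found by hash comparison and then verified character by character. The modulus and base are arbitrary choices.

    def rabin_karp(text, pattern, base=256, mod=1_000_000_007):
        """Find all occurrences of 'pattern' in 'text' with a rolling hash.

        A hash of the pattern is compared against the hash of each text window;
        only windows with matching hashes are verified character by character.
        """
        n, m = len(text), len(pattern)
        if m == 0 or m > n:
            return []
        p_hash = 0
        w_hash = 0
        high = pow(base, m - 1, mod)               # base^(m-1), to drop the old char
        for i in range(m):
            p_hash = (p_hash * base + ord(pattern[i])) % mod
            w_hash = (w_hash * base + ord(text[i])) % mod
        hits = []
        for i in range(n - m + 1):
            if w_hash == p_hash and text[i:i + m] == pattern:
                hits.append(i)
            if i < n - m:                          # roll the window one character
                w_hash = ((w_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
        return hits

    print(rabin_karp("template matching and matched templates", "match"))  # [9, 22]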

  6. Fabrication of high fidelity, high index three-dimensional photonic crystals using a templating approach

    NASA Astrophysics Data System (ADS)

    Xu, Yongan

    In this dissertation, we demonstrate the fabrication of high-fidelity 3D photonic crystals through polymer template fabrication, backfilling, and template removal to obtain high-index inverted inorganic photonic crystals (PCs). Along the way, we study the photoresist chemistry to minimize shrinkage, backfilling strategies for complete infiltration, and template removal at high and low temperatures to minimize crack formation. Using multibeam interference lithography (MBIL), we fabricate diamond-like photonic structures from the commercially available photoresist SU-8, epoxy-functionalized polyhedral oligomeric silsesquioxane (POSS), and narrowly distributed poly(glycidyl methacrylate)s (PGMA). The 3D structure from PGMA shows the lowest shrinkage in the [111] direction, 18%, compared to those fabricated from the SU-8 (41%) and POSS (48%) materials under the same conditions. Fabricating a photonic crystal with a large and complete photonic bandgap often requires backfilling of high-index inorganic materials into a 3D polymer template. We have studied different backfilling methods to create three different types of high-index inorganic 3D photonic crystals. Using SU-8 structures as templates, we systematically study the electrodeposition technique to create inverted 3D titania crystals. We find that the 3D SU-8 template is completely infiltrated with titania sol-gel through a two-stage process: a conformal coating of a thin film forms at the early electrodeposition stage (< 60 min), followed by bottom-up deposition. After calcination at 500°C to remove the polymer template, inverted 3D titania crystals are obtained. The optical properties of the 3D photonic crystals characterized at various processing steps match the simulated photonic bandgaps (PBGs) and the SEM observations, further supporting complete filling by the wet chemistry. Since both PGMA and SU-8 decompose at temperatures above 400°C, leading to the formation of defects and cracks

  7. A string matching computer-assisted system for dolphin photoidentification.

    PubMed

    Araabi, B N; Kehtarnavaz, N; McKinney, T; Hillman, G; Würsig, B

    2000-01-01

    This paper presents a syntactic/semantic string representation scheme as well as a string matching method as part of a computer-assisted system to identify dolphins from photographs of their dorsal fins. A low-level string representation is constructed from the curvature function of a dolphin's fin trailing edge, consisting of positive and negative curvature primitives. A high-level string representation is then built over the low-level string by merging appropriate groupings of primitives in order to obtain a representation that is less sensitive to curvature fluctuations or noise. A family of syntactic/semantic distance measures between two strings is introduced. A composite distance measure is then defined and used as a dissimilarity measure for database search, highlighting both the syntax (structure or sequence) and semantic (attribute or feature) differences. The syntax consists of an ordered sequence of significant protrusions and intrusions on the edge, while the semantics consist of seven attributes extracted from the edge and its curvature function. The matching results are reported for a database of 624 images corresponding to 164 individual dolphins. The identification results indicate that the developed string matching method performs better than the previous matching methods, including dorsal ratio, curvature, and curve matching. The developed computer-assisted system can help marine mammalogists in their identification of dolphins, since it allows them to examine only a handful of candidate images instead of manually searching the entire database, as is currently done. PMID:11144987
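
    A heavily simplified version of the composite syntactic/semantic distance might look like the sketch below, which blends an edit distance over protrusion/intrusion primitives with a mean difference of normalized edge attributes; the paper's actual primitives, seven attributes, and weighting are not reproduced here, and the symbols and weights are assumptions.

    import numpy as np

    def edit_distance(s1, s2):
        """Standard Levenshtein distance between two primitive sequences."""
        d = np.zeros((len(s1) + 1, len(s2) + 1), dtype=int)
        d[:, 0] = np.arange(len(s1) + 1)
        d[0, :] = np.arange(len(s2) + 1)
        for i in range(1, len(s1) + 1):
            for j in range(1, len(s2) + 1):
                cost = 0 if s1[i - 1] == s2[j - 1] else 1
                d[i, j] = min(d[i - 1, j] + 1, d[i, j - 1] + 1, d[i - 1, j - 1] + cost)
        return d[-1, -1]

    def composite_distance(syntax1, attrs1, syntax2, attrs2, alpha=0.5):
        """Blend a syntactic term (sequence of protrusion/intrusion symbols)
        with a semantic term (normalized difference of edge attributes)."""
        syn = edit_distance(syntax1, syntax2) / max(len(syntax1), len(syntax2))
        sem = np.mean(np.abs(np.asarray(attrs1) - np.asarray(attrs2)))
        return alpha * syn + (1 - alpha) * sem

    # '+' = protrusion, '-' = intrusion; attributes are already scaled to [0, 1].
    print(composite_distance("+-+-", [0.2, 0.7, 0.1], "+--+", [0.25, 0.65, 0.1]))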

  8. Low-level awareness accompanies "unconscious" high-level processing during continuous flash suppression.

    PubMed

    Gelbard-Sagiv, Hagar; Faivre, Nathan; Mudrik, Liad; Koch, Christof

    2016-01-01

    The scope and limits of unconscious processing are a matter of ongoing debate. Lately, continuous flash suppression (CFS), a technique for suppressing visual stimuli, has been widely used to demonstrate surprisingly high-level processing of invisible stimuli. Yet, recent studies showed that CFS might actually allow low-level features of the stimulus to escape suppression and be consciously perceived. The influence of such low-level awareness on high-level processing might easily go unnoticed, as studies usually only probe the visibility of the feature of interest, and not that of lower-level features. For instance, face identity is held to be processed unconsciously since subjects who fail to judge the identity of suppressed faces still show identity priming effects. Here we challenge these results, showing that such high-level priming effects are indeed induced by faces whose identity is invisible, but critically, only when a lower-level feature, such as color or location, is visible. No evidence for identity processing was found when subjects had no conscious access to any feature of the suppressed face. These results suggest that high-level processing of an image might be enabled by, or co-occur with, conscious access to some of its low-level features, even when these features are not relevant to the processed dimension. Accordingly, they call for further investigation of lower-level awareness during CFS, and reevaluation of other unconscious high-level processing findings. PMID:26756173

  9. Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore

    SciTech Connect

    Liao, C; Quinlan, D J; Willcock, J J; Panas, T

    2008-12-12

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-base computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.

  10. SU-F-BRD-10: Lung IMRT Planning Using Standardized Beam Bouquet Templates

    SciTech Connect

    Yuan, L; Wu, Q J.; Yin, F; Ge, Y

    2014-06-15

    Purpose: We investigate the feasibility of choosing from a small set of standardized templates of beam bouquets (i.e., entire beam configuration settings) for lung IMRT planning to improve planning efficiency and quality consistency, and also to facilitate automated planning. Methods: A set of beam bouquet templates is determined by learning from the beam angle settings in 60 clinical lung IMRT plans. A k-medoids cluster analysis method is used to classify the beam angle configuration into clusters. The value of the average silhouette width is used to determine the ideal number of clusters. The beam arrangements in each medoid of the resulting clusters are taken as the standardized beam bouquet for the cluster, with the corresponding case taken as the reference case. The resulting set of beam bouquet templates was used to re-plan 20 cases randomly selected from the database and the dosimetric quality of the plans was evaluated against the corresponding clinical plans by a paired t-test. The template for each test case was manually selected by a planner based on the match between the test and reference cases. Results: The dosimetric parameters (mean±S.D. in percentage of prescription dose) of the plans using 6 beam bouquet templates and those of the clinical plans, respectively, and the p-values (in parenthesis) are: lung Dmean: 18.8±7.0, 19.2±7.0 (0.28), esophagus Dmean: 32.0±16.3, 34.4±17.9 (0.01), heart Dmean: 19.2±16.5, 19.4±16.6 (0.74), spinal cord D2%: 47.7±18.8, 52.0±20.3 (0.01), PTV dose homogeneity (D2%-D99%): 17.1±15.4, 20.7±12.2 (0.03). The esophagus Dmean, cord D2% and PTV dose homogeneity are statistically better in the plans using the standardized templates, but the improvements (<5%) may not be clinically significant. The other dosimetric parameters are not statistically different. Conclusion: It's feasible to use a small number of standardized beam bouquet templates (e.g. 6) to generate plans with quality comparable to that of clinical
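
    The k-medoids clustering step used to derive the beam bouquet templates can be illustrated with a minimal PAM-style sketch over a precomputed distance matrix; the real work would define a distance between beam-angle configurations and choose k via the average silhouette width, neither of which is shown, and the toy data below are made up.

    import numpy as np

    def k_medoids(dist, k, n_iter=100, seed=0):
        """Minimal k-medoids over a precomputed (n, n) distance matrix.

        Returns the medoid indices and the cluster label of every item."""
        rng = np.random.default_rng(seed)
        n = dist.shape[0]
        medoids = rng.choice(n, size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(dist[:, medoids], axis=1)     # assign to nearest medoid
            new_medoids = medoids.copy()
            for c in range(k):
                members = np.where(labels == c)[0]
                if members.size:
                    # the member minimizing total distance to its own cluster
                    costs = dist[np.ix_(members, members)].sum(axis=1)
                    new_medoids[c] = members[np.argmin(costs)]
            if np.array_equal(new_medoids, medoids):
                break
            medoids = new_medoids
        return medoids, np.argmin(dist[:, medoids], axis=1)

    # Toy distance matrix between 6 hypothetical beam-angle configurations.
    pts = np.array([[0.], [1.], [1.2], [8.], [8.5], [9.]])
    dist = np.abs(pts - pts.T)
    medoids, labels = k_medoids(dist, k=2)
    print(medoids, labels)          # two clusters: {0, 1, 2} and {3, 4, 5}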

  11. Macroporous polymer foams by hydrocarbon templating

    PubMed Central

    Shastri, Venkatram Prasad; Martin, Ivan; Langer, Robert

    2000-01-01

    Porous polymeric media (polymer foams) are utilized in a wide range of applications, such as thermal and mechanical insulators, solid supports for catalysis, and medical devices. A process for the production of polymer foams has been developed. This process, which is applicable to a wide range of polymers, uses a hydrocarbon particulate phase as a template for the precipitation of the polymer phase and subsequent pore formation. The use of a hydrocarbon template allows for enhanced control over pore structure, porosity, and other structural and bulk characteristics of the polymer foam. Polymer foams with densities as low as 120 mg/cc, porosity as high as 87%, and high surface areas (20 m2/g) have been produced. Foams of poly(l-lactic acid), a biodegradable polymer, produced by this process have been used to engineer a variety of different structures, including tissues with complex geometries such as in the likeness of a human nose. PMID:10696111

  12. A template for integrated community sustainability planning.

    PubMed

    Ling, Christopher; Hanna, Kevin; Dale, Ann

    2009-08-01

    This article describes a template for implementing an integrated community sustainability plan. The template emphasizes community engagement and outlines the components of a basic framework for integrating ecological, social and economic dynamics into a community plan. The framework is a series of steps that support a sustainable community development process. While it reflects the Canadian experience, the tools and techniques have applied value for a range of environmental planning contexts around the world. The research is case study based and draws from a diverse range of communities representing many types of infrastructure, demographics and ecological and geographical contexts. A critical path for moving local governments to sustainable community development is the creation and implementation of integrated planning approaches. To be effective and to be implemented, a requisite shift to sustainability requires active community engagement processes, political will, and a commitment to political and administrative accountability, and measurement. PMID:19495860

  13. Template analysis of a Faraday disk dynamo

    NASA Astrophysics Data System (ADS)

    Moroz, I. M.

    2008-12-01

    In a recent paper Moroz [1] returned to a nonlinear three-dimensional model of dynamo action for a self-exciting Faraday disk dynamo introduced by Hide et al. [2]. Since only two examples of chaotic behaviour were shown in [2], Moroz [1] performed a more extensive analysis of the dynamo model, producing a selection of bifurcation transition diagrams, including those encompassing the two examples of chaotic behaviour in [2]. Unstable periodic orbits were extracted and presented in [1], but no attempt was made to identify the underlying chaotic attractor. We rectify that here. Illustrating the procedure with one of the cases considered in [1], we use some of the unstable periodic orbits to identify a possible template for the chaotic attractor, using ideas from topology [3]. In particular, we investigate how the template is affected by changes in bifurcation parameter.

  14. Template-based synthesis of nickel oxide

    NASA Astrophysics Data System (ADS)

    Mironova-Ulmane, N.; Kuzmin, A.; Sildos, I.

    2015-03-01

    Nanocrystalline NiO has been produced using a facile template-based synthesis from nickel nitrate solutions using cellulose as a template. Thus obtained oxides were studied by scanning electron microscopy, x-ray diffraction, Raman scattering spectroscopy, photoluminescence spectroscopy and confocal spectromicroscopy. The filamentary/coral morphology of the samples has been evidenced and is built up of agglomerated nanocrystallites with a size in the range of about 26-36 nm. The presence of two-magnon contribution in Raman scattering spectra suggests the existence of antiferromagnetic ordering at room temperature. Finally, the observed near-infrared photoluminescence band at 850 nm has been tentatively attributed to the defect-perturbed Ni2+ states at the surface.

  15. [Detection of R-wave in Fetal ECG Based on Wavelet Transform and Matched Filtering].

    PubMed

    Yan, Wenhong; Jiang, Ning

    2015-09-01

    By analyzing the characteristics of the maternal abdominal ECG (electrocardiogram), a method based on wavelet transform and matched filtering is proposed to detect the R-wave in the fetal ECG (FECG). In this method, the high-frequency coefficients are first calculated using the wavelet transform. The maternal QRS template is then obtained using an arithmetic mean scheme, and finally the R-wave of the FECG is detected based on matched filtering. The experimental results show that this method can effectively eliminate noise, such as the maternal ECG signal and baseline drift, enhancing the accuracy of fetal ECG detection. PMID:26904869
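
    As a rough illustration of the kind of pipeline described, the sketch below uses PyWavelets and SciPy (both assumed available) to emphasise high-frequency content, build a maternal QRS template by arithmetic averaging, cancel it, and then apply a matched filter built from the residual beats. It is one plausible reading of the abstract, not the authors' algorithm, and all thresholds are placeholders.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

def detail_signal(x, wavelet="db4", level=4):
    """Reconstruct the signal from its wavelet detail (high-frequency) bands only."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])          # drop the approximation band
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def detect_fetal_r_waves(abdominal, fs=500, qrs_half_width=0.05):
    hf = detail_signal(np.asarray(abdominal, dtype=float))
    half = int(qrs_half_width * fs)

    # Maternal R peaks dominate the abdominal lead (placeholder thresholds).
    m_peaks, _ = find_peaks(hf, height=3 * np.std(hf), distance=int(0.4 * fs))

    # Arithmetic-mean maternal QRS template, subtracted at each maternal peak.
    segs = [hf[p - half:p + half] for p in m_peaks if half <= p < len(hf) - half]
    m_template = np.mean(segs, axis=0)
    residual = hf.copy()
    for p in m_peaks:
        if half <= p < len(hf) - half:
            residual[p - half:p + half] -= m_template

    # First-pass fetal candidates give a fetal QRS template ...
    cand, _ = find_peaks(residual, height=2 * np.std(residual), distance=int(0.25 * fs))
    fsegs = [residual[p - half:p + half] for p in cand if half <= p < len(residual) - half]
    f_template = np.mean(fsegs, axis=0)

    # ... which is used as a matched filter before the final peak picking.
    mf = np.correlate(residual, f_template, mode="same")
    return find_peaks(mf, height=2 * np.std(mf), distance=int(0.25 * fs))[0]
```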

  16. A KARAOKE System Singing Evaluation Method that More Closely Matches Human Evaluation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hideyo; Hoguro, Masahiro; Umezaki, Taizo

    KARAOKE is a popular amusement for young and old alike. Many KARAOKE machines have a singing evaluation function. However, it is often said that the scores given by KARAOKE machines do not match human evaluation. This paper proposes a KARAOKE scoring method strongly correlated with human evaluation: songs are evaluated based on the distance between the singing pitch and the musical scale, employing a vibrato extraction method based on template matching of the spectrum. The results show that the correlation coefficients between scores given by the proposed system and human evaluation are -0.76∼-0.89.
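
    A toy version of the pitch-to-scale distance idea can be written in a few lines. The sketch below is not the published system: it measures, in cents, how far each pitch estimate lies from the nearest equal-tempered semitone and turns the mean deviation into a 0-100 score; vibrato extraction and the song's actual key are ignored here.

```python
import numpy as np

A4 = 440.0  # Hz reference pitch

def cents_from_nearest_note(f0_hz):
    """Deviation (in cents) of each pitch estimate from the nearest
    equal-tempered semitone; 100 cents = one semitone."""
    f0_hz = np.asarray(f0_hz, dtype=float)
    semitones = 12.0 * np.log2(f0_hz / A4)
    return 100.0 * (semitones - np.round(semitones))

def pitch_score(f0_track_hz):
    """Toy score: mean absolute deviation from the scale, mapped to 0-100."""
    dev = np.abs(cents_from_nearest_note(f0_track_hz))
    return float(np.clip(100.0 - np.mean(dev), 0.0, 100.0))

# Example: a slightly sharp A4 followed by an in-tune E5.
print(pitch_score([443.0, 659.26]))
```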

  17. Patterning microsphere surfaces by templating colloidal crystals.

    PubMed

    Zhang, Gang; Wang, Dayang; Möhwald, Helmuth

    2005-01-01

    By using the upper single or double layers of colloidal crystals as masks during Au vapor deposition, various Au patterns have been successfully constructed on the surfaces of the lower spheres. The dimension and geometry of the Au patterns obtained depend on the orientation of the colloidal crystal templates. Our patterning procedure is independent of the curvature and chemical composition of the surfaces, which paves a promising way toward patterning highly curved surfaces.

  18. Deep Human Parsing with Active Template Regression.

    PubMed

    Liang, Xiaodan; Liu, Si; Shen, Xiaohui; Yang, Jianchao; Liu, Luoqi; Dong, Jian; Lin, Liang; Yan, Shuicheng

    2015-12-01

    In this work, the human parsing task, namely decomposing a human image into semantic fashion/body regions, is formulated as an active template regression (ATR) problem, where the normalized mask of each fashion/body item is expressed as a linear combination of learned mask templates and then morphed to a more precise mask with active shape parameters, including the position, scale and visibility of each semantic region. The mask template coefficients and the active shape parameters together can generate the human parsing results and are thus called the structure outputs for human parsing. A deep convolutional neural network (CNN) is utilized to build the end-to-end relation between the input human image and the structure outputs for human parsing. More specifically, the structure outputs are predicted by two separate networks. The first CNN uses max-pooling and is designed to predict the template coefficients for each label mask, while the second CNN omits max-pooling to preserve sensitivity to label mask position and accurately predict the active shape parameters. For a new image, the structure outputs of the two networks are fused to generate the probability of each label for each pixel, and super-pixel smoothing is finally used to refine the human parsing result. Comprehensive evaluations on a large dataset demonstrate the significant superiority of the ATR framework over other state-of-the-art methods for human parsing. In particular, the F1-score reaches 64.38 percent with our ATR framework, significantly higher than the 44.76 percent obtained with the state-of-the-art algorithm [28]. PMID:26539846
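
    To make the two-branch idea concrete, here is a heavily simplified PyTorch sketch (PyTorch assumed installed). The layer sizes, the number of labels (18) and templates per label (10), and the use of strided convolutions in the second branch are placeholders of our own, not the published ATR architecture.

```python
import torch
import torch.nn as nn

class TemplateCoeffNet(nn.Module):
    """Branch with max-pooling: predicts mask-template coefficients per label."""
    def __init__(self, n_labels=18, n_templates=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_labels * n_templates)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

class ShapeParamNet(nn.Module):
    """Branch without max-pooling: keeps spatial sensitivity for position, scale, visibility."""
    def __init__(self, n_labels=18, n_params=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_labels * n_params)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

x = torch.randn(1, 3, 224, 224)   # dummy human image
coeffs = TemplateCoeffNet()(x)    # linear-combination weights for each label mask
params = ShapeParamNet()(x)       # position / scale / visibility per label
# A full system would morph the template masks with these outputs, fuse them into
# per-pixel label probabilities, and apply super-pixel smoothing.
```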

  20. PROTEIN TEMPLATES IN HARD TISSUE ENGINEERING

    PubMed Central

    George, Anne; Ravindran, Sriram

    2010-01-01

    Biomineralization processes such as the formation of bones and teeth require controlled mineral deposition and self-assembly into hierarchical biocomposites with unique mechanical properties. Ideal biomaterials for the regeneration and repair of hard tissues must be biocompatible, possess micro- and macroporosity for vascular invasion, provide surface chemistry and texture that facilitate cell attachment, proliferation, and differentiation of lineage-specific progenitor cells, and induce deposition of calcium phosphate mineral. To elicit an in-vivo-like cellular response, several investigators have used extracellular matrix proteins as templates to recreate the in-vivo microenvironment for regeneration of hard tissues. Several novel methods of designing tissue repair and restoration materials using bioinspired strategies are currently being formulated. Nanoscale structured materials can be fabricated via the spontaneous organization of self-assembling proteins to construct hierarchically organized nanomaterials. The advantage of such a method is that polypeptides can be specifically designed as building blocks incorporating molecular recognition features and spatially distributed bioactive ligands that provide a physiological environment for cells in-vitro and in-vivo. This is a rapidly evolving area and provides a promising platform for future development of nanostructured templates for hard tissue engineering. In this review we highlight the importance of proteins as templates for the regeneration and repair of hard tissues, as well as the potential of peptide-based nanomaterials for regenerative therapies. PMID:20802848

  1. Sacrificial template method of fabricating a nanotube

    DOEpatents

    Yang, Peidong; He, Rongrui; Goldberger, Joshua; Fan, Rong; Wu, Yi-Ying; Li, Deyu; Majumdar, Arun

    2007-05-01

    Methods of fabricating uniform nanotubes are described in which nanotubes are synthesized as sheaths over nanowire templates, such as by using a chemical vapor deposition process. For example, single-crystalline zinc oxide (ZnO) nanowires are utilized as templates over which gallium nitride (GaN) is epitaxially grown. The ZnO templates are then removed, such as by thermal reduction and evaporation. The completed single-crystalline GaN nanotubes preferably have inner diameters ranging from 30 nm to 200 nm and wall thicknesses between 5 and 50 nm. Transmission electron microscopy studies show that the resultant nanotubes are single-crystalline with a wurtzite structure and are oriented along the <001> direction. The present invention exemplifies single-crystalline nanotubes of materials with a non-layered crystal structure. Similar "epitaxial-casting" approaches could be used to produce arrays and single-crystalline nanotubes of other solid materials and semiconductors. Furthermore, the fabrication of multi-sheath nanotubes is described, as well as nanotubes having multiple longitudinal segments.

  2. DNA-templated gold nanoparticles formation.

    PubMed

    Sun, Lanlan; Song, Yonghai; Wang, Li; Sun, Yujing; Guo, Cunlan; Liu, Zhelin; Li, Zhuang

    2008-09-01

    The interaction between HAuCl4 and DNA has enabled the creation of DNA-templated gold nanoparticles without the formation of large nanoparticles. It was found that spherical DNA-HAuCl4 hybrids of 8.7 nm in diameter, flower-like DNA-HAuCl4 hybrids, nanoparticle chains, and nanoparticle networks of the DNA-HAuCl4 hybrid could be obtained by varying the reaction conditions, including DNA concentration and reaction temperature. The intermediate product was investigated by shortening the reaction time of DNA and HAuCl4; the obtained nanoparticles preserved a small DNA segment, which indicated that the reaction between DNA and HAuCl4 proceeds in a stepwise manner. The addition of a reducing reagent resulted in DNA-templated gold nanoparticles and nanoflowers, respectively. UV-vis absorption spectra were used to characterize the DNA-HAuCl4 hybrid and the gold nanostructures templated on DNA, and XPS spectra were used to compare the composition of the DNA-Au(III) complex and the gold nanoparticles. AFM and TEM results revealed that spherical gold nanoparticles of about 11 nm in size and flower-like gold nanoparticles were formed after the addition of NaBH4.

  3. Mixed-metal templated phosphate phases

    SciTech Connect

    Nenoff, T.M.; Jackson, N.B.; Thoma, S.G.; Kohler, S.D.; Harrison, W.T.A.

    1997-08-01

    In an effort to direct the structure formation and subsequently the catalytic properties of novel materials, both organic molecules and transition metals have been systematically incorporated into zinc phosphate materials, and various transition metals into zirconium phosphate materials. The resultant phases in the Zn/P experiments are determined not by the organic template, but by the type and stoichiometric amount of metal incorporated and by the organic template's anion. Furthermore, only one of the phases, a Ni/Zn/P, shows any acidic catalytic behavior. Similarly, the transition metals incorporated in stoichiometric amounts into the catalytically active novel zirconium phosphate are highly structure directing. Their presence inhibits the formation of the phosphate phase, instead promoting the formation of tetragonal ZrO{sub 2}. The catalytic activity of the products is greatly diminished relative to the baseline material. The synthesis and characterization methods for each phase will be presented. Characterization techniques employed include single-crystal and powder X-ray diffraction, magnetic susceptibility, thermal analysis, DCP and FTIR.

  4. Determination of total cyanide in Hanford Site high-level wastes

    SciTech Connect

    Winters, W.I.; Pool, K.H.

    1994-05-01

    Nickel ferrocyanide compounds (Na{sub 2-x}Cs{sub x}NiFe (CN){sub 6}) were produced in a scavenging process to remove {sup 137}Cs from Hanford Site single-shell tank waste supernates. Methods for determining total cyanide in Hanford Site high-level wastes are needed for the evaluation of potential exothermic reactions between cyanide and oxidizers such as nitrate and for safe storage, processing, and management of the wastes in compliance with regulatory requirements. Hanford Site laboratory experience in determining cyanide in high-level wastes is summarized. Modifications were made to standard cyanide methods to permit improved handling of high-level waste samples and to eliminate interferences found in Hanford Site waste matrices. Interferences and associated procedure modifications caused by high nitrates/nitrite concentrations, insoluble nickel ferrocyanides, and organic complexants are described.

  5. Predictors of High Level of Hostility among Homeless Men on Parole.

    PubMed

    Nyamathi, Adeline; Salem, Benissa; Farabee, David; Hall, Elizabeth; Zhang, Sheldon; Khalilifard, Farinaz; Faucette, Mark; Leake, Barbara

    2014-02-01

    High levels of hostility present a formidable challenge among homeless ex-offenders. This cross-sectional study assessed correlates of high levels of hostility using baseline data collected on recently-released male parolees (N=472; age 18-60) participating in a randomized trial focused on prevention of illicit drug use and recidivism. Predictors of high levels of hostility included greater depressive symptomatology, lower self-esteem, having a mother who was treated for alcohol/drugs, belonging to a gang, more tangible support, having used methamphetamine and having a history of cognitive difficulties. These findings highlight the need to understand predictors of hostility among recently released homeless men and how these predictors may relate to recidivism. Research implications are discussed as these findings will shape future nurse-led harm reduction and community-based interventions.

  6. In vitro effect of levofloxacin and vancomycin combination against high level aminoglycoside-resistant enterococci.

    PubMed

    Erdem, Ilknur; Cicek-Senturk, Gonul; Yucesoy-Dede, Behiye; Yuksel-Kocdogan, Funda; Yuksel, Saim; Karagul, Emin

    2004-01-01

    The in vitro effects of levofloxacin and vancomycin in combination were evaluated against high level aminoglycoside-resistant (HLAR) enterococci using chequerboard and time-kill curve techniques. We examined 28 strains of enterococci comprising 17 Enterococcus faecalis, 10 E. faecium and one E. durans. The combination of vancomycin and levofloxacin had indifferent activity against all isolates according to chequerboard microdilution method, but was synergistic for two isolates, one E. faecium and one E. faecalis, using the time-kill curve method. Both strains were levofloxacin resistant and had high level aminoglycoside resistance to gentamicin and streptomycin. Antagonism was not detected in any strain. The results of this study suggested that the combination of vancomycin with levofloxacin does not often show synergistic effect against high level aminoglycoside-resistant enterococci.

  7. The Savannah River Site Replacement High Level Radioactive Waste Evaporator Project

    SciTech Connect

    Presgrove, S.B.

    1992-08-01

    The Replacement High Level Waste Evaporator Project was conceived in 1985 to reduce the volume of the high level radioactive waste. Processing of the high level waste has been accomplished up to this time using Bent Tube type evaporators and, therefore, that type of evaporator was selected for this project. The Title I Design of the project was 70% completed in late 1990. The Department of Energy at that time hired an independent consulting firm to perform a complete review of the project. The DOE placed a STOP ORDER on purchasing the evaporator in January 1991. Essentially, no construction was to be done on this project until all findings and concerns dealing with the type and design of the evaporator were resolved. This report addresses two aspects of the DOE design review: (1) a comparison of the Bent Tube Evaporator with the Forced Circulation Evaporator, and (2) the design portion of the DOE Project Review, which concentrated on the mechanical design properties of the evaporator. 1 ref.

  9. Development of Total Knee Replacement Digital Templating Software

    NASA Astrophysics Data System (ADS)

    Yusof, Siti Fairuz; Sulaiman, Riza; Thian Seng, Lee; Mohd. Kassim, Abdul Yazid; Abdullah, Suhail; Yusof, Shahril; Omar, Masbah; Abdul Hamid, Hamzaini

    In this study, by taking full advantage of digital X-ray and computer technology, we have developed a semi-automated procedure to template knee implants using a digital templating method. With this approach, a software system called OrthoKneeTM has been designed and developed. The system is being utilized in a study in the Department of Orthopaedics and Traumatology of the medical faculty, UKM (FPUKM). The OrthoKneeTM templating process employs a technique similar to the one used by many surgeons, who overlay acetate templates on X-ray films. This technique makes it easy to template various implants from different manufacturers using a comprehensive database of templates. The templating functionality includes knee templates from several manufacturers (Smith & Nephew and Zimmer). From a patient X-ray image, OrthoKneeTM helps to quickly and easily read the approximate template size needed. The visual templating features then allow us to quickly review multiple template sizes against the X-ray and thus obtain a nearly precise view of the implant size required. The system can assist by templating on one patient image and will generate reports that can accompany patient notes. The software system was implemented in Visual Basic 6.0 Pro using object-oriented techniques to manage the graphics and objects. The approaches used for image scaling will be discussed. Several measurements used in the orthopedic diagnosis process have been studied and added to this software as measurement tools using mathematical theorems and equations. The study compared the results of the semi-automated (digital templating) method to the conventional method to demonstrate the accuracy of the system.
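
    The abstract mentions image scaling but gives no details, so the following is only a minimal sketch under an assumption commonly made in digital templating: a calibration marker of known physical diameter (taken here to be 25 mm) is visible on the radiograph. Every function name and value below is hypothetical.

```python
def mm_per_pixel(marker_diameter_px, marker_diameter_mm=25.0):
    """Scale factor from a calibration marker of known physical size
    (25 mm is a common templating ball, used here as an assumption)."""
    return marker_diameter_mm / marker_diameter_px

def scale_template(template_size_mm, scale_mm_per_px):
    """Convert an implant template dimension (mm) into on-screen pixels."""
    return template_size_mm / scale_mm_per_px

scale = mm_per_pixel(marker_diameter_px=180.0)                        # marker measured on the X-ray
print(scale_template(template_size_mm=70.0, scale_mm_per_px=scale))   # ~504 px
```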

  10. High-Level Clouds and Relation to Sea Surface Temperature as Inferred from Japan's GMS Measurements

    NASA Technical Reports Server (NTRS)

    Chou, Ming-Dah; Lindzen, Richard S.; Lee, Kyu-Tae; Einaudi, Franco (Technical Monitor)

    2000-01-01

    High-level clouds have a significant impact on the radiation energy budgets and, hence, the climate of the Earth. Convective cloud systems, which are controlled by large-scale thermal and dynamical conditions, propagate rapidly within days. At this time scale, changes of sea surface temperature (SST) are small. Radiances measured by Japan's Geostationary Meteorological Satellite (GMS) are used to study the relation between high-level clouds and SST in the tropical western and central Pacific (30 S-30 N; 130 E-170 W), where the ocean is warm and deep convection is intensive. Twenty months (January 1998 - August, 1999) of GMS data are used, which cover the second half of the strong 1997-1998 El Nino. Brightness temperature at the 11-micron channel is used to identify high-level clouds. The core of convection is identified based on the difference in the brightness temperatures of the 11- and 12-micron channels. Because of the rapid movement of clouds, there is little correlation between clouds six hours apart. When most of deep convection moves to regions of high SST, the domain averaged high-level cloud amount decreases. A +2C change of SST in cloudy regions results in a relative change of -30% in high-level cloud amount. This large change in cloud amount is due to clouds moving from cool regions to warm regions but not the change in SST itself. A reduction in high-level cloud amount in the equatorial region implies an expanded dry upper troposphere in the off-equatorial region, and the greenhouse warming of high clouds and water vapor is reduced through enhanced longwave cooling to space. The results are important for understanding the physical processes relating SST, convection, and water vapor in the tropics. They are also important for validating climate simulations using global general circulation models.
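
    The cloud identification described here reduces to two thresholds, sketched below in Python. The 260 K high-cloud limit and the 1 K split-window limit are illustrative placeholders chosen for the example, not values from the study.

```python
import numpy as np

def classify_clouds(tb11, tb12, high_cloud_tb=260.0, core_split_max=1.0):
    """Flag high-level clouds from the 11-micron brightness temperature and
    convective cores from the 11-12 micron split-window difference.
    The 260 K and 1 K values are illustrative placeholders only."""
    tb11, tb12 = np.asarray(tb11), np.asarray(tb12)
    high_cloud = tb11 < high_cloud_tb
    convective_core = high_cloud & ((tb11 - tb12) < core_split_max)
    return high_cloud, convective_core

# Example with three pixels: clear sky, thin cirrus, deep-convective core.
hc, core = classify_clouds(tb11=[295.0, 255.0, 210.0], tb12=[294.0, 252.0, 209.8])
print(hc, core)   # [False  True  True] [False False  True]
```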

  11. Path matching and graph matching in biological networks.

    PubMed

    Yang, Qingwu; Sze, Sing-Hoi

    2007-01-01

    We develop algorithms for the following path matching and graph matching problems: (i) given a query path p and a graph G, find a path p' that is most similar to p in G; (ii) given a query graph G0 and a graph G, find a graph G0' that is most similar to G0 in G. In these problems, p and G0 represent a given substructure of interest to a biologist, and G represents a large network in which the biologist desires to find a related substructure. These algorithms allow the study of common substructures in biological networks in order to understand how these networks evolve both within and between organisms. We reduce the path matching problem to finding a longest weighted path in a directed acyclic graph and show that the problem of finding the top k suboptimal paths can be solved in polynomial time. This is in contrast with most previous approaches, which used exponential time algorithms to find simple paths and are practical only when the paths are short. We reduce the graph matching problem to finding highest scoring subgraphs in a graph and give an exact algorithm to solve the problem when the query graph G0 is of moderate size. This eliminates the need for less accurate heuristic or randomized algorithms. We show that our algorithms are able to extract biologically meaningful pathways from protein interaction networks in the DIP database and metabolic networks in the KEGG database. Software programs implementing these techniques (PathMatch and GraphMatch) are available at http://faculty.cs.tamu.edu/shsze/pathmatch and http://faculty.cs.tamu.edu/shsze/graphmatch.
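
    The core of the path-matching reduction, finding a longest weighted path in a directed acyclic graph, runs in linear time with a topological order and dynamic programming. The sketch below illustrates just that step (assuming non-negative edge weights); it is not the PathMatch software.

```python
from collections import defaultdict

def longest_path_dag(n, edges):
    """Longest weighted path in a DAG with nodes 0..n-1.
    edges: list of (u, v, w). Returns (best score, node sequence)."""
    adj, indeg = defaultdict(list), [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1

    # Kahn topological order.
    order, stack = [], [u for u in range(n) if indeg[u] == 0]
    while stack:
        u = stack.pop()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)

    score = [0.0] * n
    prev = [None] * n
    for u in order:                 # relax edges in topological order
        for v, w in adj[u]:
            if score[u] + w > score[v]:
                score[v] = score[u] + w
                prev[v] = u

    end = max(range(n), key=lambda v: score[v])
    path, node = [], end
    while node is not None:
        path.append(node)
        node = prev[node]
    return score[end], path[::-1]

print(longest_path_dag(4, [(0, 1, 2.0), (1, 2, 3.0), (0, 2, 1.0), (2, 3, 5.0)]))
# (10.0, [0, 1, 2, 3])
```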

  12. Tank waste remediation system phase I high-level waste feed processability assessment report

    SciTech Connect

    Lambert, S.L.; Stegen, G.E., Westinghouse Hanford

    1996-08-01

    This report evaluates the effects of feed composition on the Phase I high-level waste immobilization process and the interim storage facility requirements for the high-level waste glass. Several different Phase I staging (retrieval, blending, and pretreatment) scenarios were used to generate example feed compositions for glass formulations, testing, and glass sensitivity analysis. Glass models and data from laboratory glass studies were used to estimate achievable waste loading and corresponding glass volumes for various Phase I feeds. Key issues related to feed processability, feed composition uncertainty, and immobilization process technology are identified for future consideration in other tank waste disposal program activities.

  13. The radiation characteristics of the transport packages with vitrified high-level waste

    NASA Astrophysics Data System (ADS)

    Bogatov, S. A.; Mitenkova, E. F.; Novikov, N. V.

    2015-12-01

    The calculation method of neutron yield in the (α, n) reaction for a homogeneous material of arbitrary composition is represented. It is shown that the use of the ORIGEN 2 code excluding the real elemental composition of vitrified high-level waste leads to significant underestimation of the neutron yield in the (α, n) reaction. For vitrified high-level waste and spent nuclear fuel from VVER, the neutron fluxes are analyzed. The thicknesses of the protective materials for a transfer cask and a shipping cask with vitrified high-level waste are estimated.

  14. Avoiding the zero sum game in global cancer policy: beyond 2011 UN high level summit.

    PubMed

    Sullivan, R; Purushotham, A D

    2011-11-01

    In September 2011 a unique high-level summit on non-communicable diseases will be held in New York. For cancer, as for many other chronic diseases, this marks its first high-level recognition. However, the reality of cancer control in middle- and low-income countries is and will be very different from the trajectory experienced by developed countries. This perspective seeks to critically examine the approach being taken, mapping pitfalls and presenting alternative solutions for an international cancer control policy. PMID:22018537

  16. Talc-silicon glass-ceramic waste forms for immobilization of high- level calcined waste

    SciTech Connect

    Vinjamuri, K.

    1993-06-01

    Talc-silicon glass-ceramic waste forms are being evaluated as candidates for immobilization of the high level calcined waste stored onsite at the Idaho Chemical Processing Plant. These glass-ceramic waste forms were prepared by hot isostatically pressing a mixture of simulated nonradioactive high level calcined waste, talc, silicon and aluminum metal additives. The waste forms were characterized for density, chemical durability, and glass and crystalline phase compositions. The results indicate improved density and chemical durability as the silicon content is increased.

  17. Adapting high-level language programs for parallel processing using data flow

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1988-01-01

    EASY-FLOW, a very high-level data flow language, is introduced for the purpose of adapting programs written in a conventional high-level language to a parallel environment. The level of parallelism provided is of the large-grained variety in which parallel activities take place between subprograms or processes. A program written in EASY-FLOW is a set of subprogram calls as units, structured by iteration, branching, and distribution constructs. A data flow graph may be deduced from an EASY-FLOW program.

  18. Impact of transporting defense high-level waste to a geologic repository

    SciTech Connect

    Joy, D.S.; Shappert, L.B.; Boyle, J.W.

    1984-08-01

    This transportation study assumes that defense high-level waste is stored in three locations (the Savannah River, Hanford, and Idaho Falls plants) and may be disposed of in (1) a commercial repository or (2) a defense-only repository, either of which could be located at one of the five candidate sites; also documented is a preliminary analysis of the costs and risks of transporting defense high-level waste from the three storage sites to the five potential candidate repository sites. 17 references, 4 figures, 27 tables.

  19. Solvent extraction in the treatment of acidic high-level liquid waste : where do we stand?

    SciTech Connect

    Horwitz, E. P.; Schulz, W. W.

    1998-06-18

    During the last 15 years, a number of solvent extraction/recovery processes have been developed for the removal of the transuranic elements, {sup 90}Sr and {sup 137}Cs from acidic high-level liquid waste. These processes are based on the use of a variety of both acidic and neutral extractants. This chapter will present an overview and analysis of the various extractants and flowsheets developed to treat acidic high-level liquid waste streams. The advantages and disadvantages of each extractant along with comparisons of the individual systems are discussed.

  20. High-level seismic response and failure prediction methods for piping

    SciTech Connect

    Severud, L.K.; Anderson, M.J.; Lindquist, M.R.; Wagner, S.E.; Weiner, E.O.

    1988-01-01

    Seismic response and failure analyses were performed for four piping systems that were shake-tested to high level nonlinear and inelastic response levels. Both pre- and post-test analyses were accomplished. A number of simplified elastic, elasto-plastic, and inelastic transient dynamic analysis methods were utilized. Descriptions of these methods, with their special structural parameters and comparisons of predictions using each method to test data, are provided. Reasonably useful, but conservative, methods were found for predicting the high-level inelastic response and the failure modes.

  1. Review of recent Chinese research on field dependence-independence in high-level athletes.

    PubMed

    Liu, W H

    1996-12-01

    A review of seven studies in China concerning field dependence-independence among 500 athletes in 10 different sports is presented. Athletes participating in closed-skill sports were more field-independent than those in open-skill sports. In closed-skill sports, high-level athletes were more field-independent than those of medium level. In open-skill sports involving direct contact, high-level athletes were more field-dependent than those of medium level. No significant relationship was found between field dependence-independence and athletes' performance in open skill sports in which no direct contact was involved. PMID:9017730

  2. Biofeedback to facilitate unassisted ventilation in individuals with high-level quadriplegia. A case report.

    PubMed

    Morrison, S A

    1988-09-01

    The purpose of this case report is to discuss the effectiveness of electromyographic biofeedback in reeducating and strengthening the accessory breathing muscles in an individual with high-level (C1) complete quadriplegia. Six unassisted breathing sessions were performed with EMG biofeedback intervention. Six unassisted breathing sessions without EMG biofeedback intervention were also performed. In both conditions, the subject's vital capacity and the amount of time of unassisted ventilation were recorded. The study results indicated that EMG biofeedback may be a helpful modality in training accessory breathing muscles to enable an individual with high-level quadriplegia to become independent of mechanical ventilation for varying amounts of time.

  3. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised elsewhere rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  4. Block Matching for Object Tracking

    SciTech Connect

    Gyaourova, A; Kamath, C; Cheung, S

    2003-10-13

    Models which describe road traffic patterns can be helpful in detection and/or prevention of uncommon and dangerous situations. Such models can be built by the use of motion detection algorithms applied to video data. Block matching is a standard technique for encoding motion in video compression algorithms. We explored the capabilities of the block matching algorithm when applied for object tracking. The goal of our experiments is two-fold: (1) to explore the abilities of the block matching algorithm on low resolution and low frame rate video and (2) to improve the motion detection performance by the use of different search techniques during the process of block matching. Our experiments showed that the block matching algorithm yields good object tracking results and can be used with high success on low resolution and low frame rate video data. We observed that different searching methods have small effect on the final results. In addition, we proposed a technique based on frame history, which successfully overcame false motion caused by small camera movements.
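
    For reference, an exhaustive-search block matcher of the kind described takes only a few lines. The sketch below, with an assumed 8-pixel block and a +/-7-pixel search window, returns one motion vector per block by minimising the sum of absolute differences (SAD).

```python
import numpy as np

def block_match(prev_frame, next_frame, block=8, search=7):
    """Exhaustive-search block matching: for every block in prev_frame, find the
    displacement (within +/-search pixels) minimising the SAD in next_frame.
    Returns an array of (dy, dx) motion vectors, one per block."""
    h, w = prev_frame.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev_frame[by:by + block, bx:bx + block].astype(np.int32)
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = next_frame[y:y + block, x:x + block].astype(np.int32)
                        sad = np.abs(ref - cand).sum()
                        if best is None or sad < best:
                            best, best_v = sad, (dy, dx)
            vectors[by // block, bx // block] = best_v
    return vectors

# Toy example: a bright square shifted 3 px right and 2 px down between frames.
a = np.zeros((32, 32), dtype=np.uint8); a[8:16, 8:16] = 255
b = np.zeros((32, 32), dtype=np.uint8); b[10:18, 11:19] = 255
print(block_match(a, b)[1, 1])   # -> [2 3]
```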

  5. Perturbational treatment of spin-orbit coupling for generally applicable high-level multi-reference methods

    SciTech Connect

    Mai, Sebastian; Marquetand, Philipp; González, Leticia; Müller, Thomas; Plasser, Felix; Lischka, Hans

    2014-08-21

    An efficient perturbational treatment of spin-orbit coupling within the framework of high-level multi-reference techniques has been implemented in the most recent version of the COLUMBUS quantum chemistry package, extending the existing fully variational two-component (2c) multi-reference configuration interaction singles and doubles (MRCISD) method. The proposed scheme follows related implementations of quasi-degenerate perturbation theory (QDPT) model space techniques. Our model space is built either from uncontracted, large-scale scalar relativistic MRCISD wavefunctions or based on the scalar-relativistic solutions of the linear-response-theory-based multi-configurational averaged quadratic coupled cluster method (LRT-MRAQCC). The latter approach allows for a consistent, approximately size-consistent and size-extensive treatment of spin-orbit coupling. The approach is described in detail and compared to a number of related techniques. The inherent accuracy of the QDPT approach is validated by comparing cuts of the potential energy surfaces of acrolein and its S, Se, and Te analogues with the corresponding data obtained from matching fully variational spin-orbit MRCISD calculations. The conceptual availability of approximate analytic gradients with respect to geometrical displacements is an attractive feature of the 2c-QDPT-MRCISD and 2c-QDPT-LRT-MRAQCC methods for structure optimization and ab initio molecular dynamics simulations.

  7. Susceptibility of HPV16 and 18 to high level disinfectants indicated for semi‐critical ultrasound probes

    PubMed Central

    Ryndock, Eric; Robison, Richard

    2015-01-01

    Ultrasound probes used in endocavitary procedures have been shown to be contaminated with high‐risk HPV after routine use, and HPV is also known to be resistant to some high level disinfectants (HLDs). This study compared the efficacy of two leading ultrasound probe HLD methods: liquid ortho‐phthalaldehyde (Cidex® OPA) and an automated device using sonicated hydrogen peroxide (trophon® EPR), against HPV16 and HPV18 in a hard‐surface carrier test. Native HPV16 and HPV18 virions were generated in organotypic epithelial raft cultures. Viral lysates were dried onto carriers with a 5% (v/v) protein soil. Efficacy tests were performed against the automated device at 35% and 31.5% H2O2 and 0.55% OPA in quadruplicate with matched input, neutralization, and cytotoxicity controls. Hypochlorite was included as a positive control. Infectivity was determined by the abundance (qRT‐PCR) of the spliced E1^E4 transcript in infected recipient cells. The automated HLD device showed excellent efficacy against HPV16 and HPV18 (>5 log10 reductions in infectivity) whereas OPA showed minimal efficacy (<0.6 log10 reductions). While HPV is highly resistant to OPA, sonicated hydrogen peroxide offers an effective disinfection solution for ultrasound probes. Disinfection methods that are effective against HPV should be adopted where possible. J. Med. Virol. 88:1076–1080, 2016. © 2015 The Authors. Journal of Medical Virology Published by Wiley Periodicals, Inc. PMID:26519866

  8. A Matched Filter Hypothesis for Cognitive Control

    PubMed Central

    Thompson-Schill, Sharon L.

    2013-01-01

    The prefrontal cortex exerts top-down influences on several aspects of higher-order cognition by functioning as a filtering mechanism that biases bottom-up sensory information toward a response that is optimal in context. However, research also indicates that not all aspects of complex cognition benefit from prefrontal regulation. Here we review and synthesize this research with an emphasis on the domains of learning and creative cognition, and outline how the appropriate level of cognitive control in a given situation can vary depending on the organism's goals and the characteristics of the given task. We offer a Matched Filter Hypothesis for cognitive control, which proposes that the optimal level of cognitive control is task-dependent, with high levels of cognitive control best suited to tasks that are explicit, rule-based, verbal or abstract, and can be accomplished given the capacity limits of working memory and with low levels of cognitive control best suited to tasks that are implicit, reward-based, non-verbal or intuitive, and which can be accomplished irrespective of working memory limitations. Our approach promotes a view of cognitive control as a tool adapted to a subset of common challenges, rather than an all-purpose optimization system suited to every problem the organism might encounter. PMID:24200920

  9. Path similarity skeleton graph matching.

    PubMed

    Bai, Xiang; Latecki, Longin Jan

    2008-07-01

    This paper presents a novel framework for shape recognition based on object silhouettes. The main idea is to match skeleton graphs by comparing the shortest paths between skeleton endpoints. In contrast to typical tree or graph matching methods, we completely ignore the topological graph structure. Our approach is motivated by the fact that visually similar skeleton graphs may have completely different topological structures. The proposed comparison of shortest paths between endpoints of skeleton graphs yields correct matching results in such cases. The skeletons are pruned by contour partitioning with Discrete Curve Evolution, which implies that the endpoints of skeleton branches correspond to visual parts of the objects. The experimental results demonstrate that our method is able to produce correct results in the presence of articulations, stretching, and occlusion.
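
    A much-simplified version of the endpoint-matching idea can be sketched with networkx and SciPy (both assumed available): each endpoint is described by its sorted geodesic distances to the other endpoints, and endpoints of two skeletons are then paired with the Hungarian algorithm. The published method compares the shapes of the shortest paths rather than just their lengths, so this is only an approximation of the idea.

```python
import networkx as nx
import numpy as np
from scipy.optimize import linear_sum_assignment

def endpoint_path_profile(G):
    """For each endpoint (degree-1 node), the sorted geodesic distances
    to every other endpoint along the skeleton graph."""
    ends = [n for n in G if G.degree(n) == 1]
    dist = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
    return ends, np.array([sorted(dist[e][f] for f in ends if f != e) for e in ends])

def match_endpoints(G1, G2):
    """Pair endpoints of two skeletons by minimising the difference between
    their shortest-path profiles (profiles truncated to a common length)."""
    e1, p1 = endpoint_path_profile(G1)
    e2, p2 = endpoint_path_profile(G2)
    m = min(p1.shape[1], p2.shape[1])
    cost = np.abs(p1[:, :m, None] - p2[:, :m].T[None, :, :]).sum(axis=1)
    rows, cols = linear_sum_assignment(cost)
    return [(e1[r], e2[c], cost[r, c]) for r, c in zip(rows, cols)]

# Toy skeletons: a 3-branch star vs. a slightly stretched copy.
G1 = nx.Graph(); G1.add_weighted_edges_from([("c", "a", 1.0), ("c", "b", 1.0), ("c", "d", 2.0)])
G2 = nx.Graph(); G2.add_weighted_edges_from([("C", "A", 1.1), ("C", "B", 1.0), ("C", "D", 2.2)])
print(match_endpoints(G1, G2))
```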

  10. DNA repair by RNA: Templated, or not templated, that is the question.

    PubMed

    Meers, Chance; Keskin, Havva; Storici, Francesca

    2016-08-01

    Cells are continuously exposed to both endogenous and exogenous sources of genomic stress. To maintain chromosome stability, a variety of mechanisms have evolved to cope with the multitude of genetic abnormalities that can arise over the life of a cell. Still, failures to repair these lesions are the driving force of cancers and other degenerative disorders. DNA double-strand breaks (DSBs) are among the most toxic genetic lesions, inhibiting cell ability to replicate, and are sites of mutations and chromosomal rearrangements. DSB repair is known to proceed via two major mechanisms: homologous recombination (HR) and non-homologous end joining (NHEJ). HR reliance on the exchange of genetic information between two identical or nearly identical DNA molecules offers increased accuracy. While the preferred substrate for HR in mitotic cells is the sister chromatid, this is limited to the S and G2 phases of the cell cycle. However, abundant amounts of homologous genetic substrate may exist throughout the cell cycle in the form of RNA. Considered an uncommon occurrence, the direct transfer of information from RNA to DNA is thought to be limited to special circumstances. Studies have shown that RNA molecules reverse transcribed into cDNA can be incorporated into DNA at DSB sites via a non-templated mechanism by NHEJ or a templated mechanism by HR. In addition, synthetic RNA molecules can directly template the repair of DSBs in yeast and human cells via an HR mechanism. New work suggests that even endogenous transcript RNA can serve as a homologous template to repair a DSB in chromosomal DNA. In this perspective, we will review and discuss the recent advancements in DSB repair by RNA via non-templated and templated mechanisms. We will provide current findings, models and future challenges investigating RNA and its role in DSB repair. PMID:27237587

  11. High-level waste-form-product performance evaluation. [Leaching; waste loading; mechanical stability

    SciTech Connect

    Bernadzikowski, T A; Allender, J S; Stone, J A; Gordon, D E; Gould, Jr, T H; Westberry, III, C F

    1982-01-01

    Seven candidate waste forms were evaluated for immobilization and geologic disposal of high-level radioactive wastes. The waste forms were compared on the basis of leach resistance, mechanical stability, and waste loading. All forms performed well at leaching temperatures of 40, 90, and 150°C. Ceramic forms ranked highest, followed by glasses, a metal matrix form, and concrete. 11 tables.

  12. Effects of Crowding and Attention on High-Levels of Motion Processing and Motion Adaptation

    PubMed Central

    Pavan, Andrea; Greenlee, Mark W.

    2015-01-01

    The motion after-effect (MAE) persists in crowding conditions, i.e., when the adaptation direction cannot be reliably perceived. The MAE originating from complex moving patterns spreads into non-adapted sectors of a multi-sector adapting display (i.e., phantom MAE). In the present study we used global rotating patterns to measure the strength of the conventional and phantom MAEs in crowded and non-crowded conditions, and when attention was directed to the adapting stimulus and when it was diverted away from the adapting stimulus. The results show that: (i) the phantom MAE is weaker than the conventional MAE, for both non-crowded and crowded conditions, and when attention was focused on the adapting stimulus and when it was diverted from it, (ii) conventional and phantom MAEs in the crowded condition are weaker than in the non-crowded condition. Analysis conducted to assess the effect of crowding on high-level of motion adaptation suggests that crowding is likely to affect the awareness of the adapting stimulus rather than degrading its sensory representation, (iii) for high-level of motion processing the attentional manipulation does not affect the strength of either conventional or phantom MAEs, neither in the non-crowded nor in the crowded conditions. These results suggest that high-level MAEs do not depend on attention and that at high-level of motion adaptation the effects of crowding are not modulated by attention. PMID:25615577

  13. Structural integrity and potential failure modes of hanford high-level waste tanks

    SciTech Connect

    Han, F.C.

    1996-09-30

    The structural integrity of the Hanford High-Level Waste Tanks was evaluated based on the existing design and analysis documents. All tank structures were found adequate for the normal operating and seismic loads. Potential failure modes of the tanks were assessed by engineering interpretation and extrapolation of the existing engineering documents.

  15. Insect cell transformation vectors that support high level expression and promoter assessment in insect cell culture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A somatic transformation vector, pDP9, was constructed that provides a simplified means of producing permanently transformed cultured insect cells that support high levels of protein expression of foreign genes. The pDP9 plasmid vector incorporates DNA sequences from the Junonia coenia densovirus th...

  16. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)
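
    The abstract does not reproduce the algorithm, but the generic Gauss-Newton step it refers to is short enough to sketch. The example below fits a simple exponential curve rather than a covariance structure model, so it is only an illustration of the update theta <- theta - (J'J)^(-1) J'r, not the reviewed implementation.

```python
import numpy as np

def gauss_newton(residual, jacobian, theta0, n_iter=20):
    """Generic Gauss-Newton iteration for nonlinear least squares."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r, J = residual(theta), jacobian(theta)
        step = np.linalg.solve(J.T @ J, J.T @ r)   # solve the normal equations
        theta = theta - step
    return theta

# Toy example: fit y = a * exp(b * x) to noisy data.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * np.exp(-1.5 * x) + 0.01 * rng.standard_normal(x.size)

residual = lambda t: t[0] * np.exp(t[1] * x) - y
jacobian = lambda t: np.column_stack([np.exp(t[1] * x),
                                      t[0] * x * np.exp(t[1] * x)])
print(gauss_newton(residual, jacobian, theta0=[1.0, -1.0]))   # ~ [2.0, -1.5]
```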

  17. 'Going The Distance?' A National Academies Report on Spent Fuel and High-Level Waste Transportation

    SciTech Connect

    Crowley, K.D.

    2007-07-01

    The National Academies released the report entitled Going the Distance? The Safe Transport of Spent Nuclear Fuel and High-Level Radioactive Waste in the United States in February 2006. This paper provides a summary of the findings and recommendations from that report. (authors)

  18. Plastid-expressed 5-enolpyruvylshikimate-3-phosphate synthase genes provide high level glyphosate tolerance in tobacco.

    PubMed

    Ye, G N; Hajdukiewicz, P T; Broyles, D; Rodriguez, D; Xu, C W; Nehra, N; Staub, J M

    2001-02-01

    Plastid transformation (transplastomic) technology has several potential advantages for biotechnological applications including the use of unmodified prokaryotic genes for engineering, potential high-level gene expression and gene containment due to maternal inheritance in most crop plants. However, the efficacy of a plastid-encoded trait may change depending on plastid number and tissue type. We report a feasibility study in tobacco plastids to achieve high-level herbicide resistance in both vegetative tissues and reproductive organs. We chose to test glyphosate resistance via over-expression in plastids of tolerant forms of 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS). Immunological, enzymatic and whole-plant assays were used to prove the efficacy of three different prokaryotic (Achromobacter, Agrobacterium and Bacillus) EPSPS genes. Using the Agrobacterium strain CP4 EPSPS as a model we identified translational control sequences that direct a 10,000-fold range of protein accumulation (to >10% total soluble protein in leaves). Plastid-expressed EPSPS could provide very high levels of glyphosate resistance, although levels of resistance in vegetative and reproductive tissues differed depending on EPSPS accumulation levels, and correlated to the plastid abundance in these tissues. Paradoxically, higher levels of plastid-expressed EPSPS protein accumulation were apparently required for efficacy than from a similar nuclear-encoded gene. Nevertheless, the demonstration of high-level glyphosate tolerance in vegetative and reproductive organs using transplastomic technology provides a necessary step for transfer of this technology to other crop species.

  19. 75 FR 61228 - Board Meeting: Technical Lessons Gained From High-Level Nuclear Waste Disposal Efforts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    From the Federal Register Online via the Government Publishing Office. NUCLEAR WASTE TECHNICAL REVIEW BOARD. Board Meeting: Technical Lessons Gained From High-Level Nuclear Waste Disposal Efforts. Pursuant to its authority under section 5051 of Public Law 100-203, the Nuclear Waste Policy Amendments Act...

  20. Chem I Supplement. Chemistry Related to Isolation of High-Level Nuclear Waste.

    ERIC Educational Resources Information Center

    Hoffman, Darleane C.; Choppin, Gregory R.

    1986-01-01

    Discusses some of the problems associated with the safe disposal of high-level nuclear wastes. Describes several waste disposal plans developed by various nations. Outlines the multiple-barrier concept of isolation in deep geological formations and questions associated with the implementation of such a method. (TW)

  1. A new irradiation effect and its implications for the disposal of high-level radioactive waste.

    PubMed

    Hirsch, E H

    1980-09-26

    Materials containing alkali metals or alkaline earths are sensitized by bombardment with either ions, electrons, or photons to chemical attack by atmospheric moisture. The implications of this effect for the proposed immobilization and long-term storage of high-level nuclear waste in glass or similar materials are discussed. PMID:7433973

  2. Is It Really Possible to Test All Educationally Significant Achievements with High Levels of Reliability?

    ERIC Educational Resources Information Center

    Davis, Andrew

    2015-01-01

    PISA claims that it can extend its reach from its current core subjects of Reading, Science, Maths and problem-solving. Yet given the requirement for high levels of reliability for PISA, especially in the light of its current high stakes character, proposed widening of its subject coverage cannot embrace some important aspects of the social and…

  3. HTML::GMap-A High Level Perl Wrapper Around the Google Maps(TM) API

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have developed HTML::GMap, a generic, high-level Perl wrapper, to easily build web-based geographic map displays on top of the Google MapsTM Mapping Service. Using HTML::GMap, we built custom display tools to present the molecular diversity data generated by the National Science Foundation-suppor...

  4. Design of equipment used for high-level waste vitrification at the West Valley Demonstration Project

    SciTech Connect

    Vance, R.F.; Brill, B.A.; Carl, D.E.

    1997-06-01

    The equipment as designed, started, and operated for high-level radioactive waste vitrification at the West Valley Demonstration Project in western New York State is described. Equipment for the processes of melter feed make-up, vitrification, canister handling, and off-gas treatment is included. For each item of equipment, the functional requirements, process description, and hardware description are presented.

  5. Student Motivations as Predictors of High-Level Cognitions in Project-Based Classrooms

    ERIC Educational Resources Information Center

    Stolk, Jonathan; Harari, Janie

    2014-01-01

    It is well established that active learning helps students engage in high-level thinking strategies and develop improved cognitive skills. Motivation and self-regulated learning research, however, illustrates that cognitive engagement is an effortful process that is related to students' valuing of the learning tasks, adoption of internalized…

  6. A HIGH-LEVEL CALCULATION OF THE PROTON AFFINITY OF DIBORANE

    EPA Science Inventory

    The experimental proton affinity of diborane (B2H6) is based on an unstable species, B2H7+, which has been observed only at low temperatures. The present work calculates the proton affinity of diborane using the Gaussian-3 method and other high-level compound ab initio met...
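
    For context, the quantity being calculated is defined by the standard thermochemical relation below; the definition of proton affinity is general knowledge and is not taken from the record itself, and no numerical value is implied here.

        \mathrm{PA}(\mathrm{B_2H_6}) \;=\; -\Delta H^{\circ}\!\left(\mathrm{B_2H_6} + \mathrm{H^+} \rightarrow \mathrm{B_2H_7^+}\right)

    This is why the stability of the protonated species B2H7+ matters for the experimental determination referenced above.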

  7. Confronting the Universal Disbelief that Poor Children Can Achieve at High Levels

    ERIC Educational Resources Information Center

    Kelleher, Paul

    2006-01-01

    In her first year as superintendent of the Atlanta Public Schools, Beverly Hall encountered what she describes as the nearly universal disbelief that poor children of color can achieve high levels of learning. This chapter recounts how she confronted this and other obstacles and challenges as the new leader of the Atlanta Public Schools. Hall is…

  8. Antecedent and Concurrent Psychosocial Skills That Support High Levels of Achievement within Talent Domains

    ERIC Educational Resources Information Center

    Olszewski-Kubilius, Paula; Subotnik, Rena F.; Worrell, Frank C.

    2015-01-01

    Motivation and emotional regulation are important for the sustained focused study and practice required for high levels of achievement and creative productivity in adulthood. Using the talent development model proposed by the authors as a framework, the authors discuss several important psychosocial skills based on the psychological research…

  9. High Level Manpower and Technological Change in the Steel Industry: Implications for Corporate Manpower Planning.

    ERIC Educational Resources Information Center

    Hiestand, Dale L.

    The purpose of this study was to examine the role that high level manpower plays in the establishment of new technologies at the plant and industry level. The steel industry was selected as an appropriate industry to approach these questions due to: its considerable technological changes; its straightforward, easier-to-understand technology; its…

  10. Model and Observer Learning of Low, Medium, and High Level Concepts.

    ERIC Educational Resources Information Center

    Walls, Richard T.; And Others

    Low (conjunctive), medium (disjunctive), and high (biconditional) level concept attainment problems were used to assess whether high level versus low and/or medium difficulty concept rules yield less positive transfer for observers than models. Direct learning and transfer of models was compared with vicarious learning and transfer of observers.…

  11. Do Highly Effective Principals Also Have High Levels of Cultural Intelligence?

    ERIC Educational Resources Information Center

    Naughton, Whitney Michelle

    2010-01-01

    Purpose: The purpose of this study was to determine if elementary school principals who exhibit characteristics of highly effective principals also possess high levels of cultural intelligence. Methodology: Three instruments were used in this study, combining both qualitative and quantitative approaches to the collection of data. The first…

  12. 77 FR 1778 - U.S.-EU High Level Working Group on Jobs and Growth

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... TRADE REPRESENTATIVE U.S.-EU High Level Working Group on Jobs and Growth AGENCY: Office of the United... Working Group on Jobs and Growth, led by U.S. Trade Representative Ron Kirk and EU Trade Commissioner... and investment to support mutually beneficial job creation, economic growth, and...

  13. Semantic-Aware Automatic Parallelization of Modern Applications Using High-Level Abstractions

    SciTech Connect

    Liao, C; Quinlan, D J; Willcock, J J; Panas, T

    2009-12-21

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. Modern applications using high-level abstractions, such as C++ STL containers and complex user-defined class types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we use a source-to-source compiler infrastructure, ROSE, to explore compiler techniques to recognize high-level abstractions and to exploit their semantics for automatic parallelization. Several representative parallelization candidate kernels are used to study semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Preliminary results have shown that semantics of abstractions can help extend the applicability of automatic parallelization to modern applications and expose more opportunities to take advantage of multicore processors.
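
    A minimal, hand-written C++/OpenMP sketch of the kind of loop such a tool targets is shown below; the function name, the container, and the pragma placement are illustrative assumptions, not output of the ROSE-based translator described in this record.

        #include <vector>
        #include <cstddef>

        // A semantics-aware parallelizer can recognize that std::vector's
        // operator[] introduces no loop-carried dependences in this loop and
        // insert the OpenMP directive itself; in the original sequential
        // input the pragma would be absent.
        void scale(std::vector<double>& data, double factor) {
            #pragma omp parallel for
            for (std::size_t i = 0; i < data.size(); ++i) {
                data[i] *= factor;  // independent iterations: safe to run in parallel
            }
        }

    Compiled with OpenMP enabled (e.g. -fopenmp), the loop iterations are distributed across threads; the point made in the record is that reasoning about the container's documented semantics, rather than raw pointer arithmetic, is what makes inserting such directives safe to automate.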

  14. Reducing transmission risk through high-level disinfection of transvaginal ultrasound transducer handles.

    PubMed

    Ngu, Andrew; McNally, Glenn; Patel, Dipika; Gorgis, Vivian; Leroy, Sandrine; Burdach, Jon

    2015-05-01

    Intracavity ultrasound transducer handles are not routinely immersed in liquid high-level disinfectants. We show that residual bacteria, including pathogens, persist on more than 80% of handles that are not disinfected, whereas use of an automated device reduces contamination to background levels. Clinical staff should consider the need for handle disinfection.

  15. Experiential Learning and Its Role in Training and Improved Practice in High Level Sports Officiating

    ERIC Educational Resources Information Center

    Grover, Kenda S.

    2014-01-01

    This qualitative study investigated how high level sports officials engage in experiential learning to improve their practice. Adult learning occurs in formal, nonformal and informal environments, and in some cases it is difficult to differentiate between these settings. In the case of cycling officials, learning begins in a nonformal environment…

  16. Luminance and color inputs to mid-level and high-level vision.

    PubMed

    Jennings, Ben J; Martinovic, Jasna

    2014-01-01

    We investigated the interdependence of activity within the luminance (L + M) and opponent chromatic (L - M and S - [L + M]) postreceptoral mechanisms in mid-level and high-level vision. Mid-level processes extract contours and perform figure-background organization whereas high-level processes depend on additional semantic input, such as object knowledge. We collected mid-level (good/poor continuation) and high-level (object/nonobject) two-alternative forced-choice discrimination threshold data over a range of conditions that isolate mechanisms or simultaneously stimulate them. The L - M mechanism drove discrimination in the presence of very low luminance inputs. Contrast-dependent interactions between the luminance and L - M as well as combined L - M and S - (L + M) inputs were also found, but S - (L + M) signals, on their own, did not interact with luminance. Mean mid-level and high-level thresholds were related, with luminance providing inputs capable of sustaining performance over a broader, linearly corresponding range of contrasts when compared to L - M signals. The observed interactions are likely to be driven by L - M signals and relatively low luminance signals (approximately 0.05-0.09 L + M contrast) facilitating each other. The results are consistent with previous findings on low-level interactions between chromatic and luminance signals and demonstrate that functional interdependence between the geniculate mechanisms extends to the highest stages of the visual hierarchy.

  17. Driving Objectives and High-level Requirements for KP-Lab Technologies

    ERIC Educational Resources Information Center

    Lakkala, Minna; Paavola, Sami; Toikka, Seppo; Bauters, Merja; Markannen, Hannu; de Groot, Reuma; Ben Ami, Zvi; Baurens, Benoit; Jadin, Tanja; Richter, Christoph; Zoserl, Eva; Batatia, Hadj; Paralic, Jan; Babic, Frantisek; Damsa, Crina; Sins, Patrick; Moen, Anne; Norenes, Svein Olav; Bugnon, Alexandra; Karlgren, Klas; Kotzinons, Dimitris

    2008-01-01

    One of the central goals of the KP-Lab project is to co-design pedagogical methods and technologies for knowledge creation and practice transformation in an integrative and reciprocal manner. In order to facilitate this process, user tasks, driving objectives and high-level requirements have been introduced as conceptual tools to mediate between…

  18. A Systematic Review of Research on Questioning as a High-Level Cognitive Strategy

    ERIC Educational Resources Information Center

    Davoudi, Mohammad; Sadeghi, Narges Amel

    2015-01-01

    Given the significance of questioning as a high-level cognitive strategy in language teaching and learning in the literature on TEFL as well as in education in general, this study sought to make a systematic review of research studies conducted in the span of the last three decades on the issue of questioning across different disciplines with a…

  19. Conceptual design report for immobilized high-level waste interim storage facility (Phase 1)

    SciTech Connect

    Burgard, K.C.

    1998-04-09

    The Hanford Site Canister Storage Building (CSB Bldg. 212H) will be used for interim storage of Phase 1 HLW products. Project W-464, Immobilized High-Level Waste Interim Storage, will procure an onsite transportation system and retrofit the CSB to accommodate the Phase 1 HLW products. The Conceptual Design Report establishes the Project W-464 technical and cost basis.

  20. Conceptual design report for immobilized high-level waste interim storage facility (Phase 1)

    SciTech Connect

    Burgard, K.C.

    1998-06-02

    The Hanford Site Canister Storage Building (CSB Bldg. 212H) will be used for interim storage of Phase 1 HLW products. Project W-464, Immobilized High-Level Waste Interim Storage, will procure an onsite transportation system and retrofit the CSB to accommodate the Phase 1 HLW products. The Conceptual Design Report establishes the Project W-464 technical and cost basis.