Sample records for reconstruction integrated codes

  1. Timing Analysis with INTEGRAL: Comparing Different Reconstruction Algorithms

    NASA Technical Reports Server (NTRS)

    Grinberg, V.; Kreykenbohm, I.; Fuerst, F.; Wilms, J.; Pottschmidt, K.; Cadolle Bel, M.; Rodriguez, J.; Marcu, D. M.; Suchy, S.; Markowitz, A.

    2010-01-01

    INTEGRAL is one of the few instruments capable of detecting X-rays above 20 keV. It is therefore in principle well suited for studying X-ray variability in this regime. Because INTEGRAL uses coded-mask instruments for imaging, the reconstruction of light curves of X-ray sources is highly non-trivial. We present results from the comparison of two commonly employed algorithms, which measure flux primarily from mask deconvolution (ii_lc_extract) and from calculating the pixel-illuminated fraction (ii_light). Both methods agree well on timescales above about 10 s, the highest time resolution for which image reconstruction is possible. For higher time resolution, ii_light produces meaningful results, although the overall variance of the light curves is not preserved.
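    The mask-deconvolution idea behind such light-curve extraction can be illustrated with a toy 1D coded mask (the geometry, seed, and numbers below are illustrative and are not the OSA implementation): detector counts are the mask pattern shifted by the source position, and correlating the counts with a balanced (mean-subtracted) mask recovers that position.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D coded-mask experiment: a binary mask modulates a point source
# onto a position-sensitive detector.
n = 64
mask = rng.integers(0, 2, n).astype(float)      # open (1) / closed (0) elements
true_pos, true_flux = 20, 100.0

# Detector counts: a shifted, flux-scaled copy of the mask pattern.
detector = true_flux * np.roll(mask, true_pos)

# Balanced cross-correlation decoding: the mean-subtracted mask suppresses
# the flat background term, leaving a peak at the source position.
decode = mask - mask.mean()
sky = np.array([detector @ np.roll(decode, s) for s in range(n)])
est_pos = int(np.argmax(sky))
```

    The correlation peak lands on the true source position because the autocorrelation of a random binary mask is sharply peaked at zero lag.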

  2. Global magnetosphere simulations using constrained-transport Hall-MHD with CWENO reconstruction

    NASA Astrophysics Data System (ADS)

    Lin, L.; Germaschewski, K.; Maynard, K. M.; Abbott, S.; Bhattacharjee, A.; Raeder, J.

    2013-12-01

    We present a new CWENO (Centrally-Weighted Essentially Non-Oscillatory) reconstruction-based MHD solver for the OpenGGCM global magnetosphere code. The solver was built using libMRC, a library for creating efficient parallel PDE solvers on structured grids. libMRC provides an automated code-generation framework that takes a user-provided PDE right-hand side in symbolic form and generates efficient, architecture-specific parallel code. libMRC also supports block-structured adaptive mesh refinement and implicit time stepping through integration with the PETSc library. We validate the new CWENO Hall-MHD solver against existing solvers both on standard test problems and in global magnetosphere simulations.
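    The nonoscillatory-reconstruction idea can be sketched with a minimal 1D third-order WENO-type interface reconstruction from three cell averages; the CWENO variant in the abstract differs in its central-stencil treatment, but the smoothness-weighted blending shown here is the shared ingredient (weights and epsilon are the textbook choices, not the paper's).

```python
# Reconstruct the interface value u_{i+1/2} from cell averages
# u_{i-1}, u_i, u_{i+1} with adaptive (smoothness-based) weights.
def weno3_interface(um1, u0, up1, eps=1e-6):
    p0 = -0.5 * um1 + 1.5 * u0            # one-sided candidate polynomial
    p1 = 0.5 * u0 + 0.5 * up1             # centered candidate polynomial
    b0 = (u0 - um1) ** 2                  # smoothness indicators
    b1 = (up1 - u0) ** 2
    a0 = (1.0 / 3.0) / (eps + b0) ** 2    # nonlinear weights: a stencil
    a1 = (2.0 / 3.0) / (eps + b1) ** 2    # crossing a jump gets demoted
    return (a0 * p0 + a1 * p1) / (a0 + a1)
```

    On smooth data both candidates agree and full accuracy is kept; across a discontinuity the weight of the offending stencil collapses, avoiding the overshoot a fixed linear combination would produce.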

  3. Partial sequence homogenization in the 5S multigene families may generate sequence chimeras and spurious results in phylogenetic reconstructions.

    PubMed

    Galián, José A; Rosato, Marcela; Rosselló, Josep A

    2014-03-01

    Multigene families have provided opportunities for evolutionary biologists to assess molecular evolution processes and phylogenetic reconstructions at deep and shallow systematic levels. However, the use of these markers is not free of technical and analytical challenges. Many evolutionary studies that used the nuclear 5S rDNA gene family rarely used contiguous 5S coding sequences, due to the routine use of head-to-tail polymerase chain reaction primers anchored to the coding region. Moreover, in many studies the 5S coding sequences have been concatenated with independent, adjacent gene units, creating simulated chimeric genes as the raw data for evolutionary analysis. This practice rests on the tacitly assumed, but rarely tested, hypothesis that strict intra-locus concerted evolution processes are operating in 5S rDNA genes, without empirical evidence as to whether it holds for the recovered data. The potential pitfalls of analysing patterns of molecular evolution and reconstructing phylogenies based on these chimeric genes have not been assessed to date. Here, we compared the sequence integrity and phylogenetic behavior of entire versus concatenated 5S coding regions in a real data set obtained from closely related plant species (Medicago, Fabaceae). Our results suggest that within-array sequence homogenization is only partially operating in the 5S coding region, which is traditionally assumed to be highly conserved. Consequently, concatenating 5S genes increases haplotype diversity, generating novel chimeric genotypes that most likely do not exist within the genome. In addition, the patterns of gene evolution are distorted, leading to incorrect haplotype relationships in some evolutionary reconstructions.
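    The chimera problem can be made concrete with a hypothetical toy (sequences and genomes below are invented): each genome carries two linked 5S units; reading entire units preserves linkage, whereas concatenating independently amplified units pairs them arbitrarily and can create haplotypes that exist in no genome.

```python
from itertools import product

# Three toy genomes, each with two linked 5S units.
genomes = [("ACG", "ACG"), ("ACG", "ACT"), ("ACT", "ACT")]

# Entire reads keep the units linked: the real haplotypes.
linked = {a + b for a, b in genomes}

# Concatenation of independently recovered units pairs them in all
# combinations within each genome, including pairings never observed linked.
concatenated = {a + b for g in genomes for a, b in product(g, repeat=2)}
chimeras = concatenated - linked
```

    Here the heterozygous genome alone yields a chimeric haplotype, inflating apparent haplotype diversity exactly as the abstract describes.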

  4. Surface reconstruction, figure-ground modulation, and border-ownership.

    PubMed

    Jeurissen, Danique; Self, Matthew W; Roelfsema, Pieter R

    2013-01-01

    The Differentiation-Integration for Surface Completion (DISC) model aims to explain the reconstruction of visual surfaces. We find the model a valuable contribution to our understanding of figure-ground organization. We point out that, next to border-ownership, neurons in visual cortex code whether surface elements belong to a figure or the background and that this is influenced by attention. We furthermore suggest that there must be strong links between object recognition and figure-ground assignment in order to resolve the status of interior contours. Incorporation of these factors in neurocomputational models will further improve our understanding of surface reconstruction, figure-ground organization, and border-ownership.

  5. Synthetic reconstruction of recycling on the limiter during startup phase of W7-X based on EMC3-EIRENE simulations

    NASA Astrophysics Data System (ADS)

    Frerichs, Heinke; Effenberg, Florian; Schmitz, Oliver; Stephey, Laurie; W7-X Team

    2016-10-01

    Interpretation of spectroscopic measurements in the edge region of high-temperature plasmas can be a challenge due to line-of-sight integration effects. The EMC3-EIRENE code, a 3D fluid edge plasma and kinetic neutral gas transport code, is a suitable tool for full 3D reconstruction of such signals. A versatile synthetic diagnostic module has been developed recently which allows the realistic three-dimensional setup of various plasma edge diagnostics to be captured. We present an analysis of recycling on the inboard limiter of W7-X during its startup phase in terms of a synthetic camera for Hα light observations, and we reconstruct the particle flux from these synthetic images based on ionizations-per-photon coefficients (S/XB). We find that line-of-sight integration effects can lead to misinterpretation of data (redistribution of particle flux due to neutral gas diffusion), and that local plasma effects are important for the correct treatment of photon emissions. This work was supported by the U.S. Department of Energy (DOE) under Grant DE-SC0014210, by startup funds of the Department of Engineering Physics at the University of Wisconsin - Madison, and by the EUROfusion Consortium under Euratom Grant No. 633053.
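    The S/XB conversion named in the abstract is a one-line relation: the recycling particle flux equals 4π times the measured Hα radiance times the ionizations-per-photon coefficient. A minimal sketch (the numeric values in the usage note are purely illustrative, not W7-X data):

```python
import math

def particle_flux(radiance, s_xb):
    """S/XB (ionizations-per-photon) conversion.

    radiance : H-alpha radiance in photons m^-2 s^-1 sr^-1
    s_xb     : dimensionless ionizations-per-photon coefficient
    returns  : particle flux in particles m^-2 s^-1
    """
    return 4.0 * math.pi * radiance * s_xb
```

    For example, a radiance of 1e18 photons m⁻² s⁻¹ sr⁻¹ with S/XB = 15 gives a flux of 4π × 1.5e19 ≈ 1.9e20 m⁻² s⁻¹; the paper's point is that line-of-sight integration can make the measured radiance unrepresentative of the local flux.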

  6. Surface-from-gradients without discrete integrability enforcement: A Gaussian kernel approach.

    PubMed

    Ng, Heung-Sun; Wu, Tai-Pang; Tang, Chi-Keung

    2010-11-01

    Representative surface reconstruction algorithms taking a gradient field as input enforce the integrability constraint in a discrete manner. While enforcing integrability allows the subsequent integration to produce surface heights, existing algorithms have one or more of the following disadvantages: they can only handle dense per-pixel gradient fields, smooth out sharp features in a partially integrable field, or produce severe surface distortion in the results. In this paper, we present a method which does not enforce discrete integrability and reconstructs a 3D continuous surface from a gradient field, a height field, or a combination of both, which can be dense or sparse. The key to our approach is the use of kernel basis functions, which lift the continuous surface reconstruction problem into a high-dimensional space where a closed-form solution exists. By using the Gaussian kernel, we can derive a straightforward implementation which produces results better than traditional techniques. An important general advantage of our kernel-based method is that it does not suffer from the discretization and finite approximation that typify the Fourier or wavelet bases widely adopted by previous representative approaches, both of which lead to surface distortion. We perform comparisons with classical and recent methods on benchmark as well as challenging data sets to demonstrate that our method produces accurate surface reconstructions that preserve salient and sharp features. The source code and executable of the system are available for download.
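    The closed-form kernel idea can be sketched with Gaussian kernel ridge regression on sparse height samples (the full method also folds gradient observations into the same linear system; sigma, the regularizer, and the toy surface below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_kernel(A, B, sigma=0.5):
    """Gaussian kernel matrix between two point sets of shape (n, 2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Sparse height samples of a smooth toy bowl surface z = x^2 + y^2.
X = rng.uniform(-1, 1, (80, 2))
z = (X ** 2).sum(1)

# Closed-form fit: solve (K + lam*I) alpha = z once; no iterative
# integrability enforcement is needed.
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(X)), z)

# Evaluate the continuous surface anywhere.
Xq = np.array([[0.0, 0.0], [0.5, 0.5]])
z_hat = gaussian_kernel(Xq, X) @ alpha
```

    Because the representer is continuous, the reconstructed surface can be queried at arbitrary resolution, which is the practical payoff of working in the kernel's function space.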

  7. Novel Integration of Frame Rate Up Conversion and HEVC Coding Based on Rate-Distortion Optimization.

    PubMed

    Guo Lu; Xiaoyun Zhang; Li Chen; Zhiyong Gao

    2018-02-01

    Frame rate up conversion (FRUC) can improve visual quality by interpolating new intermediate frames. However, high-frame-rate videos produced by FRUC are confronted with higher bitrate consumption or annoying artifacts in the interpolated frames. In this paper, a novel integration framework of FRUC and high efficiency video coding (HEVC) is proposed based on rate-distortion optimization, so that the interpolated frames can be reconstructed at the encoder side with low bitrate cost and high visual quality. First, a joint motion estimation (JME) algorithm is proposed to obtain robust motion vectors, which are shared between FRUC and video coding. Moreover, JME is embedded into the coding loop and employs the original motion search strategy of HEVC coding. Then, frame interpolation is formulated as a rate-distortion optimization problem in which both the coding bitrate consumption and visual quality are taken into account. Due to the absence of the original frames, the distortion model for interpolated frames is established according to the motion vector reliability and the coding quantization error. Experimental results demonstrate that the proposed framework can achieve a 21%-42% reduction in BDBR when compared with traditional methods of FRUC cascaded with coding.
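    The rate-distortion decision at the heart of such a framework reduces to picking, per frame, the mode with minimal Lagrangian cost J = D + λR. A toy sketch (mode names, distortions, and rates below are hypothetical, not HEVC values):

```python
def rd_choose(modes, lam):
    """Pick the RD-optimal mode.

    modes : {mode_name: (distortion, rate_bits)}
    lam   : Lagrange multiplier trading distortion against rate
    """
    return min(modes, key=lambda m: modes[m][0] + lam * modes[m][1])

# Hypothetical costs for one frame: full coding is accurate but expensive;
# FRUC interpolation is nearly free in bits but more distorted.
frame = {"code_normally": (2.0, 12000.0), "interpolate_fruc": (5.0, 300.0)}
```

    A large λ (bitrate is precious) favors interpolation; a small λ (quality is precious) favors normal coding, which is exactly the trade-off the framework optimizes.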

  8. The genome of black cottonwood, Populus trichocarpa (Torr. & Gray)

    Treesearch

    G.A. Tuskan; S. DiFazio; S. Jansson; J. Bohlmann; I. Grigoriev; U. Hellsten; N. Putnam; S. Ralph; S. Rombauts; A. Salamov; J. Schein; L. Sterck; A. Aerts; R.R. Bhalerao; R.P. Bhalerao; D. Blaudez; W. Boerjan; A. Brun; A. Brunner; V. Busov; M. Campbell; J. Carlson; M. Chalot; J. Chapman; G.-L. Chen; D. Cooper; P.M. Coutinho; J. Couturier; S. Covert; Q. Cronk; R. Cunningham; J. Davis; S. Degroeve; A. Dejardin; C. dePamphilis; J. Detter; B. Dirks; U. Dubchak; S. Duplessis; J. Ehlting; B. Ellis; K. Gendler; D. Goodstein; M. Gribskov; J. Grimwood; A. Groover; L. Gunter; B. Hamberger; B. Heinze; Y. Helariutta; B. Henrissat; D. Holligan; R. Holt; W. Huang; N. Islam-Faridi; S. Jones; M. Jones-Rhoades; R. Jorgensen; C. Joshi; J. Kangasjarvi; J. Karlsson; C. Kelleher; R. Kirkpatrick; M. Kirst; A. Kohler; U. Kalluri; F. Larimer; J. Leebens-Mack; J.-C. Leple; P. Locascio; Y. Lou; S. Lucas; F. Martin; B. Montanini; C. Napoli; D.R. Nelson; C. Nelson; K. Nieminen; O. Nilsson; V. Pereda; G. Peter; R. Philippe; G. Pilate; A. Poliakov; J. Razumovskaya; P. Richardson; C. Rinaldi; K. Ritland; P. Rouze; D. Ryaboy; J. Schumtz; J. Schrader; B. Segerman; H. Shin; A. Siddiqui; F. Sterky; A. Terry; C.-J. Tsai; E. Uberbacher; P. Unneberg; J. Vahala; K. Wall; S. Wessler; G. Yang; T. Yin; C. Douglas; M. Marra; G. Sandberg; Y. Van de Peer; D. Rokhsar

    2006-01-01

    We report the draft genome of the black cottonwood tree, Populus trichocarpa. Integration of shotgun sequence assembly with genetic mapping enabled chromosome-scale reconstruction of the genome. More than 45,000 putative protein-coding genes were identified. Analysis of the assembled genome revealed a whole-genome duplication event; about 8000 pairs...

  9. Implementation of an object oriented track reconstruction model into multiple LHC experiments*

    NASA Astrophysics Data System (ADS)

    Gaines, Irwin; Gonzalez, Saul; Qian, Sijin

    2001-10-01

    An Object Oriented (OO) model (Gaines et al., 1996; 1997; Gaines and Qian, 1998; 1999) for track reconstruction by the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. The model has been coded in the C++ programming language and has been successfully implemented into the OO computing environments of both the CMS (1994) and ATLAS (1994) experiments at the future Large Hadron Collider (LHC) at CERN. We report: how the OO model was adapted, with largely the same code, to different scenarios and serves different reconstruction aims in different experiments (i.e. the level-2 trigger software for ATLAS and the offline software for CMS); how the OO model has been incorporated into different OO environments with a similar integration structure (demonstrating the ease of re-use of OO programs); what the OO model's performance is, including execution time, memory usage, track-finding efficiency, ghost rate, etc.; and what additional physics performance is gained from use of the OO tracking model. We also discuss the experience and lessons learned from implementing the OO model into the general OO software frameworks of the experiments. In summary, our experience shows that OO technology makes software development and integration straightforward and convenient; this may be particularly beneficial for non-computer-professional physicists.
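    The Kalman-filter track fit the model implements can be sketched in one projection: the state is (position, slope), propagated layer to layer and updated with each hit. This is a generic textbook filter with made-up layer spacing and noise, not the CMS/ATLAS code:

```python
import numpy as np

def kalman_track(hits, dz, meas_var):
    """Fit a straight-line track: state x = (position, slope).

    hits     : measured positions at detector layers spaced dz apart
    meas_var : measurement variance of each hit
    returns  : state estimate at the last layer
    """
    F = np.array([[1.0, dz], [0.0, 1.0]])   # propagation between layers
    H = np.array([[1.0, 0.0]])              # only position is measured
    x = np.array([hits[0], 0.0])            # seed from the first hit
    P = np.diag([meas_var, 1.0])            # loose prior on the slope
    for y in hits[1:]:
        x = F @ x                           # predict
        P = F @ P @ F.T
        S = H @ P @ H.T + meas_var          # innovation covariance
        K = (P @ H.T) / S                   # Kalman gain
        x = x + (K * (y - H @ x)).ravel()   # update with the new hit
        P = P - K @ H @ P
    return x
```

    A real tracker adds process noise for multiple scattering and works in 5D helix parameters, but the predict/update cycle is the same.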

  10. The historical biogeography of Mammalia

    PubMed Central

    Springer, Mark S.; Meredith, Robert W.; Janecka, Jan E.; Murphy, William J.

    2011-01-01

    Palaeobiogeographic reconstructions are underpinned by phylogenies, divergence times and ancestral area reconstructions, which together yield ancestral area chronograms that provide a basis for proposing and testing hypotheses of dispersal and vicariance. Methods for area coding include multi-state coding with a single character, binary coding with multiple characters and string coding. Ancestral reconstruction methods are divided into parsimony versus Bayesian/likelihood approaches. We compared nine methods for reconstructing ancestral areas for placental mammals. Ambiguous reconstructions were a problem for all methods. Important differences resulted from coding areas based on the geographical ranges of extant species versus the geographical provenance of the oldest fossil for each lineage. Africa and South America were reconstructed as the ancestral areas for Afrotheria and Xenarthra, respectively. Most methods reconstructed Eurasia as the ancestral area for Boreoeutheria, Euarchontoglires and Laurasiatheria. The coincidence of molecular dates for the separation of Afrotheria and Xenarthra at approximately 100 Ma with the plate tectonic sundering of Africa and South America hints at the importance of vicariance in the early history of Placentalia. Dispersal has also been important including the origins of Madagascar's endemic mammal fauna. Further studies will benefit from increased taxon sampling and the application of new ancestral area reconstruction methods. PMID:21807730
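    The three area-coding schemes the abstract compares can be illustrated for a hypothetical taxon present in Africa and South America (area list and taxon are illustrative):

```python
AREAS = ["Africa", "SouthAmerica", "Eurasia", "NorthAmerica"]

def multistate_code(present):
    """One multi-state character; widespread taxa get a polymorphic state set."""
    return sorted(AREAS.index(a) for a in present)

def binary_code(present):
    """One binary presence/absence character per area."""
    return [int(a in present) for a in AREAS]

def string_code(present):
    """All areas concatenated into a single presence/absence string."""
    return "".join(str(int(a in present)) for a in AREAS)
```

    The schemes carry the same occurrence data but interact differently with parsimony and Bayesian/likelihood ancestral-state machinery, which is one source of the method-dependent reconstructions the study reports.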

  11. ANGIOCARE: an automated system for fast three-dimensional coronary reconstruction by integrating angiographic and intracoronary ultrasound data.

    PubMed

    Bourantas, Christos V; Kalatzis, Fanis G; Papafaklis, Michail I; Fotiadis, Dimitrios I; Tweddel, Ann C; Kourtis, Iraklis C; Katsouras, Christos S; Michalis, Lampros K

    2008-08-01

    We describe the development of an automated, user-friendly system (ANGIOCARE) for rapid three-dimensional (3D) coronary reconstruction that integrates angiographic and intracoronary ultrasound (ICUS) data. Biplane angiographic and ICUS sequence images are imported into the system, where a prevalidated method is used for coronary reconstruction. This incorporates extraction of the catheter path from two end-diastolic X-ray images and detection of regions of interest (lumen, outer vessel wall) in the ICUS sequence by an automated border detection algorithm. The detected borders are placed perpendicular to the catheter path, and established algorithms are used to estimate their absolute orientation. The resulting 3D object is imported into an advanced visualization module with which the operator can interact, examine plaque distribution (depicted as a color-coded map), and assess plaque burden by virtual endoscopy. Data from 19 patients (27 vessels) undergoing biplane angiography and ICUS were examined. The reconstructed vessels were 21.3-80.2 mm long. The mean difference between plaque volumes measured using linear 3D ICUS analysis and volumes estimated by taking the curvature of the vessel into account was 0.9 +/- 2.9%. The time required to reconstruct a luminal narrowing of 25 mm was approximately 10 min. The ANGIOCARE system provides rapid coronary reconstruction, allowing the operator to accurately estimate the length of the lesion and determine plaque distribution and volume. (c) 2008 Wiley-Liss, Inc.

  12. Accelerating image reconstruction in dual-head PET system by GPU and symmetry properties.

    PubMed

    Chou, Cheng-Ying; Dong, Yun; Hung, Yukai; Kao, Yu-Jiun; Wang, Weichung; Kao, Chien-Min; Chen, Chin-Tu

    2012-01-01

    Positron emission tomography (PET) is an important imaging modality in both clinical usage and research studies. We have developed a compact high-sensitivity PET system consisting of two large-area panel PET detector heads, which produce more than 224 million lines of response and thus impose dramatic computational demands. In this work, we employed a state-of-the-art graphics processing unit (GPU), the NVIDIA Tesla C2070, to yield an efficient reconstruction process. Our approach integrates the symmetry properties of the imaging system with the features of the GPU architecture, including block/warp/thread assignments and effective memory usage, to accelerate the computations for ordered subset expectation maximization (OSEM) image reconstruction. The OSEM reconstruction algorithm was implemented in both CPU-based and GPU-based codes, and their computational performance was quantitatively analyzed and compared. The results showed that the GPU-accelerated scheme can drastically reduce the reconstruction time and thus largely expand the applicability of the dual-head PET system.
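    The update the GPU accelerates is the multiplicative MLEM/OSEM step. A CPU sketch with one subset (i.e. plain MLEM) and a tiny random system matrix standing in for the 224-million-LOR model (all sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy system matrix: 40 lines of response x 10 voxels, strictly positive.
A = rng.uniform(0.0, 1.0, (40, 10))
x_true = rng.uniform(1.0, 5.0, 10)
y = A @ x_true                      # noiseless expected counts

# MLEM update:  x <- x * A^T(y / (A x)) / A^T 1
x = np.ones(10)                     # positive initial image
sens = A.sum(axis=0)                # sensitivity image A^T 1
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / sens
```

    The update is embarrassingly parallel over voxels and LORs, which is what the block/warp/thread mapping and the system's symmetry exploitation speed up.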

  13. OpenACC acceleration of an unstructured CFD solver based on a reconstructed discontinuous Galerkin method for compressible flows

    DOE PAGES

    Xia, Yidong; Lou, Jialin; Luo, Hong; ...

    2015-02-09

    Here, an OpenACC directive-based graphics processing unit (GPU) parallel scheme is presented for solving the compressible Navier–Stokes equations on 3D hybrid unstructured grids with a third-order reconstructed discontinuous Galerkin method. The developed scheme requires minimum code intrusion and algorithm alteration for upgrading a legacy solver with GPU computing capability at very little extra programming effort, which leads to a unified and portable code development strategy. A face coloring algorithm is adopted to eliminate the memory contention caused by the threading of internal and boundary face integrals. A number of flow problems are presented to verify the implementation of the developed scheme. Timing measurements were obtained by running the resulting GPU code on one Nvidia Tesla K20c GPU card (Nvidia Corporation, Santa Clara, CA, USA) and compared with those obtained by running the equivalent Message Passing Interface (MPI) parallel CPU code on a compute node consisting of two AMD Opteron 6128 eight-core CPUs (Advanced Micro Devices, Inc., Sunnyvale, CA, USA). Speedup factors of up to 24× and 1.6× for the GPU code were achieved with respect to one and 16 CPU cores, respectively. The numerical results indicate that this OpenACC-based parallel scheme is an effective and extensible approach to port unstructured high-order CFD solvers to GPU computing.
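    The face-coloring trick can be sketched with a greedy coloring: faces touching the same cell get different colors, so all faces of one color can accumulate their integrals into cell data in parallel without write conflicts (the greedy strategy here is an illustrative stand-in for whatever the solver actually uses):

```python
def color_faces(faces):
    """Greedy face coloring.

    faces : list of (left_cell, right_cell) pairs
    returns a color per face such that no two faces sharing a cell
    share a color.
    """
    cell_colors = {}          # cell -> set of colors already used at it
    colors = []
    for lc, rc in faces:
        used = cell_colors.setdefault(lc, set()) | cell_colors.setdefault(rc, set())
        c = 0
        while c in used:      # smallest color free at both cells
            c += 1
        colors.append(c)
        cell_colors[lc].add(c)
        cell_colors[rc].add(c)
    return colors
```

    On a 1D chain of cells this alternates two colors, matching the intuition that every other face can be processed simultaneously.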

  14. Three-dimensional integral imaging displays using a quick-response encoded elemental image array: an overview

    NASA Astrophysics Data System (ADS)

    Markman, A.; Javidi, B.

    2016-06-01

    Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. A QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanned, thanks to the error correction built into the QR code design. Integral imaging is an imaging technique that generates a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of the scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon counting on the EI prior to compression. Photon counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data are encrypted using the DRPE. Once the information is stored in the QR codes, it is scanned using a smartphone device. The scanned information is decompressed and decrypted, and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
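    The DRPE step can be sketched numerically: multiply the image by a random phase mask, Fourier transform, apply a second random phase mask, and transform back; decryption with the conjugate masks recovers the image exactly. Image size and seed below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

img = rng.uniform(0, 1, (8, 8))                       # toy elemental image
m1 = np.exp(2j * np.pi * rng.uniform(size=(8, 8)))    # input-plane phase mask
m2 = np.exp(2j * np.pi * rng.uniform(size=(8, 8)))    # Fourier-plane phase mask

# Encrypt: mask, FFT, mask again, inverse FFT -> white-noise-like cipher.
cipher = np.fft.ifft2(np.fft.fft2(img * m1) * m2)

# Decrypt: undo each step with the conjugate masks (the keys).
decoded = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m2)) * np.conj(m1)
```

    Because both masks have unit modulus, decryption is exact up to floating-point roundoff; without the keys the cipher is statistically featureless.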

  15. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
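    The model-based least-squares idea can be sketched in 1D: build a forward model A from the coded-mask geometry, then recover the object by solving min ||Ax − b||. The circulant toy model below is an illustrative stand-in for the paper's full system model with its measured source-flux distribution:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 32
code = rng.integers(0, 2, n).astype(float)              # toy coded mask
A = np.array([np.roll(code, i) for i in range(n)]).T    # circulant forward model

x_true = np.zeros(n)
x_true[[5, 12]] = [1.0, 0.5]                            # two point features
b = A @ x_true                                          # noiseless measurements

# Least-squares reconstruction.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

    Refining A — e.g. replacing the idealized mask response with a measured source model, as the paper does — directly improves the reconstruction, since the estimate is only as good as the forward model.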

  16. Side information in coded aperture compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.

    2017-02-01

    Coded aperture compressive spectral imagers sense a three-dimensional cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on focal plane array (FPA) detectors, spatial light modulators (SLMs), digital micromirror devices (DMDs), and dispersive elements. The use of DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD allows not only collecting a sufficient number of measurements for spectrally rich or spatially detailed scenes, but also designing the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior information, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents the coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB image is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, and thus the coded aperture structure exploits key features such as scene edges. Real reconstructions from noisy compressed measurements demonstrate the benefit of the designed coded apertures in addition to the improvement in reconstruction quality obtained by the use of side information.
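    One simple way to picture edge-guided aperture design (a deliberately crude stand-in for the paper's optimization, with made-up threshold and fill fraction): open every aperture element on an edge of the grayscale guide image, plus a sparse random set elsewhere.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy grayscale guide image with one vertical edge.
guide = np.zeros((16, 16))
guide[:, 8:] = 1.0

# Edge map from the gradient magnitude (threshold is illustrative).
gy, gx = np.gradient(guide)
edges = np.hypot(gx, gy) > 0.25

# Aperture: always sample edges, plus ~10% random coverage elsewhere.
aperture = edges | (rng.uniform(size=guide.shape) < 0.1)
```

    The resulting pattern concentrates measurements where the RGB side information says the scene has structure, which is the design principle the abstract describes.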

  17. A novel three-dimensional image reconstruction method for near-field coded aperture single photon emission computerized tomography

    PubMed Central

    Mu, Zhiping; Hong, Baoming; Li, Shimin; Liu, Yi-Hwa

    2009-01-01

    Coded aperture imaging for two-dimensional (2D) planar objects has been investigated extensively in the past, whereas little success has been achieved in imaging 3D objects using this technique. In this article, the authors present a novel method of 3D single photon emission computerized tomography (SPECT) reconstruction for near-field coded aperture imaging. Multiangular coded aperture projections are acquired, and a stack of 2D images is reconstructed separately from each of the projections. Secondary projections are subsequently generated from the reconstructed image stacks based on the geometry of parallel-hole collimation and the variable magnification of near-field coded aperture imaging. Sinograms of cross-sectional slices of the 3D objects are assembled from the secondary projections, and the ordered-subset expectation maximization algorithm is employed to reconstruct the cross-sectional image slices from the sinograms. Experiments were conducted using a customized capillary tube phantom and a micro hot rod phantom. Imaged at approximately 50 cm from the detector, hot rods in the phantom with diameters as small as 2.4 mm could be discerned in the reconstructed SPECT images. These results demonstrate the feasibility of the authors’ 3D coded aperture image reconstruction algorithm for SPECT, representing an important step in their effort to develop a high-sensitivity and high-resolution SPECT imaging system. PMID:19544769

  18. Joint reconstruction of dynamic PET activity and kinetic parametric images using total variation constrained dictionary sparse coding

    NASA Astrophysics Data System (ADS)

    Yu, Haiqing; Chen, Shuhang; Chen, Yunmei; Liu, Huafeng

    2017-05-01

    Dynamic positron emission tomography (PET) is capable of providing both spatial and temporal information of radio tracers in vivo. In this paper, we present a novel joint estimation framework to reconstruct temporal sequences of dynamic PET images and the coefficients characterizing the system impulse response function, from which the associated parametric images of the system macro parameters for tracer kinetics can be estimated. The proposed algorithm, which combines statistical data measurement and tracer kinetic models, integrates a dictionary sparse coding (DSC) into a total variational minimization based algorithm for simultaneous reconstruction of the activity distribution and parametric map from measured emission sinograms. DSC, based on the compartmental theory, provides biologically meaningful regularization, and total variation regularization is incorporated to provide edge-preserving guidance. We rely on techniques from minimization algorithms (the alternating direction method of multipliers) to first generate the estimated activity distributions with sub-optimal kinetic parameter estimates, and then recover the parametric maps given these activity estimates. These coupled iterative steps are repeated as necessary until convergence. Experiments with synthetic, Monte Carlo generated data, and real patient data have been conducted, and the results are very promising.
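    The dictionary-sparse-coding ingredient can be sketched with plain ISTA iterations for min_z ½||Dz − u||² + μ||z||₁, with a random dictionary standing in for the compartment-model-derived one (μ, the sizes, and the iteration count are illustrative; the paper solves the coupled problem with ADMM and adds total variation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy dictionary with unit-norm atoms.
D = rng.normal(size=(30, 50))
D /= np.linalg.norm(D, axis=0)

# A sparse code and the noiseless signal it generates.
z_true = np.zeros(50)
z_true[[3, 17, 41]] = [1.5, -2.0, 1.0]
u = D @ z_true

mu = 0.05
step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1/L for the smooth term
z = np.zeros(50)
for _ in range(1000):
    w = z - step * (D.T @ (D @ z - u))                        # gradient step
    z = np.sign(w) * np.maximum(np.abs(w) - step * mu, 0.0)   # soft threshold
```

    The soft-threshold step is what enforces sparsity in the dictionary coefficients; in the paper it regularizes the time-activity curves toward kinetically plausible shapes.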

  19. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
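    At the core of COBRA-style analysis is a flux-balance linear program: maximize an objective flux subject to steady state S v = 0 and flux bounds. A sketch using SciPy's generic LP solver in place of COBRApy's model interface, on a made-up three-reaction toy network (metabolites, reactions, and bounds are all hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B (conversion), B -> biomass.
# Rows are metabolites, columns are reactions; S v = 0 enforces steady state.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A: made by uptake, used by conversion
    [0.0,  1.0, -1.0],   # metabolite B: made by conversion, used by biomass
])
bounds = [(0, 10), (0, 5), (0, None)]   # conversion capacity limits growth
c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so negate biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
```

    Here the optimal biomass flux hits the 5-unit conversion cap, illustrating how the LP identifies the limiting reaction; COBRApy wraps exactly this kind of problem in model/reaction/metabolite objects.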

  20. Real-time range acquisition by adaptive structured light.

    PubMed

    Koninckx, Thomas P; Van Gool, Luc

    2006-03-01

    The goal of this paper is to provide a "self-adaptive" system for real-time range acquisition. Reconstructions are based on single-frame structured light illumination. Instead of using generic, static coding that is supposed to work under all circumstances, system adaptation is proposed. This occurs on-the-fly, renders the system more robust against instant scene variability, and creates suitable patterns at startup. A continuous trade-off between speed and quality is made. A weighted combination of different coding cues (based upon pattern color, geometry, and tracking) yields a robust way to solve the correspondence problem. The individual coding cues are automatically adapted within a considered family of patterns. The weights to combine them are based on the average consistency with the result within a small time window. The integration itself is done by reformulating the problem as a graph cut. The camera-projector configuration is also taken into account when generating the projection patterns. The correctness of the range maps is not guaranteed, but an estimation of the uncertainty is provided for each part of the reconstruction. Our prototype is implemented using unmodified consumer hardware only and is therefore cheap. Frame rates vary between 10 and 25 fps, depending on scene complexity.

  1. Comprehensive Reconstruction and Visualization of Non-Coding Regulatory Networks in Human

    PubMed Central

    Bonnici, Vincenzo; Russo, Francesco; Bombieri, Nicola; Pulvirenti, Alfredo; Giugno, Rosalba

    2014-01-01

    Research attention has increasingly been devoted to understanding the functional roles of non-coding RNAs (ncRNAs). Many studies have demonstrated their deregulation in cancer and other human disorders. ncRNAs are also present in extracellular human body fluids such as serum and plasma, giving them great potential as non-invasive biomarkers. However, non-coding RNAs have been discovered relatively recently, and a comprehensive database including all of them is still missing. Reconstructing and visualizing the network of ncRNA interactions are important steps toward understanding their regulatory mechanisms in complex systems. This work presents ncRNA-DB, a NoSQL database that integrates ncRNA interaction data from a large number of well-established on-line repositories. The interactions involve RNA, DNA, proteins, and diseases. ncRNA-DB is available at http://ncrnadb.scienze.univr.it/ncrnadb/. It is equipped with three interfaces: web based, command-line, and a Cytoscape app called ncINetView. By accessing only one resource, users can search for ncRNAs and their interactions, build a network annotated with all known ncRNAs and associated diseases, and use all visual and mining features available in Cytoscape. PMID:25540777

  2. [Aesthetic surgery].

    PubMed

    Bruck, Johannes C

    2006-01-01

    The WHO describes health as physical, mental and social well-being. Ever since the establishment of plastic surgery, aesthetic surgery has been an integral part of this medical specialty. It aims at reconstructing subjective well-being by employing plastic surgical procedures as described in the educational code and regulations for specialists of plastic surgery. This code confirms that plastic surgery comprises cosmetic procedures for the entire body, which have to be applied with respect to psychological exploration and selection criteria. A wide variety of opinions resulting from very different motivations shows how difficult it is to differentiate aesthetic surgery as a therapeutic procedure from beauty surgery as a primarily economic service. Jurisprudence, guidelines for professional conduct, and ethical codes have tried to resolve this question. Regardless of the intention and ability of the health insurers, it is currently established that the moral and legal evaluation of advertisements for medical procedures depends on their purpose: advertising with the intent of luring patients into cosmetic procedures that do not aim to reconstruct a subjective physical disorder does not comply with a medical indication. If, however, the initiative originates with the patient requesting the amelioration of a subjective disorder of his body, a medical indication can be assumed.

  4. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matenine, D; Cote, G; Mascolo-Fortin, J

    2016-06-15

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons' trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon's algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code, and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66% of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, on the order of 1%. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger projection datasets at the cost of additional time, when compared to the fully pre-computed approach. This work was supported in part by the Fonds de recherche du Quebec - Nature et technologies (FRQ-NT). The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council of Canada (Grant No. 432290).
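On-demand system matrix computation amounts to ray-tracing one matrix row at a time. A minimal 2D Siddon-style tracer, an illustrative sketch on a toy unit-pixel grid rather than the authors' GPU code, might look like:

```python
# Minimal 2D Siddon-style ray tracer: returns the intersection length of
# a ray with each unit pixel it crosses, i.e. one row of the system
# matrix (illustrative sketch, not the paper's implementation).

import math

def trace_ray(p0, p1, nx, ny):
    """Intersection lengths of segment p0->p1 with an nx-by-ny unit-pixel grid."""
    (x0, y0), (x1, y1) = p0, p1
    length = math.hypot(x1 - x0, y1 - y0)
    ts = {0.0, 1.0}
    # Parametric t values where the ray crosses vertical/horizontal grid lines.
    if x1 != x0:
        ts.update((k - x0) / (x1 - x0) for k in range(nx + 1))
    if y1 != y0:
        ts.update((k - y0) / (y1 - y0) for k in range(ny + 1))
    ts = sorted(t for t in ts if 0.0 <= t <= 1.0)
    row = {}
    for ta, tb in zip(ts, ts[1:]):
        tm = 0.5 * (ta + tb)              # midpoint of the segment
        i = int(x0 + tm * (x1 - x0))      # pixel containing the midpoint
        j = int(y0 + tm * (y1 - y0))
        if 0 <= i < nx and 0 <= j < ny:
            row[(i, j)] = row.get((i, j), 0.0) + (tb - ta) * length
    return row

# A horizontal ray through the middle of a 4x4 grid crosses 4 pixels,
# each with intersection length 1.
row = trace_ray((0.0, 2.5), (4.0, 2.5), 4, 4)
```

Storing only such rows for the current projection angle, instead of the whole matrix, is the partial-storage idea evaluated in the abstract.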

  5. NR-code: Nonlinear reconstruction code

    NASA Astrophysics Data System (ADS)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  6. Equilibrium Reconstruction on the Large Helical Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuel A. Lazerson, D. Gates, D. Monticello, H. Neilson, N. Pomphrey, A. Reiman, S. Sakakibara, and Y. Suzuki

    Equilibrium reconstruction is commonly applied to axisymmetric toroidal devices. Recent advances in computational power and equilibrium codes have allowed for reconstructions of three-dimensional fields in stellarators and heliotrons. We present the first reconstructions of finite-beta discharges in the Large Helical Device (LHD). The plasma boundary and magnetic axis are constrained by the pressure profile from Thomson scattering. This results in a calculation of plasma beta without a priori assumptions about the equipartition of energy between species. Saddle loop arrays place additional constraints on the equilibrium. These reconstructions utilize STELLOPT, which calls VMEC. The VMEC equilibrium code assumes good nested flux surfaces. Reconstructed magnetic fields are fed into the PIES code, which relaxes this constraint, allowing for the examination of the effect of islands and stochastic regions on the magnetic measurements.

  7. Development of the PARVMEC Code for Rapid Analysis of 3D MHD Equilibrium

    NASA Astrophysics Data System (ADS)

    Seal, Sudip; Hirshman, Steven; Cianciosa, Mark; Wingen, Andreas; Unterberg, Ezekiel; Wilcox, Robert; ORNL Collaboration

    2015-11-01

    The VMEC three-dimensional (3D) MHD equilibrium code has been used extensively for designing stellarator experiments and analyzing experimental data in such strongly 3D systems. Recent applications of VMEC include 2D systems such as tokamaks (in particular, the D3D experiment), where the application of very small (δB/B ~ 10^-3) 3D resonant magnetic field perturbations renders the underlying assumption of axisymmetry invalid. In order to facilitate the rapid analysis of such equilibria (for example, for reconstruction purposes), we have undertaken the task of parallelizing the VMEC code (PARVMEC) to produce a scalable and rapidly convergent equilibrium code for use on parallel distributed-memory platforms. The parallelization task naturally splits into three distinct parts: 1) radial surfaces in the fixed-boundary part of the calculation; 2) the two 2D angular meshes needed to compute the Green's function integrals over the plasma boundary for the free-boundary part of the code; and 3) the block tridiagonal matrix needed to compute the full (3D) pre-conditioner near the final equilibrium state. Preliminary results show that scalability is achieved for tasks 1 and 3, with task 2 nearing completion. The impact of this work on the rapid reconstruction of D3D plasmas using PARVMEC in the V3FIT code will be discussed. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.

  8. INTEGRAL/IBIS 7-year All-Sky Hard X-ray Survey. I. Image reconstruction

    NASA Astrophysics Data System (ADS)

    Krivonos, R.; Revnivtsev, M.; Tsygankov, S.; Sazonov, S.; Vikhlinin, A.; Pavlinsky, M.; Churazov, E.; Sunyaev, R.

    2010-09-01

    This paper is the first in a series devoted to the hard X-ray whole sky survey performed by the INTEGRAL observatory over seven years. Here we present an improved method for image reconstruction with the IBIS coded mask telescope. The main improvements are related to the suppression of systematic effects that strongly limit sensitivity in the region of the Galactic plane (GP), especially in the crowded field of the Galactic center (GC). We extended the IBIS/ISGRI background model to take into account the Galactic ridge X-ray emission (GRXE). To suppress residual systematic artifacts on a reconstructed sky image, we applied nonparametric sky image filtering based on wavelet decomposition. The implemented modifications of the sky reconstruction method decrease the systematic noise in the ~20 Ms deep field of GC by ~44%, and practically remove it from the high-latitude sky images. New observational data sets, along with an improved reconstruction algorithm, allow us to conduct the hard X-ray survey with the best currently available minimal sensitivity 3.7 × 10^-12 erg s^-1 cm^-2 (~0.26 mCrab) in the 17-60 keV band at a 5σ detection level. The survey covers 90% of the sky down to the flux limit of 6.2 × 10^-11 erg s^-1 cm^-2 (~4.32 mCrab) and 10% of the sky area down to the flux limit of 8.6 × 10^-12 erg s^-1 cm^-2 (~0.60 mCrab). Based on observations with INTEGRAL, an ESA project with the instruments and science data center funded by ESA member states (especially the PI countries: Denmark, France, Germany, Italy, Switzerland, Spain), Czech Republic, and Poland, and with the participation of Russia and the USA.

  9. Unitary reconstruction of secret for stabilizer-based quantum secret sharing

    NASA Astrophysics Data System (ADS)

    Matsumoto, Ryutaroh

    2017-08-01

    We propose a unitary procedure to reconstruct the quantum secret for a quantum secret sharing scheme constructed from stabilizer quantum error-correcting codes. Erasure correcting procedures for stabilizer codes need to add missing shares for reconstruction of the quantum secret, while unitary reconstruction procedures for a certain class of quantum secret sharing schemes are known to work without adding missing shares. The proposed procedure also works without adding missing shares.

  10. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    PubMed

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    Full Monte Carlo (MC)-based SPECT reconstructions have a strong potential for correcting for image-degrading factors, but the reconstruction times are long. The objective of this study was to develop a highly parallel Monte Carlo code for fast, ordered subset expectation maximization (OSEM) reconstructions of SPECT/CT images. The MC code was written in the Compute Unified Device Architecture language for a computer with four graphics processing units (GPUs) (GeForce GTX Titan X, Nvidia, USA). This enabled simulations of parallel photon emissions from the voxel matrix (128^3 or 256^3). Each computed tomography (CT) number was converted to attenuation coefficients for photoabsorption, coherent scattering, and incoherent scattering. For photon scattering, the deflection angle was determined by the differential scattering cross sections. An angular response function was developed and used to model the accepted angles for photon interaction with the crystal, and a detector scattering kernel was used for modeling the photon scattering in the detector. Predefined energy and spatial resolution kernels for the crystal were used. The MC code was implemented in the OSEM reconstruction of clinical and phantom 177Lu SPECT/CT images. The Jaszczak image quality phantom was used to evaluate the performance of the MC reconstruction in comparison with attenuation-corrected (AC) OSEM reconstructions and AC OSEM reconstructions with resolution recovery correction (RRC). The performance of the MC code was 3200 million photons/s. The required number of photons emitted per voxel to obtain a sufficiently low noise level in the simulated image was 200 for a 128^3 voxel matrix. With this number of emitted photons per voxel, the MC-based OSEM reconstruction with ten subsets was performed within 20 s/iteration. The images converged after around six iterations, so the reconstruction time was around 3 min. The activity recovery for the spheres in the Jaszczak phantom was clearly improved with MC-based OSEM reconstruction: e.g., the activity recovery was 88% for the largest sphere, while it was 66% for AC-OSEM and 79% for RRC-OSEM. The GPU-based MC code generated an MC-based SPECT/CT reconstruction within a few minutes, and reconstructed patient images of 177Lu-DOTATATE treatments revealed clearly improved resolution and contrast.
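The OSEM reconstruction named above is built on the MLEM multiplicative update, which rescales each voxel by the back-projected ratio of measured to predicted counts. A toy version with a hand-made 2x2 system matrix (our illustration, not the paper's GPU code):

```python
# Toy MLEM iteration (the multiplicative update underlying OSEM; a
# sketch with a tiny hand-made system matrix, not the paper's GPU code).

def mlem(A, y, n_iter=200):
    """A: system matrix (rows = detector bins, cols = voxels); y: counts."""
    n = len(A[0])
    x = [1.0] * n                                  # flat initial estimate
    sens = [sum(A[i][j] for i in range(len(A))) for j in range(n)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n))          # forward project
                for i in range(len(A))]
        back = [sum(A[i][j] * y[i] / proj[i] for i in range(len(A)))
                for j in range(n)]                             # back-project ratio
        x = [x[j] * back[j] / sens[j] for j in range(n)]       # multiplicative update
    return x

# Two bins, two voxels, exactly determined: the true activity is (3, 1).
A = [[1.0, 0.0], [1.0, 1.0]]
y = [3.0, 4.0]
x = mlem(A, y)
```

OSEM simply applies this update to subsets of the detector bins in turn, which is what makes the per-subset simulation of photons on the GPU pay off.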

  11. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project: Draft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  13. Real-time photoacoustic and ultrasound dual-modality imaging system facilitated with graphics processing unit and code parallel optimization.

    PubMed

    Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun

    2013-08-01

    Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state of the art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously, at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo were achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
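The back-projection (BP) algorithm mentioned above can be illustrated with a minimal delay-and-sum sketch: each grid point accumulates, from every transducer, the signal sample whose time of flight matches the point-to-transducer distance. All parameters below (speed of sound, sampling rate, geometry) are arbitrary toy values, not the paper's:

```python
# Delay-and-sum back-projection sketch (an illustrative simplification
# of BP reconstruction; the GPU parallelization maps one grid point per
# thread, but the arithmetic per point is exactly this).

import math

C = 1.0          # speed of sound (arbitrary units)
FS = 100.0       # sampling rate (samples per time unit)

def backproject(signals, sensors, grid):
    """signals[k][n]: sampled PA signal at sensor k; returns image on grid."""
    image = []
    for (gx, gy) in grid:
        acc = 0.0
        for (sx, sy), sig in zip(sensors, signals):
            t = math.hypot(gx - sx, gy - sy) / C   # time of flight
            n = int(round(t * FS))                 # matching sample index
            if 0 <= n < len(sig):
                acc += sig[n]
        image.append(acc)
    return image

# Synthetic data: a point source at (0.5, 0.5) seen by two sensors.
sensors = [(0.0, 0.0), (1.0, 0.0)]
src = (0.5, 0.5)
signals = []
for s in sensors:
    sig = [0.0] * 200
    sig[int(round(math.hypot(src[0] - s[0], src[1] - s[1]) / C * FS))] = 1.0
    signals.append(sig)

grid = [(0.5, 0.5), (0.2, 0.8)]   # the source location and an empty point
image = backproject(signals, sensors, grid)
```

The delays add coherently only at the true source position, which is why the source point receives contributions from both sensors while the empty point receives none.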

  14. Ab initio reconstruction of transcriptomes of pluripotent and lineage committed cells reveals gene structures of thousands of lincRNAs

    PubMed Central

    Guttman, Mitchell; Garber, Manuel; Levin, Joshua Z.; Donaghey, Julie; Robinson, James; Adiconis, Xian; Fan, Lin; Koziol, Magdalena J.; Gnirke, Andreas; Nusbaum, Chad; Rinn, John L.; Lander, Eric S.; Regev, Aviv

    2010-01-01

    RNA-Seq provides an unbiased way to study a transcriptome, including both coding and non-coding genes. To date, most RNA-Seq studies have critically depended on existing annotations, and thus focused on expression levels and variation in known transcripts. Here, we present Scripture, a method to reconstruct the transcriptome of a mammalian cell using only RNA-Seq reads and the genome sequence. We apply it to mouse embryonic stem cells, neuronal precursor cells, and lung fibroblasts to accurately reconstruct the full-length gene structures for the vast majority of known expressed genes. We identify substantial variation in protein-coding genes, including thousands of novel 5′-start sites, 3′-ends, and internal coding exons. We then determine the gene structures of over a thousand lincRNA and antisense loci. Our results open the way to direct experimental manipulation of thousands of non-coding RNAs, and demonstrate the power of ab initio reconstruction to render a comprehensive picture of mammalian transcriptomes. PMID:20436462

  15. Feature reconstruction of LFP signals based on PLSR in the neural information decoding study.

    PubMed

    Yonghui Dong; Zhigang Shang; Mengmeng Li; Xinyu Liu; Hong Wan

    2017-07-01

    To address the problems of low Signal-to-Noise Ratio (SNR) and multicollinearity when Local Field Potential (LFP) signals are used to decode animal motion intention, this paper proposes a feature reconstruction of LFP signals based on partial least squares regression (PLSR). First, the feature information of the LFP coding band is extracted based on the wavelet transform. Then the PLSR model is constructed from the extracted LFP coding features. According to the multicollinearity among the coding features, several latent variables that contribute greatly to the steering behavior are obtained, and new LFP coding features are reconstructed. Finally, the K-Nearest Neighbor (KNN) method is used to classify the reconstructed coding features to verify the decoding performance. The results show that the proposed method achieves the highest accuracy of the four methods compared and that its decoding performance is robust.
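The final KNN classification step can be sketched as follows; the wavelet and PLSR stages are not reproduced, and the two clusters below (standing in for two steering behaviors) are invented for illustration:

```python
# Minimal K-Nearest Neighbor classifier, the decoding step applied to
# the reconstructed LFP features (illustrative sketch; feature
# extraction and PLSR are not reproduced here).

import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    dists = sorted(
        (math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two well-separated feature clusters standing in for two behaviors.
train_X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
           (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
train_y = ["left", "left", "left", "right", "right", "right"]
pred = knn_predict(train_X, train_y, (4.8, 5.0))
```

The PLSR step matters because KNN distances degrade when features are collinear and noisy; classifying in the latent-variable space sidesteps that.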

  16. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  17. Spatiotemporal coding in the cortex: information flow-based learning in spiking neural networks.

    PubMed

    Deco, G; Schürmann, B

    1999-05-15

    We introduce a learning paradigm for networks of integrate-and-fire spiking neurons that is based on an information-theoretic criterion. This criterion can be viewed as a first principle that accounts for the experimentally observed fact that cortical neurons display synchronous firing for some stimuli and not for others. The principle can be regarded as the postulation of a nonparametric reconstruction method as the optimization criterion for learning the required functional connectivity, which justifies and explains synchronous firing as a mechanism for feature binding in spatiotemporal coding. This can be expressed in an information-theoretic way by maximizing the discrimination ability between different sensory inputs in minimal time.
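The integrate-and-fire units such networks are built from can be sketched with the standard discrete-time leaky integrate-and-fire model. This is a textbook model, not the paper's network; the information-theoretic learning rule itself is not reproduced:

```python
# Discrete-time leaky integrate-and-fire neuron (standard textbook
# model of the spiking units used in such networks; all parameter
# values below are arbitrary).

def lif(inputs, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Return spike times for a sequence of input currents."""
    v, spikes = 0.0, []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in)   # leaky integration of the input
        if v >= v_thresh:             # threshold crossing -> spike
            spikes.append(t)
            v = v_reset               # reset after the spike
    return spikes

# A constant supra-threshold current makes the neuron fire periodically;
# synchrony across neurons then amounts to aligned spike times.
spikes = lif([0.15] * 100)
```

With the fixed point of the leak at 1.5 and threshold 1.0, the membrane charges for 11 steps per cycle, so the spike train is strictly periodic.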

  18. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to create the telescope (i.e., for wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern, including the correlation with the mask pattern. The matrix approach is reviewed, and other approaches to image reconstruction are described. The presentation also includes a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
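The correlation-based recovery described in the slides can be shown in a 1D toy: a mask built from a (7,3,1) cyclic difference set has a flat off-peak autocorrelation, so correlating the detector pattern with the mask recovers the source position. Real instruments such as IBIS use 2D URA patterns and balanced (mask/anti-mask) decoding; this sketch is ours:

```python
# 1D toy of coded-mask image recovery by correlation (illustrative
# only; real coded mask telescopes use 2D patterns and balanced decoding).

def correlate(detector, mask):
    """Cyclic cross-correlation of the detector pattern with the mask."""
    n = len(mask)
    return [sum(detector[(i + j) % n] * mask[j] for j in range(n))
            for i in range(n)]

# Mask open elements at {0, 1, 3}: a (7,3,1) cyclic difference set,
# whose cyclic autocorrelation is flat away from the peak.
mask = [1, 1, 0, 1, 0, 0, 0]
sky = [0, 0, 0, 0, 5, 0, 0]      # one point source at position 4

# Each open mask element shifts a copy of the sky onto the detector.
n = len(mask)
detector = [sum(mask[j] * sky[(i - j) % n] for j in range(n))
            for i in range(n)]

recon = correlate(detector, mask)   # peak at the source position
```

The reconstruction peaks at the source position with a uniform sidelobe floor, which is exactly the property that makes difference-set (URA-type) masks attractive.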

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemaire, H.; Barat, E.; Carrel, F.

    In this work, we tested maximum-likelihood expectation-maximization (MLEM) algorithms optimized for gamma imaging applications on two recent coded mask gamma cameras. We took advantage of the respective characteristics of the GAMPIX and Caliste HD-based gamma cameras: noise reduction thanks to the mask/anti-mask procedure but limited energy resolution for GAMPIX, and high energy resolution for Caliste HD. One of our short-term perspectives is to test MAPEM algorithms that integrate prior knowledge about the data to be reconstructed, adapted to the gamma imaging context. (authors)

  20. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with other traditional processing techniques for penumbral coded pinhole images, such as Wiener, Lucy-Richardson and blind deconvolution, this approach is brand new. In this method, the coded aperture processing was for the first time made independent of the point spread function of the image diagnostic system. In this way, the technical obstacle was overcome that traditional coded pinhole image processing faces owing to the uncertainty of the point spread function of the image diagnostic system. Based on this theoretical study, a simulation of penumbral imaging and image reconstruction was carried out and provided fairly good results. In the visible-light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and the penumbral image was formed with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good result.

  1. Implementation of Soft X-ray Tomography on NSTX

    NASA Astrophysics Data System (ADS)

    Tritz, K.; Stutman, D.; Finkenthal, M.; Granetz, R.; Menard, J.; Park, W.

    2003-10-01

    A set of poloidal ultrasoft X-ray arrays is operated by the Johns Hopkins group on NSTX. To enable MHD mode analysis independent of the magnetic reconstruction, the McCormick-Granetz tomography code developed at MIT is being adapted to the NSTX geometry. Tests of the code using synthetic data show that the present X-ray system is adequate for m=1 tomography. In addition, we have found that spline basis functions may be better suited than Bessel functions for the reconstruction of radially localized phenomena in NSTX. The tomography code was also used to determine the necessary array expansion and optimal array placement for the characterization of higher-m modes (m=2,3) in the future. Initial reconstruction of experimental soft X-ray data has been performed for m=1 internal modes, which are often encountered in high-beta NSTX discharges. The reconstruction of these modes will be compared to predictions from the M3D code and to magnetic measurements.

  2. Research on compressive sensing reconstruction algorithm based on total variation model

    NASA Astrophysics Data System (ADS)

    Gao, Yu-xuan; Sun, Huayan; Zhang, Tinghua; Du, Lin

    2017-12-01

    Compressed sensing, which breaks through the Nyquist sampling theorem, provides a strong theoretical basis for carrying out compression and sampling of image signals simultaneously. In imaging procedures that use compressed sensing theory, not only is the required storage space reduced, but the demands on detector resolution are also greatly relaxed. By exploiting the sparsity of the image signal and solving the mathematical model of inverse reconstruction, super-resolution imaging can be realized. The reconstruction algorithm is the most critical part of compressive sensing and largely determines the accuracy of the reconstructed image. A reconstruction algorithm based on the total variation (TV) model is well suited to the compressive reconstruction of two-dimensional images and preserves edge information well. To verify the performance of the algorithm, we simulate and analyze the reconstruction results of the TV-based algorithm under different coding modes, verifying its stability, and we compare typical reconstruction algorithms under the same coding mode. On the basis of the minimum total variation algorithm, an augmented Lagrangian function term is added and the optimal value is found by the alternating direction method. Experimental results show that the proposed reconstruction algorithm has great advantages over traditional classical TV-based algorithms: even at low measurement rates, the target image can be recovered quickly and accurately.
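A minimal 1D total-variation denoising example conveys the idea behind TV-based reconstruction: penalizing the sum of absolute differences flattens noise while preserving edges. The subgradient descent on a smoothed absolute value below is an illustrative stand-in for the augmented-Lagrangian / alternating-direction solver the paper uses, with all parameters chosen by us:

```python
# 1D total-variation denoising sketch: minimize
#   0.5 * ||x - y||^2 + lam * sum_i |x[i+1] - x[i]|
# by gradient descent on a smoothed |.| (illustrative stand-in for the
# alternating-direction solver described in the abstract).

import math

def tv_denoise(y, lam=0.5, step=0.05, n_iter=3000, eps=1e-2):
    x = list(y)
    n = len(y)
    for _ in range(n_iter):
        g = [x[i] - y[i] for i in range(n)]        # data-fidelity gradient
        for i in range(n - 1):
            d = x[i + 1] - x[i]
            s = lam * d / math.sqrt(d * d + eps)   # smoothed sign(d)
            g[i] -= s
            g[i + 1] += s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# A noisy step edge: TV flattens the two plateaus but keeps the jump.
y = [0.1, -0.1, 0.05, -0.05, 1.1, 0.9, 1.05, 0.95]
x = tv_denoise(y)
```

The quadratic (Tikhonov) alternative would blur the jump as well; TV's piecewise-constant prior is what preserves the edge.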

  3. Real-Time Sensor Validation System Developed for Reusable Launch Vehicle Testbed

    NASA Technical Reports Server (NTRS)

    Jankovsky, Amy L.

    1997-01-01

    A real-time system for validating sensor health has been developed for the reusable launch vehicle (RLV) program. This system, which is part of the propulsion checkout and control system (PCCS), was designed for use in an integrated propulsion technology demonstrator testbed built by Rockwell International and located at the NASA Marshall Space Flight Center. Work on the sensor health validation system, a result of an industry-NASA partnership, was completed at the NASA Lewis Research Center, then delivered to Marshall for integration and testing. The sensor validation software performs three basic functions: it identifies failed sensors, it provides reconstructed signals for failed sensors, and it identifies off-nominal system transient behavior that cannot be attributed to a failed sensor. The code is initiated by host software before the start of a propulsion system test, and it is called by the host program every control cycle. The output is posted to global memory for use by other PCCS modules. Output includes a list indicating the status of each sensor (i.e., failed, healthy, or reconstructed) and a list of features that are not due to a sensor failure. If a sensor failure is found, the system modifies that sensor's data array by substituting a reconstructed signal, when possible, for use by other PCCS modules.

  4. Optimization of Trade-offs in Error-free Image Transmission

    NASA Astrophysics Data System (ADS)

    Cox, Jerome R.; Moore, Stephen M.; Blaine, G. James; Zimmerman, John B.; Wallace, Gregory K.

    1989-05-01

    The availability of ubiquitous wide-area channels of both modest cost and higher transmission rate than voice-grade lines promises to allow the expansion of electronic radiology services to a larger community. The bandwidths of the new services becoming available from the Integrated Services Digital Network (ISDN) are typically limited to 128 Kb/s, almost two orders of magnitude lower than popular LANs can support. Using Discrete Cosine Transform (DCT) techniques, a compressed approximation to an image may be rapidly transmitted. However, intensity or resampling transformations of the reconstructed image may reveal otherwise invisible artifacts of the approximate encoding. A progressive transmission scheme reported in ISO Working Paper N800 offers an attractive solution to this problem by rapidly reconstructing an apparently undistorted image from the DCT coefficients and then subsequently transmitting the error image corresponding to the difference between the original and the reconstructed images. This approach achieves an error-free transmission without sacrificing the perception of rapid image delivery. Furthermore, subsequent intensity and resampling manipulations can be carried out with confidence. DCT coefficient precision affects the amount of error information that must be transmitted and, hence, the delivery speed of error-free images. This study calculates the overall information coding rate for six radiographic images as a function of DCT coefficient precision. The results demonstrate that a minimum occurs for each of the six images at an average coefficient precision of between 0.5 and 1.0 bits per pixel (b/p). Apparently undistorted versions of these six images can be transmitted with a coding rate of between 0.25 and 0.75 b/p while error-free versions can be transmitted with an overall coding rate between 4.5 and 6.5 b/p.
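The progressive scheme described above (send a coarsely quantized DCT approximation first, then the error image) can be sketched in 1D. The naive O(n²) orthonormal DCT and the quantization step size are toy choices of ours, not the scheme's parameters:

```python
# Progressive "approximate DCT image + error image" sketch: the coarse
# pass is fast and small, and adding the residual afterwards restores
# the original exactly (1D toy with a naive O(n^2) DCT).

import math

def dct(x):  # orthonormal DCT-II
    n = len(x)
    out = []
    for k in range(n):
        s = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(s * sum(x[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                           for i in range(n)))
    return out

def idct(c):  # orthonormal DCT-III, the inverse of the above
    n = len(c)
    out = []
    for i in range(n):
        out.append(sum((math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n))
                       * c[k] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                       for k in range(n)))
    return out

original = [10.0, 12.0, 14.0, 11.0, 9.0, 8.0, 13.0, 15.0]
q = 4.0                                     # coarse quantization step (toy value)
coarse = [q * round(c / q) for c in dct(original)]

approx = idct(coarse)                       # fast, approximate first pass
residual = [o - a for o, a in zip(original, approx)]   # the "error image"

final = [a + r for a, r in zip(approx, residual)]      # error-free result
```

The trade-off the study quantifies is exactly the split between the bits spent on `coarse` and the bits spent on `residual` as the coefficient precision varies.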

  5. Improved image decompression for reduced transform coding artifacts

    NASA Technical Reports Server (NTRS)

    Orourke, Thomas P.; Stevenson, Robert L.

    1994-01-01

    The perceived quality of images reconstructed from low bit rate compression is severely degraded by the appearance of transform coding artifacts. This paper proposes a method for producing higher quality reconstructed images based on a stochastic model for the image data. Quantization (scalar or vector) partitions the transform coefficient space and maps all points in a partition cell to a representative reconstruction point, usually taken as the centroid of the cell. The proposed image estimation technique selects the reconstruction point within the quantization partition cell that results in a reconstructed image that best fits a non-Gaussian Markov random field (MRF) image model. This approach results in a convex constrained optimization problem which can be solved iteratively. At each iteration, the gradient projection method is used to update the estimate based on the image model. In the transform domain, the resulting coefficient reconstruction points are projected to the particular quantization partition cells defined by the compressed image. Experimental results will be shown for images compressed using scalar quantization of block DCT coefficients and vector quantization of subband wavelet transform coefficients. The proposed image decompression provides a reconstructed image with reduced visibility of transform coding artifacts and superior perceived quality.
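
    The estimate-then-project loop can be illustrated in one dimension with a hypothetical quadratic smoothness prior and scalar quantization cells in the signal domain (the paper uses a non-Gaussian MRF prior and transform-domain cells):

```python
import numpy as np

def deblock(quantized, step, iters=500, lr=0.2):
    """Select, inside each quantization cell [q - step/2, q + step/2], the
    value giving the smoothest signal: a gradient step on the smoothness
    prior followed by projection back onto the cell, each iteration."""
    lo, hi = quantized - step / 2.0, quantized + step / 2.0
    x = quantized.astype(float)
    for _ in range(iters):
        d = x[1:] - x[:-1]
        grad = np.zeros_like(x)
        grad[1:] += 2 * d          # d/dx_i of sum (x_i - x_{i-1})^2
        grad[:-1] -= 2 * d
        x = x - lr * grad          # model (prior) step
        x = np.clip(x, lo, hi)     # projection onto the quantization cells
    return x

truth = np.linspace(0.0, 10.0, 32)
step = 2.0
q = np.round(truth / step) * step          # staircase (centroid) reconstruction
smooth = deblock(q, step)
# The smoothed estimate stays in every cell yet is closer to the true ramp.
assert np.mean((smooth - truth) ** 2) < np.mean((q - truth) ** 2)
```

    The projection guarantees the answer is still consistent with the compressed data; the prior only chooses a better point inside each cell.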

  6. TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information

    PubMed Central

    Struck, Torsten H

    2014-01-01

    Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, phylogenetic reconstructions must contend with artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals can mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, no program has allowed the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests) that utilize tree-based information such as nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and, being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118

  7. On scalable lossless video coding based on sub-pixel accurate MCTF

    NASA Astrophysics Data System (ADS)

    Yea, Sehoon; Pearlman, William A.

    2006-01-01

    We propose two approaches to scalable lossless coding of motion video. Both achieve an SNR-scalable bitstream up to lossless reconstruction, built upon subpixel-accurate wavelet video coding with motion-compensated temporal filtering (MCTF). The first approach is based upon a two-stage encoding strategy in which a lossy reconstruction layer is augmented by a following residual layer in order to obtain (nearly) lossless reconstruction. The key advantages of this approach include 'on-the-fly' determination of the bit budget distribution between the lossy and residual layers, the freedom to use almost any progressive lossy video coding scheme as the first layer, and an added feature of near-lossless compression. The second approach capitalizes on the fact that, thanks to the lifting implementation, the invertibility of MCTF can be maintained with arbitrary sub-pixel accuracy even in the presence of an extra truncation step for lossless reconstruction. Experimental results show that the proposed schemes achieve compression ratios not obtainable by intra-frame coders such as Motion JPEG-2000, thanks to their inter-frame coding nature. They are also shown to outperform the state-of-the-art non-scalable inter-frame coder H.264 (JM) in lossless mode, with the added benefit of bitstream embeddedness.

  8. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    NASA Astrophysics Data System (ADS)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries, which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that can deviate the direction of an X-ray beam, which can considerably increase the implementation costs. Hence, this paper describes a low-cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock approximation decreases by at most 12.5% compared with the phase coded aperture. Moreover, the quality of the reconstructions using the boolean approximations is at most 2.5 dB of PSNR below that of the phase coded aperture reconstructions.

  9. On the Measurements of Numerical Viscosity and Resistivity in Eulerian MHD Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rembiasz, Tomasz; Obergaulinger, Martin; Cerdá-Durán, Pablo

    2017-06-01

    We propose a simple ansatz for estimating the value of the numerical resistivity and the numerical viscosity of any Eulerian MHD code. We test this ansatz with the help of simulations of the propagation of (magneto)sonic waves, Alfvén waves, and the tearing mode (TM) instability using the MHD code Aenus. By comparing the simulation results with analytical solutions of the resistive-viscous MHD equations and an empirical ansatz for the growth rate of TMs, we measure the numerical viscosity and resistivity of Aenus. The comparison shows that the fast magnetosonic speed and wavelength are the characteristic velocity and length, respectively, of the aforementioned (relatively simple) systems. We also determine the dependence of the numerical viscosity and resistivity on the time integration method, the spatial reconstruction scheme and (to a lesser extent) the Riemann solver employed in the simulations. From the measured results, we infer the numerical resolution (as a function of the spatial reconstruction method) required to properly resolve the growth and saturation level of the magnetic field amplified by the magnetorotational instability in the post-collapsed core of massive stars. Our results show that it is most advantageous to resort to ultra-high-order methods (e.g., the ninth-order monotonicity-preserving method) to tackle this problem properly, in particular, in three-dimensional simulations.

  10. Structured Light Based 3d Scanning for Specular Surface by the Combination of Gray Code and Phase Shifting

    NASA Astrophysics Data System (ADS)

    Zhang, Yujia; Yilmaz, Alper

    2016-06-01

    Surface reconstruction using coded structured light is considered one of the most reliable techniques for high-quality 3D scanning. With a calibrated projector-camera stereo system, a light pattern is projected onto the scene and imaged by the camera. Correspondences between projected and recovered patterns are computed in the decoding process, which is used to generate a 3D point cloud of the surface. However, indirect illumination effects on the surface, such as subsurface scattering and interreflections, raise difficulties in reconstruction. In this paper, we apply the maximum min-SW gray code to reduce the indirect illumination effects on specular surfaces. We also analyze the errors of the maximum min-SW gray code compared with the conventional gray code, which shows that the maximum min-SW gray code is significantly better at reducing indirect illumination effects. To achieve sub-pixel accuracy, we simultaneously project high-frequency sinusoidal patterns onto the scene. For specular surfaces, however, high-frequency patterns are susceptible to decoding errors, and incorrect decoding results in a loss of depth resolution. Our method resolves this problem by combining the low-frequency maximum min-SW gray code with the high-frequency phase shifting code, which achieves dense 3D reconstruction of specular surfaces. Our contributions include: (i) a complete setup of a structured light based 3D scanning system; (ii) a novel combination of the maximum min-SW gray code and the phase shifting code, in which phase-shifting decoding first provides sub-pixel accuracy and the maximum min-SW gray code is then used to resolve the phase ambiguity. According to the experimental results and data analysis, our structured light based 3D scanning system enables high-quality dense reconstruction of scenes with a small number of images. Qualitative and quantitative comparisons are performed to demonstrate the advantages of our combined coding method.
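
    The conventional reflected-binary Gray code that such stripe patterns build on can be sketched as follows; the maximum min-SW variant reorders codewords to maximize the minimum stripe width and is not reproduced here:

```python
def gray_encode(n: int) -> int:
    # Reflected-binary Gray code: adjacent codewords differ in exactly one bit,
    # so a one-stripe decoding error perturbs the index by at most one.
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    # Invert by cumulatively XOR-ing the shifted codeword into the result.
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# A projector displaying k bit-plane patterns assigns each stripe the Gray
# codeword of its index; decoding the observed bits recovers the index.
for i in range(1024):
    assert gray_decode(gray_encode(i)) == i
assert bin(gray_encode(5) ^ gray_encode(6)).count("1") == 1  # single-bit step
```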

  11. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    NASA Astrophysics Data System (ADS)

    Snider, E. L.; Petrillo, G.

    2017-10-01

    LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions, and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  12. Using Spherical-Harmonics Expansions for Optics Surface Reconstruction from Gradients.

    PubMed

    Solano-Altamirano, Juan Manuel; Vázquez-Otero, Alejandro; Khikhlukha, Danila; Dormido, Raquel; Duro, Natividad

    2017-11-30

    In this paper, we propose a new algorithm to reconstruct optics surfaces (i.e., wavefronts) from gradients defined on a circular domain by means of spherical harmonics. The experimental results indicate that this algorithm achieves the same accuracy as reconstruction based on classical Zernike polynomials while using fewer polynomial terms, which potentially speeds up the wavefront reconstruction. Additionally, we provide an open-source C++ library, released under the terms of the GNU General Public License version 2 (GPLv2), in which several polynomial sets are coded. This library therefore constitutes a robust software alternative for wavefront reconstruction in the high-energy laser field, for optical surface reconstruction, and, more generally, for surface reconstruction from gradients. The library is a candidate for integration into control systems for optical devices, or for use in ad hoc simulations. Moreover, it has been developed with flexibility in mind, and, as such, the implementation includes the following features: (i) a mock-up generator of various incident wavefronts, intended to simulate the wavefronts commonly encountered in high-energy laser production; (ii) runtime selection of the library in charge of performing the algebraic computations; (iii) a profiling mechanism to measure and compare the performance of different steps of the algorithms and/or third-party linear algebra libraries. Finally, the library can be easily extended to include additional dependencies, such as porting the algebraic operations to specific architectures, in order to exploit hardware acceleration features.
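
    The gradient-fitting step can be illustrated with a hypothetical least-squares sketch: the coefficients of a basis expansion are fitted so that the basis derivatives match the measured slopes. Plain monomials stand in for the paper's spherical-harmonics (or Zernike) terms:

```python
import numpy as np

def basis_derivs(x, y):
    # Basis {x, y, x^2, x*y, y^2}; returns d/dx and d/dy of each term,
    # evaluated at the sample points (a stand-in for harmonic derivatives).
    dx = np.stack([np.ones_like(x), np.zeros_like(x), 2 * x, y, np.zeros_like(x)], 1)
    dy = np.stack([np.zeros_like(y), np.ones_like(y), np.zeros_like(y), x, 2 * y], 1)
    return dx, dy

rng = np.random.default_rng(1)
x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)  # circular-ish domain
true_c = np.array([0.5, -1.0, 2.0, 0.3, -0.7])           # surface coefficients
dx, dy = basis_derivs(x, y)
gx, gy = dx @ true_c, dy @ true_c                        # "measured" slopes

A = np.vstack([dx, dy])                                  # stack both slope sets
b = np.concatenate([gx, gy])
c_hat, *_ = np.linalg.lstsq(A, b, rcond=None)            # least-squares fit
assert np.allclose(c_hat, true_c, atol=1e-8)
```

    With fewer basis terms needed for a given accuracy, the least-squares system shrinks, which is where the claimed speed-up comes from.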

  13. Using Spherical-Harmonics Expansions for Optics Surface Reconstruction from Gradients

    PubMed Central

    Solano-Altamirano, Juan Manuel; Khikhlukha, Danila

    2017-01-01

    In this paper, we propose a new algorithm to reconstruct optics surfaces (i.e., wavefronts) from gradients defined on a circular domain by means of spherical harmonics. The experimental results indicate that this algorithm achieves the same accuracy as reconstruction based on classical Zernike polynomials while using fewer polynomial terms, which potentially speeds up the wavefront reconstruction. Additionally, we provide an open-source C++ library, released under the terms of the GNU General Public License version 2 (GPLv2), in which several polynomial sets are coded. This library therefore constitutes a robust software alternative for wavefront reconstruction in the high-energy laser field, for optical surface reconstruction, and, more generally, for surface reconstruction from gradients. The library is a candidate for integration into control systems for optical devices, or for use in ad hoc simulations. Moreover, it has been developed with flexibility in mind, and, as such, the implementation includes the following features: (i) a mock-up generator of various incident wavefronts, intended to simulate the wavefronts commonly encountered in high-energy laser production; (ii) runtime selection of the library in charge of performing the algebraic computations; (iii) a profiling mechanism to measure and compare the performance of different steps of the algorithms and/or third-party linear algebra libraries. Finally, the library can be easily extended to include additional dependencies, such as porting the algebraic operations to specific architectures, in order to exploit hardware acceleration features. PMID:29189722

  14. Estimation of 1945 to 1957 food consumption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945 to 1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  15. A novel shape-based coding-decoding technique for an industrial visual inspection system.

    PubMed

    Mukherjee, Anirban; Chaudhuri, Subhasis; Dutta, Pranab K; Sen, Siddhartha; Patra, Amit

    2004-01-01

    This paper describes a unique single camera-based dimension storage method for image-based measurement. The system has been designed and implemented in one of the integrated steel plants of India. The purpose of the system is to encode the frontal cross-sectional area of an ingot. The encoded data will be stored in a database to facilitate the future manufacturing diagnostic process. The compression efficiency and reconstruction error of the lossy encoding technique have been reported and found to be quite encouraging.

  16. A Unified Mathematical Framework for Coding Time, Space, and Sequences in the Hippocampal Region

    PubMed Central

    MacDonald, Christopher J.; Tiganj, Zoran; Shankar, Karthik H.; Du, Qian; Hasselmo, Michael E.; Eichenbaum, Howard

    2014-01-01

    The medial temporal lobe (MTL) is believed to support episodic memory, vivid recollection of a specific event situated in a particular place at a particular time. There is ample neurophysiological evidence that the MTL computes location in allocentric space and more recent evidence that the MTL also codes for time. Space and time represent a similar computational challenge; both are variables that cannot be simply calculated from the immediately available sensory information. We introduce a simple mathematical framework that computes functions of both spatial location and time as special cases of a more general computation. In this framework, experience unfolding in time is encoded via a set of leaky integrators. These leaky integrators encode the Laplace transform of their input. The information contained in the transform can be recovered using an approximation to the inverse Laplace transform. In the temporal domain, the resulting representation reconstructs the temporal history. By integrating movements, the equations give rise to a representation of the path taken to arrive at the present location. By modulating the transform with information about allocentric velocity, the equations code for position of a landmark. Simulated cells show a close correspondence to neurons observed in various regions for all three cases. In the temporal domain, novel secondary analyses of hippocampal time cells verified several qualitative predictions of the model. An integrated representation of spatiotemporal context can be computed by taking conjunctions of these elemental inputs, leading to a correspondence with conjunctive neural representations observed in dorsal CA1. PMID:24672015
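
    The encoding stage of this framework can be sketched numerically: a bank of leaky integrators with decay rates s_k maintains a discrete approximation of the Laplace transform of the input history (the approximate inverse-transform readout of the full model is omitted here):

```python
import numpy as np

def leaky_bank(f, dt, rates):
    """Bank of leaky integrators dF/dt = -s*F + f(t), integrated by forward
    Euler.  At time T each unit holds the history integral
    F(s) = int_0^T f(u) * exp(-s*(T-u)) du, a Laplace-transform encoding."""
    F = np.zeros_like(rates)
    for x in f:
        F += dt * (-rates * F + x)
    return F

dt, T = 0.001, 5.0
t = np.arange(0.0, T, dt)
f = np.exp(-t)                         # example input history f(t) = e^{-t}
rates = np.array([0.5, 1.5, 3.0])      # decay rates s_k of the units
F = leaky_bank(f, dt, rates)

# Analytic value of the history integral at time T for f(t) = e^{-t}:
exact = (np.exp(-rates * T) - np.exp(-T)) / (1.0 - rates)
assert np.allclose(F, exact, rtol=1e-2)
```

    The same bank codes position when f is a velocity signal, which is how the framework treats space and time as special cases of one computation.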

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Shawn

    This code consists of Matlab routines which enable the user to perform non-manifold surface reconstruction via triangulation from high dimensional point cloud data. The code was based on an algorithm originally developed in [Freedman (2007), An Incremental Algorithm for Reconstruction of Surfaces of Arbitrary Codimension, Computational Geometry: Theory and Applications, 36(2):106-116]. This algorithm has been modified to accommodate non-manifold surfaces according to the work described in [S. Martin and J.-P. Watson (2009), Non-Manifold Surface Reconstruction from High Dimensional Point Cloud Data, SAND #5272610]. The motivation for developing the code was a point cloud describing the molecular conformation space of cyclooctane (C8H16). Cyclooctane conformation space was represented using points in 72 dimensions (3 coordinates for each atom). The code was used to triangulate the point cloud and thereby study the geometry and topology of cyclooctane. Future applications are envisioned for peptides and proteins.

  18. Modified Mean-Pyramid Coding Scheme

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Romer, Richard

    1996-01-01

    The modified mean-pyramid coding scheme requires the transmission of slightly less data than the standard scheme, reducing the data-expansion factor from 1/3 to 1/12. Mean-pyramid schemes support progressive transmission of image data in a sequence of frames, in such a way that a coarse version of the image is reconstructed after receipt of the first frame and increasingly refined versions are reconstructed after receipt of each subsequent frame.
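
    A plain (unmodified) mean pyramid is easy to sketch: each level halves the resolution by 2x2 averaging, and transmitting the levels coarse-to-fine yields the progressive behavior described above. The modified scheme's redundancy reduction is not reproduced here:

```python
import numpy as np

def build_pyramid(img):
    # Repeatedly average 2x2 blocks; the list, reversed, is the transmission
    # order: a 1x1 mean first, then ever finer levels up to the full image.
    levels = [img.astype(float)]
    while levels[-1].shape[0] > 1:
        a = levels[-1]
        a = (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]) / 4
        levels.append(a)
    return levels[::-1]                # coarse first

img = np.arange(16.0).reshape(4, 4)
pyr = build_pyramid(img)
assert pyr[0].shape == (1, 1) and np.isclose(pyr[0][0, 0], img.mean())
assert np.array_equal(pyr[-1], img)    # final level reproduces the image
```

    Storing every level verbatim costs the extra 1/3 (1/4 + 1/16 + ...) that the modified scheme reduces by transmitting level differences instead.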

  19. Molecular phylogeny of 21 tropical bamboo species reconstructed by integrating non-coding internal transcribed spacer (ITS1 and 2) sequences and their consensus secondary structure.

    PubMed

    Ghosh, Jayadri Sekhar; Bhattacharya, Samik; Pal, Amita

    2017-06-01

    The unavailability of the reproductive structure and unpredictability of vegetative characters for the identification and phylogenetic study of bamboo prompted the application of molecular techniques for greater resolution and consensus. We first employed internal transcribed spacer (ITS1, 5.8S rRNA and ITS2) sequences to construct the phylogenetic tree of 21 tropical bamboo species. While the sequence alone could grossly reconstruct the traditional phylogeny amongst the 21 tropical species studied, some anomalies were encountered that prompted a further refinement of the phylogenetic analyses. Therefore, we integrated the secondary structure of the ITS sequences to derive an individual sequence-structure matrix to gain more resolution in the phylogenetic reconstruction. The results showed that ITS sequence-structure is a reliable alternative to the conventional phenotypic method for the identification of bamboo species. The best-fit topology obtained by the sequence-structure based phylogeny over the sequence-only one underscores closer clustering of all the studied Bambusa species (Sub-tribe Bambusinae), while Melocanna baccifera, which belongs to Sub-Tribe Melocanneae, disjointedly clustered as an out-group within the consensus phylogenetic tree. In this study, we demonstrated the dependability of the combined (ITS sequence+structure-based) approach over the sequence-only analysis for phylogenetic relationship assessment of bamboo.

  20. Inverse Heat Conduction Methods in the CHAR Code for Aerothermal Flight Data Reconstruction

    NASA Technical Reports Server (NTRS)

    Oliver, A. Brandon; Amar, Adam J.

    2016-01-01

    Reconstruction of flight aerothermal environments often requires the solution of an inverse heat transfer problem, which is an ill-posed problem of determining boundary conditions from discrete measurements in the interior of the domain. This paper will present the algorithms implemented in the CHAR code for use in reconstruction of EFT-1 flight data and future testing activities. Implementation details will be discussed, and alternative hybrid-methods that are permitted by the implementation will be described. Results will be presented for a number of problems.

  1. Inverse Heat Conduction Methods in the CHAR Code for Aerothermal Flight Data Reconstruction

    NASA Technical Reports Server (NTRS)

    Oliver, A Brandon; Amar, Adam J.

    2016-01-01

    Reconstruction of flight aerothermal environments often requires the solution of an inverse heat transfer problem, which is an ill-posed problem of specifying boundary conditions from discrete measurements in the interior of the domain. This paper will present the algorithms implemented in the CHAR code for use in reconstruction of EFT-1 flight data and future testing activities. Implementation nuances will be discussed, and alternative hybrid-methods that are permitted by the implementation will be described. Results will be presented for a number of one-dimensional and multi-dimensional problems.

  2. NVIDIA OptiX ray-tracing engine as a new tool for modelling medical imaging systems

    NASA Astrophysics Data System (ADS)

    Pietrzak, Jakub; Kacperski, Krzysztof; Cieślar, Marek

    2015-03-01

    The most accurate technique to model the X- and gamma radiation path through a numerically defined object is Monte Carlo simulation, which follows single photons according to their interaction probabilities. A simplified and much faster approach, which just integrates total interaction probabilities along selected paths, is known as ray tracing. Both techniques are used in medical imaging for simulating real imaging systems and as projectors required in iterative tomographic reconstruction algorithms. These approaches are ready for massive parallel implementation, e.g. on Graphics Processing Units (GPU), which can greatly accelerate the computation time at a relatively low cost. In this paper we describe the application of the NVIDIA OptiX ray-tracing engine, popular in professional graphics and rendering applications, as a new powerful tool for X- and gamma ray-tracing in medical imaging. It allows the implementation of a variety of physical interactions of rays with pixel-, mesh- or nurbs-based objects, and the recording of any required quantities, such as path integrals, interaction sites, deposited energies, and others. Using the OptiX engine we have implemented a code for rapid Monte Carlo simulations of Single Photon Emission Computed Tomography (SPECT) imaging, as well as a ray-tracing projector, which can be used in reconstruction algorithms. The engine generates efficient, scalable and optimized GPU code, ready to run on multi-GPU heterogeneous systems. We have compared the results of our simulations with the GATE package. With the OptiX engine the computation time of a Monte Carlo simulation can be reduced from days to minutes.
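
    As a toy illustration of the ray-tracing projector idea (not the OptiX implementation), the path integral of an attenuation map along a ray can be accumulated by dense sampling; production codes use exact grid traversal such as Siddon's algorithm, typically on the GPU:

```python
import numpy as np

def project(mu, x0, y0, dx, dy, length, n_samples=4000):
    """Approximate the line integral of attenuation map `mu` along a ray
    starting at (x0, y0) with unit direction (dx, dy), by midpoint sampling."""
    t = (np.arange(n_samples) + 0.5) * (length / n_samples)
    xs, ys = x0 + t * dx, y0 + t * dy
    ix = np.clip(xs.astype(int), 0, mu.shape[1] - 1)
    iy = np.clip(ys.astype(int), 0, mu.shape[0] - 1)
    inside = (xs >= 0) & (xs < mu.shape[1]) & (ys >= 0) & (ys < mu.shape[0])
    return np.sum(mu[iy, ix] * inside) * (length / n_samples)

mu = np.zeros((10, 10))
mu[4:6, :] = 0.5                       # a slab of attenuating material
# Vertical ray through the slab: path length inside is 2 pixels, so the
# integral should be 2 * 0.5 = 1.0.
val = project(mu, 3.5, -1.0, 0.0, 1.0, 12.0)
assert abs(val - 1.0) < 0.01
```

    Each ray is independent, which is exactly why the projector maps so well onto thousands of GPU threads.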

  3. The Gaugamela Battle Eclipse: An Archaeoastronomical Analysis

    NASA Astrophysics Data System (ADS)

    Polcaro, V. F.; Valsecchi, G. B.; Verderame, L.

    A total lunar eclipse occurred during the night preceding the decisive Battle of Gaugamela (20th September 331 BCE), when the Macedonian army, led by Alexander the Great, finally defeated the Persian king Darius and his army. This astronomical event, well known to historians, played a significant role in the outcome of the battle. The eclipse was described in detail by Babylonian astronomers, though, unfortunately, the text of their report has been only partially preserved. We have reconstructed the evolution of the phenomenon as it appeared to an observer in Babylonia, using the positional astronomy code "Planetario V2.0". On the basis of this reconstruction we suggest a number of restorations of the lost part of the text, allowing a finer astrological interpretation of the eclipse and of its influence on the mood of the armies that faced each other on the following morning.

  4. High-SNR spectrum measurement based on Hadamard encoding and sparse reconstruction

    NASA Astrophysics Data System (ADS)

    Wang, Zhaoxin; Yue, Jiang; Han, Jing; Li, Long; Jin, Yong; Gao, Yuan; Li, Baoming

    2017-12-01

    We investigate the denoising capabilities of the H-matrix and the cyclic S-matrix, combined with sparse reconstruction, as employed for spectrum measurement in the Pixel of Focal Plane Coded Visible Spectrometer, where the spectrum is sparse in a known basis. In the measurement process, the digital micromirror device plays an important role, implementing the Hadamard coding. In contrast with Hadamard transform spectrometry, this spectrometer may have the advantage of high efficiency owing to its shift invariance. Simulations and experiments show that the nonlinear solution with sparse reconstruction has a better signal-to-noise ratio than the linear solution, and that the H-matrix outperforms the cyclic S-matrix whether the reconstruction method is nonlinear or linear.
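
    The Hadamard multiplexing that the micromirror device implements can be sketched with a Sylvester H-matrix and a plain linear inverse (the nonlinear sparse-reconstruction decoder studied in the record is beyond this illustration):

```python
import numpy as np

def hadamard(n):
    # Sylvester construction of the n x n Hadamard matrix, n a power of 2.
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 8
H = hadamard(n)
spectrum = np.array([0.0, 1.0, 3.0, 0.0, 0.0, 2.0, 0.0, 0.0])  # sparse spectrum

y = H @ spectrum          # coded measurements: each reading mixes all channels
x = (H.T @ y) / n         # linear inverse, since H is orthogonal: H^T H = n I
assert np.allclose(x, spectrum)
```

    Because every measurement sums many channels, detector noise is averaged down on inversion, which is the classic multiplexing (Fellgett) advantage the record builds on.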

  5. Charm: Cosmic history agnostic reconstruction method

    NASA Astrophysics Data System (ADS)

    Porqueres, Natalia; Ensslin, Torsten A.

    2017-03-01

    Charm (cosmic history agnostic reconstruction method) reconstructs the cosmic expansion history in the framework of Information Field Theory. The reconstruction is performed via the iterative Wiener filter from an agnostic or from an informative prior. The charm code allows one to test the compatibility of several different data sets with the LambdaCDM model in a non-parametric way.

  6. PyPanda: a Python package for gene regulatory network reconstruction

    PubMed Central

    van IJzendoorn, David G.P.; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L.

    2016-01-01

    Summary: PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of ‘omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. Availability and implementation: The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl PMID:27402905

  7. PyPanda: a Python package for gene regulatory network reconstruction.

    PubMed

    van IJzendoorn, David G P; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L

    2016-11-01

    PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of 'omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl. © The Author 2016. Published by Oxford University Press.

  8. Coding Early Naturalists' Accounts into Long-Term Fish Community Changes in the Adriatic Sea (1800–2000)

    PubMed Central

    Fortibuoni, Tomaso; Libralato, Simone; Raicevich, Saša; Giovanardi, Otello; Solidoro, Cosimo

    2010-01-01

    Understanding changes in fish communities over the past centuries has important implications for conservation policy and marine resource management. However, reconstructing these changes is difficult because information on marine communities before the second half of the 20th century is, in most cases, anecdotal and merely qualitative. Therefore, historical qualitative records and modern quantitative data are not directly comparable, and their integration for long-term analyses is not straightforward. We developed a methodology that allows the coding of qualitative information provided by early naturalists into semi-quantitative information through an intercalibration with landing proportions. This approach allowed us to reconstruct and quantitatively analyze a 200-year-long time series of fish community structure indicators in the Northern Adriatic Sea (Mediterranean Sea). Our analysis provides evidence of long-term changes in fish community structure, including the decline of Chondrichthyes and of large-sized and late-maturing species. This work highlights the importance of broadening the time-frame through which we look at marine ecosystem changes and provides a methodology for exploiting historical qualitative sources in a quantitative framework. To this purpose, naturalists' eyewitness accounts proved useful for extending the analysis of fish communities back in time, well before the onset of field-based monitoring programs. PMID:21103349

  9. Beyond filtered backprojection: A reconstruction software package for ion beam microtomography data

    NASA Astrophysics Data System (ADS)

    Habchi, C.; Gordillo, N.; Bourret, S.; Barberet, Ph.; Jovet, C.; Moretto, Ph.; Seznec, H.

    2013-01-01

    A new version of the TomoRebuild data reduction software package is presented, for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps, and the intermediate results may be checked if necessary. Although no additional graphic library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it now runs about 10 times faster. In addition, the Maximum Likelihood Expectation Maximization (MLEM) algorithm and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example of experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which opens a new perspective on tomography using a low number of projections or a limited angle.
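
    The MLEM update mentioned above can be illustrated with a generic textbook sketch (this is not TomoRebuild's implementation; the system matrix and sizes are made up):

    ```python
    import numpy as np

    # Toy MLEM reconstruction: A is a hypothetical system matrix,
    # y the measured projections. Noiseless data for simplicity.
    rng = np.random.default_rng(0)
    A = rng.random((40, 16))           # 40 projection bins, 16 voxels
    x_true = rng.random(16) + 0.1      # strictly positive object
    y = A @ x_true

    x = np.ones(16)                    # MLEM requires a positive start
    sens = A.T @ np.ones(40)           # sensitivity image A^T 1
    for _ in range(200):
        ratio = y / (A @ x)            # measured vs. forward-projected data
        x *= (A.T @ ratio) / sens      # multiplicative MLEM update
    ```

    OSEM accelerates this scheme by applying the same multiplicative update over ordered subsets of the projections rather than the full set at each pass.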

  10. Joint reconstruction of multiview compressed images.

    PubMed

    Thirumalai, Vijayaraghavan; Frossard, Pascal

    2013-05-01

    Distributed representation of correlated multiview images is an important problem that arises in vision sensor networks. This paper concentrates on the joint reconstruction problem, where the distributively compressed images are decoded together in order to benefit from the image correlation. We consider a scenario where the images captured at different viewpoints are encoded independently using common coding solutions (e.g., JPEG) with a balanced rate distribution among different cameras. A central decoder first estimates the inter-view image correlation from the independently compressed data. The joint reconstruction is then cast as a constrained convex optimization problem that reconstructs total-variation (TV) smooth images, which comply with the estimated correlation model. At the same time, we add constraints that force the reconstructed images to be as close as possible to their compressed versions. We show through experiments that the proposed joint reconstruction scheme outperforms independent reconstruction in terms of image quality, for a given target bit rate. In addition, the decoding performance of our algorithm compares advantageously to state-of-the-art distributed coding schemes based on motion learning and on the DISCOVER algorithm.
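
    The TV-regularized, stay-close-to-the-compressed-version idea can be sketched in one dimension. This is a generic smoothed-TV gradient descent, not the authors' constrained convex solver, and all parameter values below are arbitrary:

    ```python
    import numpy as np

    # Recover a piecewise-constant signal x from a noisy "decoded" version z
    # by minimizing ||x - z||^2 + lam * TV_eps(x), where TV_eps is a smoothed
    # total variation (eps avoids the non-differentiable corner at 0).
    def tv_denoise(z, lam=0.5, eps=1e-6, step=0.1, iters=2000):
        x = z.copy()
        for _ in range(iters):
            d = np.diff(x)
            w = d / np.sqrt(d * d + eps)   # gradient of smoothed |d|
            g_tv = np.zeros_like(x)
            g_tv[:-1] -= w                 # d_i depends on -x[i] ...
            g_tv[1:] += w                  # ... and on +x[i+1]
            x -= step * (2 * (x - z) + lam * g_tv)
        return x

    truth = np.repeat([0.0, 1.0, 0.0], 20)         # blocky test signal
    noisy = truth + 0.1 * np.random.default_rng(1).normal(size=truth.size)
    rec = tv_denoise(noisy)
    ```

    In the paper's setting the data-fidelity term additionally couples the views through the estimated correlation model; the sketch keeps only the TV-plus-proximity structure.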

  11. EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.

    PubMed

    Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D

    2012-01-01

    Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large-scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g., sequence conservation, orthology, synteny) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories, together with their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.

  12. Ultra-narrow bandwidth voice coding

    DOEpatents

    Holzrichter, John F [Berkeley, CA; Ng, Lawrence C [Danville, CA

    2007-01-09

    A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.

  13. Modeling of Plasma Pressure Effects on ELM Suppression With RMP in DIII-D

    NASA Astrophysics Data System (ADS)

    Orlov, D. M.; Moyer, R. A.; Mordijck, S.; Evans, T. E.; Osborne, T. H.; Snyder, P. B.; Unterberg, E. A.; Fenstermacher, M. E.

    2009-11-01

    Resonant magnetic perturbations (RMPs) are used to control the pedestal pressure gradient in both low- and high-collisionality (ν*) DIII-D plasmas. In this work we have analyzed several discharges with different levels of triangularity, different neutral beam injection power levels, and with βN ranging from 1.5 to 2.3. The field line integration code TRIP3D was used to model the magnetic perturbation in ELMing and ELM-suppressed phases during the RMP pulse. The results of this modeling showed very little effect of βN on the structure of the vacuum magnetic field during ELM suppression using n=3 RMPs. Kinetic equilibrium reconstructions showed a decrease in bootstrap current during the RMP. Linear peeling-ballooning stability analysis performed with the ELITE code suggested that the ELMs which persist during the RMP are not Type I ELMs. Identification of these Dα spikes is ongoing work.

  14. Coded mask telescopes for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Skinner, G. K.; Ponman, T. J.

    1987-04-01

    The principles of the coded mask technique are discussed together with the methods of image reconstruction. The coded mask telescopes built at the University of Birmingham are described, including the SL 1501 coded mask X-ray telescope flown on a Skylark rocket and the Coded Mask Imaging Spectrometer (COMIS) planned for the Soviet space station Mir. A diagram of a coded mask telescope and some designs for coded masks are included.

  15. NetMiner-an ensemble pipeline for building genome-wide and high-quality gene co-expression network using massive-scale RNA-seq samples.

    PubMed

    Yu, Hua; Jiao, Bingke; Lu, Lu; Wang, Pengfei; Chen, Shuangcheng; Liang, Chengzhi; Liu, Wei

    2018-01-01

    Accurately reconstructing gene co-expression networks is of great importance for uncovering the genetic architecture underlying complex and various phenotypes. The recent availability of high-throughput RNA-seq sequencing has made genome-wide detection and quantification of novel, rare and low-abundance transcripts practical. However, its potential merits in reconstructing gene co-expression networks have still not been well explored. Using massive-scale RNA-seq samples, we have designed an ensemble pipeline, called NetMiner, for building a genome-scale and high-quality Gene Co-expression Network (GCN) by integrating three frequently used inference algorithms. We constructed an RNA-seq-based GCN in rice, a monocot species. The quality of the network obtained by our method was verified and evaluated against curated gene functional association data sets; it clearly outperformed each single method. In addition, the powerful capability of the network for associating genes with functions and agronomic traits was shown by enrichment analysis and case studies. In particular, we demonstrated the potential value of our proposed method to predict the biological roles of unknown protein-coding genes, long non-coding RNA (lncRNA) genes and circular RNA (circRNA) genes. Our results provide a valuable and highly reliable data source for selecting key candidate genes for subsequent experimental validation. To facilitate identification of novel genes regulating important biological processes and phenotypes in other plants or animals, we have published the source code of NetMiner, making it freely available at https://github.com/czllab/NetMiner.

  16. Spotted star mapping by light curve inversion: Tests and application to HD 12545

    NASA Astrophysics Data System (ADS)

    Kolbin, A. I.; Shimansky, V. V.

    2013-06-01

    A code for mapping the surfaces of spotted stars is developed. The code is based on the analysis of rotationally modulated light curves. We simulate the reconstruction process for a stellar surface and present the results of the simulation. The reconstruction artifacts caused by the ill-posed nature of the problem are characterized. The surface of the spotted component of the system HD 12545 is mapped using this procedure.

  17. Maximising information recovery from rank-order codes

    NASA Astrophysics Data System (ADS)

    Sen, B.; Furber, S.

    2007-04-01

    The central nervous system encodes information in sequences of asynchronously generated voltage spikes, but the precise details of this encoding are not well understood. Thorpe proposed rank-order codes as an explanation of the observed speed of information processing in the human visual system. The work described in this paper is inspired by the performance of SpikeNET, a biologically inspired neural architecture using rank-order codes for information processing, and is based on the retinal model developed by VanRullen and Thorpe. This model mimics retinal information processing by passing an input image through a bank of Difference of Gaussian (DoG) filters and then encoding the resulting coefficients in rank order. To test the effectiveness of this encoding in capturing the information content of an image, the rank-order representation is decoded to reconstruct an image that can be compared with the original. The reconstruction uses a look-up table to infer the filter coefficients from their rank in the encoded image. Since the DoG filters are approximately orthogonal functions, they are treated as their own inverses in the reconstruction process. We obtained a quantitative measure of the perceptually important information retained in the reconstructed image relative to the original using a slightly modified version of an objective metric proposed by Petrovic. It is observed that around 75% of the perceptually important information is retained in the reconstruction. In the present work we reconstruct the input using a pseudo-inverse of the DoG filter bank, with the aim of improving the reconstruction and thereby extracting more information from the rank-order encoded stimulus. We observe an increase of 10–15% in the information retrieved from a reconstructed stimulus as a result of inverting the filter bank.
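
    The encode/decode round trip can be sketched with a stand-in filter bank. Random filters replace the DoG bank and the look-up table is idealized (it holds the exact sorted magnitudes); both are assumptions for illustration only:

    ```python
    import numpy as np

    # Rank-order coding sketch: the decoder sees only the ORDER of the
    # coefficients; magnitudes are re-assigned from a look-up table, then
    # the signal is rebuilt either with F^T (filters treated as their own
    # inverses) or with the pseudo-inverse of the filter bank.
    rng = np.random.default_rng(2)
    F = rng.normal(size=(64, 64))          # filter bank: one filter per row
    img = rng.normal(size=64)              # stand-in "image"

    coef = F @ img
    order = np.argsort(-np.abs(coef))      # rank order, largest first
    lut = np.sort(np.abs(coef))[::-1]      # idealized magnitude look-up table

    decoded = np.zeros_like(coef)
    decoded[order] = lut * np.sign(coef[order])   # magnitude from rank

    rec_transpose = F.T @ decoded                 # filters as own inverses
    rec_pinv = np.linalg.pinv(F) @ decoded        # inverted filter bank

    err_t = np.linalg.norm(rec_transpose - img)
    err_p = np.linalg.norm(rec_pinv - img)
    ```

    With a generic (non-orthogonal) bank the pseudo-inverse reconstruction is exact under this idealized table, while the transpose is not, which mirrors the paper's motivation for inverting the filter bank.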

  18. 42 CFR 73.3 - HHS select agents and toxins.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... virus Monkeypox virus Reconstructed replication competent forms of the 1918 pandemic influenza virus containing any portion of the coding regions of all eight gene segments (Reconstructed 1918 Influenza virus...

  19. Noniterative MAP reconstruction using sparse matrix representations.

    PubMed

    Cao, Guangzhi; Bouman, Charles A; Webb, Kevin J

    2009-09-01

    We present a method for noniterative maximum a posteriori (MAP) tomographic reconstruction which is based on the use of sparse matrix representations. Our approach is to precompute and store the inverse matrix required for MAP reconstruction. This approach has generally not been used in the past because the inverse matrix is typically large and fully populated (i.e., not sparse). In order to overcome this problem, we introduce two new ideas. The first idea is a novel theory for the lossy source coding of matrix transformations, which we refer to as matrix source coding. This theory is based on a distortion metric that reflects the distortions produced in the final matrix-vector product, rather than the distortions in the coded matrix itself. The resulting algorithms are shown to require orthonormal transformations of both the measurement data and the matrix rows and columns before quantization and coding. The second idea is a method for efficiently storing and computing the required orthonormal transformations, which we call a sparse-matrix transform (SMT). The SMT is a generalization of the classical FFT in that it uses butterflies to compute an orthonormal transform; but unlike an FFT, the SMT uses the butterflies in an irregular pattern, and is numerically designed to best approximate the desired transforms. We demonstrate the potential of the noniterative MAP reconstruction with examples from optical tomography. The method requires offline computation to encode the inverse transform. However, once these offline computations are completed, the noniterative MAP algorithm is shown to reduce both storage and computation by well over two orders of magnitude, compared to linear iterative reconstruction methods.

  20. Filtered gradient reconstruction algorithm for compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Mejia, Yuri; Arguello, Henry

    2017-04-01

    Compressive sensing matrices are traditionally based on random Gaussian and Bernoulli entries. In practice, however, they are subject to physical constraints, and their structure rarely follows a dense matrix distribution; this is the case for the matrix related to compressive spectral imaging (CSI). The CSI matrix represents the integration of coded and shifted versions of the spectral bands. A spectral image can be recovered from CSI measurements by using iterative algorithms for linear inverse problems that minimize an objective function including a quadratic error term combined with a sparsity regularization term. However, current algorithms are slow because they do not exploit the structure and sparse characteristics of the CSI matrices. A gradient-based CSI reconstruction algorithm, which introduces a filtering step in each iteration of a conventional CSI reconstruction algorithm and yields improved image quality, is proposed. Motivated by the structure of the CSI matrix, Φ, this algorithm modifies the iterative solution such that it is forced to converge to a filtered version of the residual ΦTy, where y is the compressive measurement vector. We show that the filtered algorithm converges to results of better quality than the unfiltered version. Simulation results highlight the relative performance gain over existing iterative algorithms.
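
    The "filter each iteration" idea can be sketched as a plain gradient (Landweber) iteration with a low-pass filtering step after every update. The matrix, signal, kernel, and step size below are invented for illustration; this is not the authors' exact algorithm:

    ```python
    import numpy as np

    # Toy underdetermined problem: recover a blocky signal from
    # compressive measurements y = Phi @ x_true.
    rng = np.random.default_rng(3)
    Phi = rng.normal(size=(30, 60)) / np.sqrt(30)   # sensing matrix
    x_true = np.zeros(60)
    x_true[10:20] = 1.0                              # smooth (blocky) signal
    y = Phi @ x_true

    kernel = np.array([0.25, 0.5, 0.25])             # simple low-pass filter

    x = np.zeros(60)
    step = 0.2
    for _ in range(500):
        x = x + step * Phi.T @ (y - Phi @ x)         # gradient step
        x = np.convolve(x, kernel, mode="same")      # filtering step
    ```

    The filtering step biases the iterates toward smooth solutions, which is the mechanism the paper exploits (with filters tailored to the structure of Φ) to improve reconstruction quality.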

  1. Development of an EMC3-EIRENE Synthetic Imaging Diagnostic

    NASA Astrophysics Data System (ADS)

    Meyer, William; Allen, Steve; Samuell, Cameron; Lore, Jeremy

    2017-10-01

    2D and 3D flow measurements are critical for validating numerical codes such as EMC3-EIRENE. Toroidal symmetry assumptions preclude tomographic reconstruction of 3D flows from single camera views. In addition, the resolution of the grids utilized in numerical code models can easily surpass the resolution of physical camera diagnostic geometries. For these reasons we have developed a Synthetic Imaging Diagnostic capability for forward projection comparisons of EMC3-EIRENE model solutions with the line integrated images from the Doppler Coherence Imaging diagnostic on DIII-D. The forward projection matrix is 2.8 Mpixel by 6.4 Mcells for the non-axisymmetric case we present. For flow comparisons, both simple line integral, and field aligned component matrices must be calculated. The calculation of these matrices is a massive embarrassingly parallel problem and performed with a custom dispatcher that allows processing platforms to join mid-problem as they become available, or drop out if resources are needed for higher priority tasks. The matrices are handled using standard sparse matrix techniques. Prepared by LLNL under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. DOE, Office of Science, Office of Fusion Energy Sciences. LLNL-ABS-734800.

  2. A New Method for Coronal Magnetic Field Reconstruction

    NASA Astrophysics Data System (ADS)

    Yi, Sibaek; Choe, Gwang-Son; Cho, Kyung-Suk; Kim, Kap-Sung

    2017-08-01

    A precise method of coronal magnetic field reconstruction (extrapolation) is an indispensable tool for understanding various solar activities. A variety of reconstruction codes have been developed so far and are available to researchers, but each has its own shortcomings. In this paper, a new efficient method for coronal magnetic field reconstruction is presented. The method imposes only the normal components of the magnetic field and current density at the bottom boundary, to avoid overspecifying the reconstruction problem, and employs vector potentials to guarantee divergence-freeness. In our method, the normal component of the current density is imposed not by adjusting the tangential components of A, but by adjusting its normal component. This allows us to avoid a numerical instability that occasionally arises in codes using A. In real reconstruction problems, the information for the lateral and top boundaries is absent. The arbitrariness of the boundary conditions imposed there, as well as various preprocessing choices, brings about the diversity of resulting solutions. We impose the source surface condition at the top boundary to accommodate flux imbalance, which always shows up in magnetograms. To enhance the convergence rate, we equip our code with a gradient-method type accelerator. Our code is tested on two analytical force-free solutions. When the solution is given only at the bottom boundary, our result surpasses competitors in most figures of merit devised by Schrijver et al. (2006). We have also applied our code to a real active region, NOAA 11974, in which two M-class flares and a halo CME took place. The EUV observation shows a sudden appearance of an erupting loop before the first flare. Our numerical solutions show that two entwining flux tubes exist before the flare and that their entanglement is released after the CME, with one of them opening up. We suggest that the erupting loop is created by magnetic reconnection between the two entwining flux tubes and later appears in the coronagraph as the major constituent of the observed CME.
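
    The divergence-freeness guaranteed by working with vector potentials can be checked numerically. A minimal sketch with an arbitrary smooth test potential (made up for illustration, not from the paper's code): because finite-difference operators along different axes commute, the discrete divergence of a discrete curl vanishes to machine precision.

    ```python
    import numpy as np

    # B = curl A is divergence-free; with commuting finite-difference
    # operators this holds exactly (up to floating-point roundoff).
    n = 16
    ax = np.linspace(0.0, 1.0, n)
    X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")

    # arbitrary smooth vector potential (invented test field)
    Ax, Ay, Az = np.sin(Y * Z), X * Z**2, np.cos(X + Y)

    def d(f, axis):
        return np.gradient(f, ax, axis=axis)   # partial derivative

    Bx = d(Az, 1) - d(Ay, 2)                   # curl components
    By = d(Ax, 2) - d(Az, 0)
    Bz = d(Ay, 0) - d(Ax, 1)

    divB = d(Bx, 0) + d(By, 1) + d(Bz, 2)      # should be ~0 everywhere
    ```

    This is why parameterizing the reconstruction by A, rather than by B directly, removes the divergence constraint from the problem.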

  3. Evaluation of the Xeon phi processor as a technology for the acceleration of real-time control in high-order adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Barr, David; Basden, Alastair; Dipper, Nigel; Schwartz, Noah; Vick, Andy; Schnetler, Hermine

    2014-08-01

    We present wavefront reconstruction acceleration of high-order AO systems using an Intel Xeon Phi processor. The Xeon Phi is a coprocessor providing many integrated cores and designed for accelerating compute intensive, numerical codes. Unlike other accelerator technologies, it allows virtually unchanged C/C++ to be recompiled to run on the Xeon Phi, giving the potential of making development, upgrade and maintenance faster and less complex. We benchmark the Xeon Phi in the context of AO real-time control by running a matrix vector multiply (MVM) algorithm. We investigate variability in execution time and demonstrate a substantial speed-up in loop frequency. We examine the integration of a Xeon Phi into an existing RTC system and show that performance improvements can be achieved with limited development effort.

  4. The Potential of Low-Cost Rpas for Multi-View Reconstruction of Sub-Vertical Rock Faces

    NASA Astrophysics Data System (ADS)

    Thoeni, K.; Guccione, D. E.; Santise, M.; Giacomini, A.; Roncella, R.; Forlani, G.

    2016-06-01

    The current work investigates the potential of two low-cost off-the-shelf quadcopters for multi-view reconstruction of sub-vertical rock faces. The two platforms used are a DJI Phantom 1 equipped with a Gopro Hero 3+ Black and a DJI Phantom 3 Professional with integrated camera. The study area is a small sub-vertical rock face. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically but the flights were performed manually as the investigated rock face is very irregular which required manual adjustment of the yaw and roll for optimal coverage. The digital images were processed with commercial SfM software packages. Several processing settings were investigated in order to find out the one providing the most accurate 3D reconstruction of the rock face. To this aim, all 3D models produced with both platforms are compared to a point cloud obtained with a terrestrial laser scanner. Firstly, the difference between the use of coded ground control targets and the use of natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible, as sub-vertical rock faces are not easily accessible. Nevertheless, natural features can provide a good alternative if wisely chosen as shown in this work. Secondly, the influence of using fixed interior orientation parameters or self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. To support such empirical finding, a numerical investigation using a Monte Carlo simulation was performed.

  5. Accelerating the reconstruction of magnetic resonance imaging by three-dimensional dual-dictionary learning using CUDA.

    PubMed

    Jiansen Li; Jianqi Sun; Ying Song; Yanran Xu; Jun Zhao

    2014-01-01

    An effective way to improve the data acquisition speed of magnetic resonance imaging (MRI) is to use under-sampled k-space data, and dictionary learning can be used to maintain the reconstruction quality. A three-dimensional dictionary trains its atoms on blocks of data, which can exploit the spatial correlation among slices. The dual-dictionary learning method includes a low-resolution dictionary and a high-resolution dictionary, used for sparse coding and image updating, respectively. However, the amount of data is huge for three-dimensional reconstruction, especially when the number of slices is large, so the procedure is time-consuming. In this paper, we first use NVIDIA's compute unified device architecture (CUDA) programming model to design parallel algorithms on the graphics processing unit (GPU) to accelerate the reconstruction procedure. The main optimizations target the dictionary learning and image updating parts, such as the orthogonal matching pursuit (OMP) algorithm and the k-singular value decomposition (K-SVD) algorithm. We then develop another version of the CUDA code with algorithmic optimization. Experimental results show that a speedup of more than 324× is achieved compared with the CPU-only code when the number of MRI slices is 24.
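
    The OMP kernel that the paper moves to the GPU can be sketched in plain NumPy. This is a textbook version with made-up dictionary sizes; the CUDA implementation parallelizes these same steps across many signal blocks at once:

    ```python
    import numpy as np

    def omp(D, y, k):
        """Greedy k-sparse coding: find x with y ~= D @ x.
        D has unit-norm columns (atoms)."""
        residual, support = y.copy(), []
        x = np.zeros(D.shape[1])
        for _ in range(k):
            j = int(np.argmax(np.abs(D.T @ residual)))  # best-matching atom
            support.append(j)
            sub = D[:, support]
            coef, *_ = np.linalg.lstsq(sub, y, rcond=None)  # refit on support
            residual = y - sub @ coef                       # orthogonalize
        x[support] = coef
        return x

    rng = np.random.default_rng(4)
    D = rng.normal(size=(20, 50))
    D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms
    x_true = np.zeros(50)
    x_true[[3, 17]] = [1.5, -2.0]           # 2-sparse code
    y = D @ x_true
    x_hat = omp(D, y, k=2)
    ```

    K-SVD alternates this sparse-coding step with per-atom dictionary updates, which is why accelerating OMP dominates the overall speedup.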

  6. Reconstruction of coded aperture images

    NASA Technical Reports Server (NTRS)

    Bielefeld, Michael J.; Yin, Lo I.

    1987-01-01

    The balanced correlation method and the Maximum Entropy Method (MEM) were implemented to reconstruct a laboratory X-ray source as imaged by a Uniformly Redundant Array (URA) system. Although the MEM has advantages over the balanced correlation method, it is computationally time consuming because of the iterative nature of its solution. The Massively Parallel Processor (MPP), with its parallel array structure, is ideally suited for such computations. These preliminary results indicate that it is possible to use the MEM in future coded-aperture experiments with the help of the MPP.

  7. Hierarchical image coding with diamond-shaped sub-bands

    NASA Technical Reports Server (NTRS)

    Li, Xiaohui; Wang, Jie; Bauer, Peter; Sauer, Ken

    1992-01-01

    We present a sub-band image coding/decoding system using a diamond-shaped pyramid frequency decomposition to more closely match visual sensitivities than conventional rectangular bands. Filter banks are composed of simple, low order IIR components. The coder is especially designed to function in a multiple resolution reconstruction setting, in situations such as variable capacity channels or receivers, where images must be reconstructed without the entire pyramid of sub-bands. We use a nonlinear interpolation technique for lost subbands to compensate for loss of aliasing cancellation.

  8. Study of noise propagation and the effects of insufficient numbers of projection angles and detector samplings for iterative reconstruction using planar-integral data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B.; Zeng, G. L.

    2006-09-15

    A rotating slat collimator can be used to acquire planar-integral data. It achieves higher geometric efficiency than a parallel-hole collimator by accepting more photons, but the planar-integral data contain less tomographic information, which may result in larger noise amplification in the reconstruction. Lodge evaluated the rotating slat system and the parallel-hole system based on noise behavior for an FBP reconstruction. Here, we evaluate the noise propagation properties of the two collimation systems for iterative reconstruction. We extend Huesman's noise propagation analysis of the line-integral system to the planar-integral case, and show that approximately 2.0(D/dp) SPECT angles, 2.5(D/dp) self-spinning angles at each detector position, and a 0.5dp detector sampling interval are required in order for the planar-integral data to be efficiently utilized. Here, D is the diameter of the object and dp is the linear dimension of the voxels that subdivide the object. The noise propagation behaviors of the two systems are then compared based on a least-squares reconstruction, using the ratio of the SNR in the image reconstructed with a planar-integral system to that reconstructed with a line-integral system. The ratio is found to be proportional to √(F/D), where F is a geometric efficiency factor. This result has been verified by computer simulations. It confirms that for an iterative reconstruction, the noise tradeoff of the two systems depends not only on the increase in geometric efficiency afforded by the planar projection method, but also on the size of the object. The planar-integral system works better for small objects, while the line-integral system performs better for large ones. This result is consistent with Lodge's results based on the FBP method.
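
    The quoted sampling requirements are easy to evaluate for a concrete case; the values of D and dp below are made-up examples, not from the study:

    ```python
    # Required sampling for planar-integral (rotating slat) data, per the
    # criteria above: 2.0(D/dp) SPECT angles, 2.5(D/dp) self-spinning
    # angles per detector position, and a detector interval of 0.5*dp.
    D = 20.0    # object diameter (e.g., cm)  -- illustrative value
    dp = 0.5    # voxel linear dimension      -- illustrative value

    n_spect_angles = 2.0 * D / dp      # -> 80 rotation angles
    n_spin_angles = 2.5 * D / dp       # -> 100 spin angles per position
    detector_interval = 0.5 * dp       # -> 0.25 sampling interval
    ```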

  9. Economic factors affecting head and neck reconstructive microsurgery: the surgeons' and hospital's perspective.

    PubMed

    Deleyiannis, Frederic W-B; Porter, Andrew C

    2007-07-01

    The purpose of this study was to determine the relative financial value of providing the service of free-tissue transfer for head and neck reconstruction from the surgeons' and hospital's perspectives. Medical and hospital accounting records of 58 consecutive patients undergoing head and neck resections and simultaneous free-flap reconstruction were reviewed. Software from the Center for Medicare and Medicaid Services was used to calculate anticipated Medicare payments to the surgeon based on current procedural terminology codes and to the hospital based on diagnosis-related group codes. The mean actual payment to the surgeon for a free flap was $2300.60. This payment was 91.6 percent ($2300 out of $2510) of the calculated payment if all payments had been reimbursed by Medicare. Total charges and total payment to the hospital for the 58 patients were $19,148,852 and $2,765,552, respectively. After covering direct costs, total hospital revenue (i.e., margin) was $1,056,886. The most commonly assigned diagnosis-related group code was 482 (n = 35). According to the fee schedule for that code, if Medicare had been the insurance plan for these 35 patients, the mean payment to the hospital would have been $45,840. The actual mean hospital payment was $44,133, which represents 96 percent of the calculated Medicare hospital payment ($44,133 of $45,840). Free-flap reconstruction of the head and neck generates substantial revenue for the hospital. For their mutual benefit, hospitals should join with physicians in contract negotiations of physician reimbursement with insurance companies. Bolstered reimbursement figures would better attract and retain skilled surgeons dedicated to microvascular reconstruction.

  10. Class of near-perfect coded apertures

    NASA Technical Reports Server (NTRS)

    Cannon, T. M.; Fenimore, E. E.

    1977-01-01

    Coded aperture imaging of gamma ray sources has long promised an improvement in the sensitivity of various detector systems. The promise has remained largely unfulfilled, however, for one of two reasons. First, the encoding/decoding method produces artifacts, which even in the absence of quantum noise restrict the quality of the reconstructed image. This is true of most correlation-type methods. Second, if the decoding procedure is of the deconvolution variety, small terms in the transfer function of the aperture can lead to excessive noise in the reconstructed image. It is proposed to circumvent both of these problems by using a uniformly redundant array (URA) as the coded aperture in conjunction with a special correlation decoding method.
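
    The URA/balanced-correlation idea can be demonstrated in one dimension with a length-7 quadratic-residue mask. This is a simplified convention chosen so the arithmetic works out cleanly (real URAs are two-dimensional and the offset bookkeeping varies):

    ```python
    import numpy as np

    # 1-D URA analogue: the periodic cross-correlation of the mask with its
    # balanced decoding array G is a delta plus a flat offset, so correlation
    # decoding recovers the source exactly (no deconvolution needed).
    mask = np.array([0, 1, 1, 0, 1, 0, 0], dtype=float)   # QRs mod 7 open
    G = 2 * mask - 1                                      # balanced decoder

    obj = np.array([0.0, 5.0, 0.0, 2.0, 0.0, 0.0, 1.0])  # toy source scene

    # detector pattern = object circularly convolved with the mask
    detector = np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(mask)))

    # balanced-correlation reconstruction: correlate detector with G;
    # the result is 4*obj - sum(obj), so undo the flat offset and scale.
    corr = np.real(np.fft.ifft(np.fft.fft(detector) *
                               np.conj(np.fft.fft(G))))
    total = detector.sum() / mask.sum()    # recover sum(obj) from detector
    recon = (corr + total) / 4.0
    ```

    The flat (rather than fluctuating) correlation sidelobes are exactly the artifact-suppression property that distinguishes URAs from random pinhole arrays.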

  11. Scalable Coding of Plenoptic Images by Using a Sparse Set and Disparities.

    PubMed

    Li, Yun; Sjostrom, Marten; Olsson, Roger; Jennehag, Ulf

    2016-01-01

    One of the light field capturing techniques is focused plenoptic capturing. By placing a microlens array in front of the photosensor, focused plenoptic cameras capture both spatial and angular information of a scene, in each microlens image and across microlens images. The capturing results in a significant amount of redundant information, and the captured image is usually of a large resolution. A coding scheme that removes the redundancy before coding can be advantageous for efficient compression, transmission, and rendering. In this paper, we propose a lossy coding scheme to efficiently represent plenoptic images. The format contains a sparse image set and its associated disparities. The reconstruction is performed by disparity-based interpolation and inpainting, and the reconstructed image is later employed as a prediction reference for the coding of the full plenoptic image. As an outcome of the representation, the proposed scheme inherits a scalable structure with three layers. The results show that plenoptic images are compressed efficiently, with over 60 percent bit rate reduction compared with High Efficiency Video Coding (HEVC) intra coding, and over 20 percent compared with an HEVC block-copying mode.

  12. Design and implementation of coded aperture coherent scatter spectral imaging of cancerous and healthy breast tissue samples

    PubMed Central

    Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2016-01-01

    A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice. PMID:26962543

  13. Design and implementation of coded aperture coherent scatter spectral imaging of cancerous and healthy breast tissue samples.

    PubMed

    Lakshmanan, Manu N; Greenberg, Joel A; Samei, Ehsan; Kapadia, Anuj J

    2016-01-01

    A scatter imaging technique for the differentiation of cancerous and healthy breast tissue in a heterogeneous sample is introduced in this work. Such a technique has potential utility in intraoperative margin assessment during lumpectomy procedures. In this work, we investigate the feasibility of the imaging method for tumor classification using Monte Carlo simulations and physical experiments. The coded aperture coherent scatter spectral imaging technique was used to reconstruct three-dimensional (3-D) images of breast tissue samples acquired through a single-position snapshot acquisition, without rotation as is required in coherent scatter computed tomography. We perform a quantitative assessment of the accuracy of the cancerous voxel classification using Monte Carlo simulations of the imaging system; describe our experimental implementation of coded aperture scatter imaging; show the reconstructed images of the breast tissue samples; and present segmentations of the 3-D images in order to identify the cancerous and healthy tissue in the samples. From the Monte Carlo simulations, we find that coded aperture scatter imaging is able to reconstruct images of the samples and identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) inside them with a cancerous voxel identification sensitivity, specificity, and accuracy of 92.4%, 91.9%, and 92.0%, respectively. From the experimental results, we find that the technique is able to identify cancerous and healthy tissue samples and reconstruct differential coherent scatter cross sections that are highly correlated with those measured by other groups using x-ray diffraction. Coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue inside samples within a time on the order of a minute per slice.

  14. Visually lossless compression of digital hologram sequences

    NASA Astrophysics Data System (ADS)

    Darakis, Emmanouil; Kowiel, Marcin; Näsänen, Risto; Naughton, Thomas J.

    2010-01-01

    Digital hologram sequences have great potential for the recording of 3D scenes of moving macroscopic objects as their numerical reconstruction can yield a range of perspective views of the scene. Digital holograms inherently have large information content and lossless coding of holographic data is rather inefficient due to the speckled nature of the interference fringes they contain. Lossy coding of still holograms and hologram sequences has shown promising results. By definition, lossy compression introduces errors in the reconstruction. In all of the previous studies, numerical metrics were used to measure the compression error and through it, the coding quality. Digital hologram reconstructions are highly speckled and the speckle pattern is very sensitive to data changes. Hence, numerical quality metrics can be misleading. For example, for low compression ratios, a numerically significant coding error can have visually negligible effects. Yet, in several cases, it is of high interest to know how much lossy compression can be achieved, while maintaining the reconstruction quality at visually lossless levels. Using an experimental threshold estimation method, the staircase algorithm, we determined the highest compression ratio that was not perceptible to human observers for objects compressed with Dirac and MPEG-4 compression methods. This level of compression can be regarded as the point below which compression is perceptually lossless although physically the compression is lossy. It was found that up to 4 to 7.5 fold compression can be obtained with the above methods without any perceptible change in the appearance of video sequences.

  15. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    PubMed

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT∕CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT∕CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear in the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^-7. Our study also showed that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. An ultrafast, reliable and scalable 4D CBCT∕CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment.
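
    The Map/Reduce decomposition of the reconstruction described above can be sketched in plain Python (a toy stand-in: the "backprojection" here just accumulates projection values, and all names are illustrative, not the authors' Hadoop code):

```python
from functools import reduce

def map_backproject(projection_subset, volume_size):
    """Map task: filter and backproject one subset into a partial volume
    (toy model: each projection simply adds its values into the volume)."""
    partial = [0.0] * volume_size
    for proj in projection_subset:
        for i, v in enumerate(proj):
            partial[i % volume_size] += v
    return partial

def reduce_volumes(vol_a, vol_b):
    """Reduce task: aggregate two partial backprojections voxel by voxel."""
    return [a + b for a, b in zip(vol_a, vol_b)]

# Split 8 projections across 4 "nodes", then aggregate the partial volumes.
projections = [[float(i)] * 4 for i in range(8)]
subsets = [projections[i::4] for i in range(4)]
partials = [map_backproject(s, 4) for s in subsets]
volume = reduce(reduce_volumes, partials)
```

    Because the partial volumes are combined by voxel-wise addition, the Reduce step is associative and commutative, which is what allows MapReduce to aggregate them in any order across nodes and to rerun failed tasks without changing the result.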

  16. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment

    PubMed Central

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-01-01

    Purpose: Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. Methods: In this work, we accelerated the Feldkamp–Davis–Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Results: Speedup of reconstruction time was found to be roughly linear in the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10−7. Our study also showed that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. Conclusions: An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment. PMID:22149842

  17. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  18. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  19. Coding and transmission of subband coded images on the Internet

    NASA Astrophysics Data System (ADS)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted in the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but with degraded quality when packets are lost. Although images are delivered currently over the Internet by TCP, we study in this paper the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transform (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of SDC images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.

  20. On Asymptotically Good Ramp Secret Sharing Schemes

    NASA Astrophysics Data System (ADS)

    Geil, Olav; Martin, Stefano; Martínez-Peñas, Umberto; Matsumoto, Ryutaroh; Ruano, Diego

    Asymptotically good sequences of linear ramp secret sharing schemes have been intensively studied by Cramer et al. in terms of sequences of pairs of nested algebraic geometric codes. In those works the focus is on full privacy and full reconstruction. In this paper we analyze additional parameters describing the asymptotic behavior of partial information leakage and possibly also partial reconstruction giving a more complete picture of the access structure for sequences of linear ramp secret sharing schemes. Our study involves a detailed treatment of the (relative) generalized Hamming weights of the considered codes.
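
    As an elementary point of reference (not the algebraic-geometric construction analyzed in the paper), a Shamir-type linear scheme over a prime field is the degenerate ramp scheme with full privacy below t shares and full reconstruction from any t of them; the paper's ramp schemes generalize this by allowing partial leakage and partial reconstruction between the two thresholds. A minimal sketch, with an assumed small field:

```python
import random

P = 2087  # small prime field for the demo

def make_shares(secret, t, n, rng):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * xj) % P
                den = (den * (xj - xi)) % P
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

rng = random.Random(42)
shares = make_shares(1234, t=3, n=5, rng=rng)
```

    Any three of the five shares recover the secret exactly, while any two reveal nothing; a ramp scheme trades this sharp threshold for shorter shares at the cost of partial information leakage, the behavior the paper quantifies via relative generalized Hamming weights.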

  1. Dual-camera design for coded aperture snapshot spectral imaging.

    PubMed

    Wang, Lizhi; Xiong, Zhiwei; Gao, Dahua; Shi, Guangming; Wu, Feng

    2015-02-01

    Coded aperture snapshot spectral imaging (CASSI) provides an efficient mechanism for recovering 3D spectral data from a single 2D measurement. However, since the reconstruction problem is severely underdetermined, the quality of recovered spectral data is usually limited. In this paper we propose a novel dual-camera design to improve the performance of CASSI while maintaining its snapshot advantage. Specifically, a beam splitter is placed in front of the objective lens of CASSI, which allows the same scene to be simultaneously captured by a grayscale camera. This uncoded grayscale measurement, in conjunction with the coded CASSI measurement, greatly eases the reconstruction problem and yields high-quality 3D spectral data. Both simulation and experimental results demonstrate the effectiveness of the proposed method.

  2. Adaptive temporal compressive sensing for video with motion estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yeru; Tang, Chaoying; Chen, Yueting; Feng, Huajun; Xu, Zhihai; Li, Qi

    2018-04-01

    In this paper, we present an adaptive reconstruction method for temporal compressive imaging with pixel-wise exposure. The motion of objects is first estimated from interpolated images with a designed coding mask. With the help of motion estimation, image blocks are classified according to the degree of motion and reconstructed with the corresponding dictionary, which was trained beforehand. Both the simulation and experimental results show that the proposed method can obtain accurate motion information before reconstruction and efficiently reconstruct compressive video.

  3. New shape models of asteroids reconstructed from sparse-in-time photometry

    NASA Astrophysics Data System (ADS)

    Durech, Josef; Hanus, Josef; Vanco, Radim; Oszkiewicz, Dagmara Anna

    2015-08-01

    Asteroid physical parameters - the shape, the sidereal rotation period, and the spin axis orientation - can be reconstructed by the lightcurve inversion method from disk-integrated photometry that is either dense (classical lightcurves) or sparse in time. We will review our recent progress in asteroid shape reconstruction from sparse photometry. Finding a unique solution of the inverse problem is time consuming because the sidereal rotation period has to be found by scanning a wide interval of possible periods. This can be solved efficiently by splitting the period parameter space into small parts that are sent to volunteers' computers and processed in parallel. We will show how this distributed-computing approach works with currently available sparse photometry processed in the framework of the project Asteroids@home. In particular, we will show results based on the Lowell Photometric Database. The method produces reliable asteroid models with a very low rate of false solutions, and the pipelines and codes can be applied directly to other sources of sparse photometry - Gaia data, for example. We will present the spin-axis distribution of hundreds of asteroids, discuss the dependence of the spin obliquity on the size of an asteroid, and show examples of spin-axis distributions in asteroid families that confirm the Yarkovsky/YORP evolution scenario.

  4. Transcriptome analysis on the exoskeleton formation in early developmental stages and reconstruction scenario in growth-moulting in Litopenaeus vannamei.

    PubMed

    Gao, Yi; Wei, Jiankai; Yuan, Jianbo; Zhang, Xiaojun; Li, Fuhua; Xiang, Jianhai

    2017-04-24

    Exoskeleton construction is an important issue in shrimp. To better understand the molecular mechanism of exoskeleton formation, development and reconstruction, the transcriptome of the entire developmental process in Litopenaeus vannamei, including nine early developmental stages and eight adult-moulting stages, was sequenced and analysed using Illumina RNA-seq technology. A total of 117,539 unigenes were obtained, with 41.2% of unigenes predicting the full-length coding sequence. Gene Ontology, Clusters of Orthologous Groups (COG) and Kyoto Encyclopedia of Genes and Genomes (KEGG) analysis and functional annotation of all unigenes gave a better understanding of the exoskeleton developmental process in L. vannamei. As a result, more than six hundred unigenes related to exoskeleton development were identified both in the early developmental stages and in adult-moulting. A cascade of sequential expression events of exoskeleton-related genes was summarized, including exoskeleton formation, regulation, synthesis, degradation, mineral absorption/reabsorption, calcification and hardening. This new insight into major transcriptional events provides a deeper understanding of exoskeleton formation and reconstruction in L. vannamei. In conclusion, this is the first study to characterize integrated transcriptomic profiles covering the entire exoskeleton development from zygote to adult-moulting in a crustacean, and these findings will serve as significant references for exoskeleton developmental biology and aquaculture research.

  5. Quarterly Research Performance Progress Report (2015 Q3). Ultrasonic Phased Arrays and Interactive Reflectivity Tomography for Nondestructive Inspection of Injection and Production Wells in Geothermal Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Polsky, Yarom; Kisner, Roger A

    2015-09-01

    For the past quarter, we have focused our effort on implementing the first version of the Model-Based Iterative Reconstruction (MBIR) algorithm, assembling and testing the electronics, designing transducer mounts, and defining our laboratory test samples. We have successfully developed the first implementation of MBIR for ultrasound imaging. The current algorithm was tested with synthetic data, and we are currently making new modifications for the reconstruction of real ultrasound data. Besides assembling and testing the electronics, we developed a LabView graphical user interface (GUI) to fully control the ultrasonic phased array, adjust the time-delays of the transducers, and store the measured reflections. As part of preparing for a laboratory-scale demonstration, the design and fabrication of the laboratory samples has begun. Three cement blocks with embedded objects will be fabricated, characterized, and used to demonstrate the capabilities of the system. During the next quarter, we will continue to improve the current MBIR forward model and integrate the reconstruction code with the LabView GUI. In addition, we will define focal laws for the ultrasonic phased array and perform the laboratory demonstration. We expect to perform the laboratory demonstration by the end of October 2015.

  6. Weighted spline based integration for reconstruction of freeform wavefront.

    PubMed

    Pant, Kamal K; Burada, Dali R; Bichra, Mohamed; Ghosh, Amitava; Khan, Gufran S; Sinzinger, Stefan; Shakher, Chandra

    2018-02-10

    In the present work, a spline-based integration technique for the reconstruction of a freeform wavefront from slope data has been implemented. The slope data of a freeform surface contain noise due to the machining process, and that noise introduces reconstruction error. We have proposed a weighted cubic-spline-based least-squares integration method (WCSLI) for faithful reconstruction of a wavefront from noisy slope data. In the proposed method, the measured slope data are fitted into a piecewise polynomial. The fitted coefficients are determined by using a smoothing cubic spline fitting method, in which the smoothing parameter locally assigns relative weight to the fitted slope data. The fitted slope data are then integrated using the standard least-squares technique to reconstruct the freeform wavefront. Simulation studies show improved results using the proposed technique as compared to the existing cubic-spline-based integration (CSLI) and Southwell methods. The proposed reconstruction method has been experimentally applied to a subaperture-stitching-based measurement of a freeform wavefront using a scanning Shack-Hartmann sensor. The boundary artifacts are minimal in WCSLI, which improves the subaperture stitching accuracy and demonstrates an improved Shack-Hartmann sensor for freeform metrology applications.
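
    A deliberately simplified 1D analogue of the slope-integration idea (not the authors' WCSLI, which fits smoothing cubic splines to 2D Shack-Hartmann slope data and integrates by least squares): smooth the noisy slope samples with a weighted kernel, then integrate them into a height profile.

```python
def smooth(slopes, weights):
    """Weighted moving average; `weights` is an odd-length kernel."""
    half = len(weights) // 2
    total = sum(weights)
    out = []
    for i in range(len(slopes)):
        acc = 0.0
        for k, w in enumerate(weights):
            j = min(max(i + k - half, 0), len(slopes) - 1)  # clamp at edges
            acc += w * slopes[j]
        out.append(acc / total)
    return out

def integrate(slopes, dx):
    """Trapezoidal integration of slope samples into a height profile."""
    heights = [0.0]
    for a, b in zip(slopes, slopes[1:]):
        heights.append(heights[-1] + 0.5 * (a + b) * dx)
    return heights

# Slopes of f(x) = x**2 are 2x; integrating recovers the quadratic
# up to the integration constant.
xs = [0.1 * i for i in range(11)]
slopes = [2 * x for x in xs]
profile = integrate(smooth(slopes, [1, 2, 1]), 0.1)
```

    For this linear slope the symmetric kernel is exact away from the boundary; the clamped edges are precisely where a weighting scheme such as WCSLI earns its keep.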

  7. A robust coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.

    1991-01-01

    We present a layered packet video coding algorithm based on a progressive transmission scheme. The algorithm provides good compression and can handle significant packet loss with graceful degradation in the reconstruction sequence. Simulation results for various conditions are presented.

  8. A robust coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Yun-Chung; Sayood, Khalid; Nelson, Don J.

    1992-01-01

    A layered packet video coding algorithm based on a progressive transmission scheme is presented. The algorithm provides good compression and can handle significant packet loss with graceful degradation in the reconstruction sequence. Simulation results for various conditions are presented.

  9. From GCode to STL: Reconstruct Models from 3D Printing as a Service

    NASA Astrophysics Data System (ADS)

    Baumann, Felix W.; Schuermann, Martin; Odefey, Ulrich; Pfeil, Markus

    2017-12-01

    The authors present a method to reverse engineer 3D printer specific machine instructions (GCode) into a point cloud representation and then an STL (Stereolithography) file. GCode is a machine code that is used for 3D printing, among other applications such as CNC routers. Such code files contain instructions for the 3D printer to move and control its actuator; in the case of Fused Deposition Modeling (FDM), this is the printhead that extrudes semi-molten plastics. The reverse engineering method presented here is based on digital simulation of the extrusion process of FDM-type 3D printing. The reconstructed models and point clouds do not account for hollow structures, such as holes or cavities. The implementation is performed in Python and relies on open source software and libraries, such as Matplotlib and OpenCV. The reconstruction is performed on the model’s extrusion boundary and considers mechanical imprecision. The complete reconstruction mechanism is available as a RESTful (Representational State Transfer) Web service.
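
    The first step of such a pipeline, recovering printhead positions from GCode, can be sketched as follows (a hypothetical minimal parser, not the authors' implementation; real slicer output also uses relative moves, retractions, and arcs, which are ignored here):

```python
import re

MOVE = re.compile(r"^G1\b")                      # linear extrusion/travel move
AXIS = re.compile(r"([XYZE])(-?\d+\.?\d*)")      # axis words on the line

def gcode_to_points(lines):
    """Collect printhead positions for moves that extrude material
    (E increases), yielding a crude point cloud of the deposited path."""
    state = {"X": 0.0, "Y": 0.0, "Z": 0.0, "E": 0.0}
    points = []
    for line in lines:
        if not MOVE.match(line):
            continue
        prev_e = state["E"]
        for axis, value in AXIS.findall(line):
            state[axis] = float(value)
        if state["E"] > prev_e:  # material was extruded on this move
            points.append((state["X"], state["Y"], state["Z"]))
    return points

demo = [
    "G28 ; home",
    "G1 Z0.2 F1200",
    "G1 X10 Y0 E1.0",
    "G1 X10 Y10 E2.0",
    "G1 X0 Y10 F3000",  # travel move, no extrusion
]
cloud = gcode_to_points(demo)
```

    The paper's method then simulates the extruded track width around such paths before fitting the boundary and emitting STL triangles.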

  10. Incoherent digital holograms acquired by interferenceless coded aperture correlation holography system without refractive lenses.

    PubMed

    Kumar, Manoj; Vijayakumar, A; Rosen, Joseph

    2017-09-14

    We present a lensless, interferenceless incoherent digital holography technique based on the principle of coded aperture correlation holography. The digital hologram acquired by this technique contains a three-dimensional image of the observed scene. Light diffracted by a point object (pinhole) is modulated using a random-like coded phase mask (CPM), and the intensity pattern is recorded and composed as a point spread hologram (PSH). A library of PSHs is created using the same CPM by moving the pinhole to all possible axial locations. Intensity diffracted through the same CPM from an object placed within the axial limits of the PSH library is recorded by a digital camera. The recorded intensity this time is composed as the object hologram. The image of the object at any axial plane is reconstructed by cross-correlating the object hologram with the corresponding component of the PSH library. The reconstruction noise attached to the image is suppressed by various methods. The reconstruction results of multiplane and thick objects obtained by this technique are compared with regular lens-based imaging.

  11. Methods of evaluating the effects of coding on SAR data

    NASA Technical Reports Server (NTRS)

    Dutkiewicz, Melanie; Cumming, Ian

    1993-01-01

    It is recognized that mean square error (MSE) is not a sufficient criterion for determining the acceptability of an image reconstructed from data that has been compressed and decompressed using an encoding algorithm. In the case of Synthetic Aperture Radar (SAR) data, it is also insufficient to display the reconstructed image (and perhaps the error image) alongside the original and make a subjective judgment as to the quality of the reconstructed data. In this paper we suggest a number of additional evaluation criteria which we feel should be included as evaluation metrics in SAR data encoding experiments. These criteria have been specifically chosen to provide a means of ensuring that the important information in the SAR data is preserved. The paper also presents the results of an investigation into the effects of coding on SAR data fidelity when the coding is applied in (1) the signal data domain, and (2) the image domain. An analysis of the results highlights the shortcomings of the MSE criterion and shows which of the suggested additional criteria were found to be most important.

  12. Global tectonic reconstructions with continuously deforming and evolving rigid plates

    NASA Astrophysics Data System (ADS)

    Gurnis, Michael; Yang, Ting; Cannon, John; Turner, Mark; Williams, Simon; Flament, Nicolas; Müller, R. Dietmar

    2018-07-01

    Traditional plate reconstruction methodologies do not allow for plate deformation to be considered. Here we present software to construct and visualize global tectonic reconstructions with deforming plates within the context of rigid plates. Both deforming and rigid plates are defined by continuously evolving polygons. The deforming regions are tessellated with triangular meshes such that either strain rate or cumulative strain can be followed. The finite strain history, crustal thickness and stretching factor of points within the deformation zones are tracked as Lagrangian points. Integrating these tools within the interactive platform GPlates enables specialized users to build and refine deforming plate models and integrate them with other models in time and space. We demonstrate the integrated platform with regional reconstructions of Cenozoic western North America, the Mesozoic South American Atlantic margin, and Cenozoic southeast Asia, embedded within global reconstructions, using different data and reconstruction strategies.

  13. COnstrained Data Extrapolation (CODE): A new approach for high definition vascular imaging from low resolution data.

    PubMed

    Song, Yang; Hamtaei, Ehsan; Sethi, Sean K; Yang, Guang; Xie, Haibin; Mark Haacke, E

    2017-12-01

    To introduce a new approach to reconstruct high definition vascular images using COnstrained Data Extrapolation (CODE) and evaluate its capability in estimating vessel area and stenosis. CODE is based on the constraint that the full width at half maximum of a vessel can be accurately estimated and, since it represents the best estimate for the width of the object, higher k-space data can be generated from this information. To demonstrate the potential of extracting high definition vessel edges using low resolution data, both simulated and human data were analyzed to better visualize the vessels and to quantify both area and stenosis measurements. The results from CODE using one-fourth of the fully sampled k-space data were compared with a compressed sensing (CS) reconstruction approach using the same total amount of data but spread out between the center of k-space and the outer portions of the original k-space to accelerate data acquisition by a factor of four. For a sufficiently high signal-to-noise ratio (SNR) such as 16 (8), we found that objects as small as 3 voxels in the 25% under-sampled data (6 voxels when zero-filled) could be used for CODE and CS and provide an estimate of area with an error <5% (10%). For estimating up to a 70% stenosis with an SNR of 4, CODE was found to be more robust to noise than CS, having a smaller variance albeit a larger bias. Reconstruction times were >200 (30) times faster for CODE compared to CS in the simulated (human) data. CODE was capable of producing sharp sub-voxel edges and accurately estimating stenosis to within 5% for clinically relevant studies of vessels with a width of at least 3 pixels in the low resolution images.
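
    The constraint CODE builds on, that a full width at half maximum can be located to sub-voxel precision in a low-resolution profile, can be illustrated with a generic FWHM estimator (an illustrative sketch, not the authors' reconstruction code):

```python
def fwhm(profile):
    """Estimate the full width at half maximum of a sampled profile,
    using linear interpolation for sub-voxel crossing positions."""
    peak = max(profile)
    half = peak / 2.0
    # first and last samples at or above the half-maximum level
    left = next(i for i, v in enumerate(profile) if v >= half)
    right = next(i for i in range(len(profile) - 1, -1, -1)
                 if profile[i] >= half)
    x_left = (left - (profile[left] - half) / (profile[left] - profile[left - 1])
              if left > 0 else float(left))
    x_right = (right + (profile[right] - half) / (profile[right] - profile[right + 1])
               if right < len(profile) - 1 else float(right))
    return x_right - x_left

width = fwhm([0, 1, 2, 3, 2, 1, 0])   # symmetric triangular vessel profile
```

    In CODE this width estimate constrains the extrapolation of higher k-space data, which is what sharpens the vessel edges beyond the acquired resolution.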

  14. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
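The analytic/iterative distinction can be made concrete on a toy linear system: a direct (pseudo)inverse of the imaging operator versus Kaczmarz-style ART sweeps that repeatedly refine an image estimate. The matrix and image below are invented for illustration:

```python
import numpy as np

# Toy imaging model: 6 "line integrals" of a 4-pixel image.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))      # invented projection matrix
x_true = np.array([1.0, 0.5, 0.0, 2.0])
b = A @ x_true                       # noiseless projections

# "Analytic" route: formal inversion of the imaging equations.
x_analytic = np.linalg.pinv(A) @ b

# "Iterative" route: ART (Kaczmarz), projecting onto one equation at a time.
x = np.zeros(4)
for _ in range(500):
    for a_i, b_i in zip(A, b):
        x += (b_i - a_i @ x) / (a_i @ a_i) * a_i
```

Both routes recover the image here; the iterative loop is what generalizes to geometries where no closed-form inverse exists.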

  15. Probabilistic modeling of the evolution of gene synteny within reconciled phylogenies

    PubMed Central

    2015-01-01

Background Most models of genome evolution concern either genetic sequences, gene content or gene order. They sometimes integrate two of the three levels, but rarely all three. Probabilistic models of gene order evolution usually have to assume constant gene content or adopt a presence/absence coding of gene neighborhoods which is blind to complex events modifying gene content. Results We propose a probabilistic evolutionary model for gene neighborhoods, allowing genes to be inserted, duplicated or lost. It uses reconciled phylogenies, which integrate sequence and gene content evolution. We are then able to optimize parameters such as phylogeny branch lengths, or probabilistic laws depicting the diversity of susceptibility of syntenic regions to rearrangements. We reconstruct a structure for ancestral genomes by optimizing a likelihood, keeping track of all evolutionary events at the level of gene content and gene synteny. Ancestral syntenies are associated with a probability of presence. We implemented the model with the restriction that at most one gene duplication separates two gene speciations in reconciled gene trees. We reconstruct ancestral syntenies on a set of 12 drosophila genomes, and compare the evolutionary rates along the branches and along the sites. We compare with a parsimony method and find a significant number of results not supported by the posterior probability. The model is implemented in the Bio++ library. It thus benefits from and enriches the classical models and methods for molecular evolution. PMID:26452018

  16. Fundamental Limits of Delay and Security in Device-to-Device Communication

    DTIC Science & Technology

    2013-01-01

systematic MDS (maximum distance separable) codes and random binning strategies that achieve a Pareto optimal delay-reconstruction tradeoff. The erasure MD... file, and a coding scheme based on erasure compression and Slepian-Wolf binning is presented. The coding scheme is shown to provide a Pareto optimal delay-reconstruction tradeoff. The erasure MD setup is then used to propose a

  17. An efficient system for reliably transmitting image and video data over low bit rate noisy channels

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.; Huang, Y. F.; Stevenson, Robert L.

    1994-01-01

    This research project is intended to develop an efficient system for reliably transmitting image and video data over low bit rate noisy channels. The basic ideas behind the proposed approach are the following: employ statistical-based image modeling to facilitate pre- and post-processing and error detection, use spare redundancy that the source compression did not remove to add robustness, and implement coded modulation to improve bandwidth efficiency and noise rejection. Over the last six months, progress has been made on various aspects of the project. Through our studies of the integrated system, a list-based iterative Trellis decoder has been developed. The decoder accepts feedback from a post-processor which can detect channel errors in the reconstructed image. The error detection is based on the Huber Markov random field image model for the compressed image. The compression scheme used here is that of JPEG (Joint Photographic Experts Group). Experiments were performed and the results are quite encouraging. The principal ideas here are extendable to other compression techniques. In addition, research was also performed on unequal error protection channel coding, subband vector quantization as a means of source coding, and post processing for reducing coding artifacts. Our studies on unequal error protection (UEP) coding for image transmission focused on examining the properties of the UEP capabilities of convolutional codes. The investigation of subband vector quantization employed a wavelet transform with special emphasis on exploiting interband redundancy. The outcome of this investigation included the development of three algorithms for subband vector quantization. The reduction of transform coding artifacts was studied with the aid of a non-Gaussian Markov random field model. This results in improved image decompression. These studies are summarized and the technical papers included in the appendices.

  18. A Guide to Axial-Flow Turbine Off-Design Computer Program AXOD2

    NASA Technical Reports Server (NTRS)

    Chen, Shu-Cheng S.

    2014-01-01

A User's Guide for the axial-flow turbine off-design computer program AXOD2 is presented in this paper. It is supplementary to the original User's Manual of AXOD. Three notable contributions of AXOD2 relative to its predecessor AXOD, both in the content of the Guide and in the functionality of the code, are described and discussed at length. These are: 1) a rational presentation of the mathematical principles applied, with concise descriptions of the formulas implemented in the actual coding and a discussion of their physical implications; 2) the creation and documentation of an addendum listing of input namelist parameters unique to AXOD2, which differ from or are in addition to the original input namelists given in the Manual of AXOD, together with their usage; and 3) the institution of proper stoppages of the code execution, adding termination and error messages to AXOD2. These measures safeguard the integrity of the code execution, so that a failure mode encountered during a case study does not plunge the execution into an indefinite loop or cause a blow-out of the program. Moreover, the computer program has since been substantially reconstructed: standard FORTRAN language was instituted, and the code was formatted in double precision (REAL*8). As a result, the code is now suited for use in a local desktop computing environment, is fully portable to any operating system, and can be executed by any compiler equivalent to a FORTRAN 90/95 compiler. AXOD2 will be available through the NASA Glenn Research Center (GRC) Software Repository.

  19. The clustering of galaxies in the completed SDSS-III Baryon Oscillation Spectroscopic Survey: cosmic flows and cosmic web from luminous red galaxies

    NASA Astrophysics Data System (ADS)

    Ata, Metin; Kitaura, Francisco-Shu; Chuang, Chia-Hsun; Rodríguez-Torres, Sergio; Angulo, Raul E.; Ferraro, Simone; Gil-Marín, Hector; McDonald, Patrick; Hernández Monteagudo, Carlos; Müller, Volker; Yepes, Gustavo; Autefage, Mathieu; Baumgarten, Falk; Beutler, Florian; Brownstein, Joel R.; Burden, Angela; Eisenstein, Daniel J.; Guo, Hong; Ho, Shirley; McBride, Cameron; Neyrinck, Mark; Olmstead, Matthew D.; Padmanabhan, Nikhil; Percival, Will J.; Prada, Francisco; Rossi, Graziano; Sánchez, Ariel G.; Schlegel, David; Schneider, Donald P.; Seo, Hee-Jong; Streblyanska, Alina; Tinker, Jeremy; Tojeiro, Rita; Vargas-Magana, Mariana

    2017-06-01

We present a Bayesian phase-space reconstruction of the cosmic large-scale matter density and velocity fields from the Sloan Digital Sky Survey-III Baryon Oscillations Spectroscopic Survey Data Release 12 CMASS galaxy clustering catalogue. We rely on a given Λ cold dark matter cosmology, a mesh resolution in the range of 6-10 h-1 Mpc, and a lognormal-Poisson model with a redshift-dependent non-linear bias. The bias parameters are derived from the data and a general renormalized perturbation theory approach. We use combined Gibbs and Hamiltonian sampling, implemented in the argo code, to iteratively reconstruct the dark matter density field and the coherent peculiar velocities of individual galaxies, thereby correcting for coherent redshift-space distortions. Our tests relying on accurate N-body-based mock galaxy catalogues show unbiased real-space power spectra of the non-linear density field up to k ˜ 0.2 h Mpc-1, and vanishing quadrupoles down to r ˜ 20 h-1 Mpc. We also demonstrate that the non-linear cosmic web can be obtained from the tidal field tensor based on the Gaussian component of the reconstructed density field. We find that the reconstructed velocities have a statistical correlation coefficient compared to the true velocities of each individual light-cone mock galaxy of r ˜ 0.68, including about 10 per cent of satellite galaxies with virial motions (about r = 0.75 without satellites). The power spectra of the velocity divergence agree well with theoretical predictions up to k ˜ 0.2 h Mpc-1. This work will be especially useful to improve, for example, baryon acoustic oscillation reconstructions, kinematic Sunyaev-Zeldovich or integrated Sachs-Wolfe measurements, and environmental studies.

  20. Organization of cytokeratin cytoskeleton and germ plasm in the vegetal cortex of Xenopus laevis oocytes depends on coding and non-coding RNAs: Three-dimensional and ultrastructural analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kloc, Malgorzata; Bilinski, Szczepan; Dougherty, Matthew T.

    2007-05-01

Recent studies discovered a novel structural role of RNA in maintaining the integrity of the mitotic spindle and cellular cytoskeleton. In Xenopus laevis, non-coding Xlsirts and coding VegT RNAs play a structural role in anchoring localized RNAs, maintaining the organization of the cytokeratin cytoskeleton and germinal granules in the oocyte vegetal cortex and in subsequent development of the germline in the embryo. We studied the ultrastructural effects of antisense oligonucleotide driven ablation of Xlsirts and VegT RNAs on the organization of the cytokeratin, germ plasm and other components of the vegetal cortex. We developed a novel method to immunolabel and visualize cytokeratin at the electron microscopy level, which allowed us to reconstruct the ultrastructural organization of the cytokeratin network relative to the components of the vegetal cortex in Xenopus oocytes. The removal of Xlsirts and VegT RNAs not only disrupts the cytokeratin cytoskeleton but also has a profound transcript-specific effect on the anchoring and distribution of germ plasm islands and their germinal granules and the arrangement of yolk platelets within the vegetal cortex. We suggest that the cytokeratin cytoskeleton plays a role in anchoring of germ plasm islands within the vegetal cortex and germinal granules within the germ plasm islands.

  1. Epoch of Reionization : An Investigation of the Semi-Analytic 21CMMC Code

    NASA Astrophysics Data System (ADS)

    Miller, Michelle

    2018-01-01

After the Big Bang, the universe was filled with neutral hydrogen that began to cool and collapse into the first structures. These first stars and galaxies emitted radiation that eventually ionized all of the neutral hydrogen in the universe. 21CMMC is a semi-numerical code that takes simulated boxes of this ionized universe produced by another code, 21cmFAST. Mock measurements taken from these simulated boxes are fed into 21CMMC to determine three major parameters of the simulated universe: virial temperature, mean free path, and ionization efficiency. My project tests the robustness of 21CMMC on universe simulations other than 21cmFAST, to see whether 21CMMC can properly reconstruct early-universe parameters given a mock “measurement” in the form of power spectra. We determine that while two of the three EoR parameters (virial temperature and ionization efficiency) are reconstructable to some degree, the mean-free-path parameter in the code is the least robust. This calls for further development of the 21CMMC code.

  2. Experimental Study of Super-Resolution Using a Compressive Sensing Architecture

    DTIC Science & Technology

    2015-03-01

    Intelligence 24(9), 1167–1183 (2002). [3] Lin, Z. and Shum, H.-Y., “Fundamental limits of reconstruction-based superresolution algorithms under local...IEEE Transactions on 52, 1289–1306 (April 2006). [9] Marcia, R. and Willett, R., “Compressive coded aperture superresolution image reconstruction,” in

  3. Reconstruction for time-domain in vivo EPR 3D multigradient oximetric imaging--a parallel processing perspective.

    PubMed

    Dharmaraj, Christopher D; Thadikonda, Kishan; Fletcher, Anthony R; Doan, Phuc N; Devasahayam, Nallathamby; Matsumoto, Shingo; Johnson, Calvin A; Cook, John A; Mitchell, James B; Subramanian, Sankaran; Krishna, Murali C

    2009-01-01

Three-dimensional Oximetric Electron Paramagnetic Resonance Imaging using the Single Point Imaging modality generates unpaired spin density and oxygen images that can readily distinguish between normal and tumor tissues in small animals. It is also possible with fast imaging to track the changes in tissue oxygenation in response to the oxygen content in the breathing air. However, this involves dealing with gigabytes of data for each 3D oximetric imaging experiment, involving digital band-pass filtering and background noise subtraction followed by 3D Fourier reconstruction. This process is rather slow on a conventional uniprocessor system. This paper presents a parallelization framework using OpenMP runtime support and parallel MATLAB to execute such computationally intensive programs. The Intel compiler is used to develop a parallel C++ code based on OpenMP. The code is executed on four dual-core AMD Opteron shared-memory processors to reduce the computational burden of the filtration task significantly. The results show that the parallel code for filtration achieved a speedup factor of 46.66 compared with the equivalent serial MATLAB code. In addition, a parallel MATLAB code has been developed to perform 3D Fourier reconstruction. Speedup factors of 4.57 and 4.25 were achieved during the reconstruction process and oximetry computation for a data set with 23 x 23 x 23 gradient steps. The execution time has been computed for both the serial and parallel implementations using different dimensions of the data and presented for comparison. The reported system has been designed to be easily accessible even from low-cost personal computers through a local intranet (NIHnet). The experimental results demonstrate that parallel computing provides a source of high computational power to obtain biophysical parameters from 3D EPR oximetric imaging, almost in real time.

  4. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koulouri, Alexandra, E-mail: koulouri@uni-muenster.de; Department of Electrical and Electronic Engineering, Imperial College London, Exhibition Road, London SW7 2BT; Brookes, Mike

In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
Highlights: • Vector tomography is used to reconstruct electric fields generated by dipole sources. • Inverse solutions are based on longitudinal and transverse line integral measurements. • Transverse line integral measurements are used as a sparsity constraint. • Numerical procedure to approximate the line integrals is described in detail. • Patterns of the studied electric fields are correctly estimated.
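The longitudinal measurement is simply the integral of the field component parallel to the line. A minimal numerical sketch (field and endpoints invented; for a gradient field the longitudinal integral must equal the potential difference, which makes it easy to check):

```python
import numpy as np

def longitudinal_integral(F, p0, p1, steps=2001):
    """Trapezoidal approximation of the line integral of F along p0 -> p1."""
    ts = np.linspace(0.0, 1.0, steps)
    pts = p0[None, :] + ts[:, None] * (p1 - p0)[None, :]
    tangent = (p1 - p0) / np.linalg.norm(p1 - p0)
    vals = F(pts) @ tangent                       # parallel projection of F
    dx = np.linalg.norm(p1 - p0) / (steps - 1)
    return (vals[0] / 2 + vals[-1] / 2 + vals[1:-1].sum()) * dx

# Invented test field F = grad((x^2 + y^2)/2) = (x, y).
F = lambda p: p
p0, p1 = np.array([0.0, 0.0]), np.array([1.0, 2.0])
val = longitudinal_integral(F, p0, p1)            # phi(p1) - phi(p0) = 2.5
```

The transverse measurement would instead use the perpendicular of `tangent`; as the abstract notes, that quantity usually cannot be acquired physically.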

  5. Technical Note: FreeCT_wFBP: A robust, efficient, open-source implementation of weighted filtered backprojection for helical, fan-beam CT.

    PubMed

    Hoffman, John; Young, Stefano; Noo, Frédéric; McNitt-Gray, Michael

    2016-03-01

With growing interest in quantitative imaging, radiomics, and CAD using CT imaging, the need to explore the impacts of acquisition and reconstruction parameters has grown. Doing so usually requires extensive access to the scanner on which the data were acquired, whose workflow is not designed for large-scale reconstruction projects. Therefore, the authors have developed a freely available, open-source software package implementing a common reconstruction method, weighted filtered backprojection (wFBP), for helical, fan-beam CT applications. FreeCT_wFBP is a low-dependency, GPU-based reconstruction program utilizing C for the host code and Nvidia CUDA C for the GPU code. The software is capable of reconstructing helical scans acquired with arbitrary pitch values and sampling techniques such as flying focal spots and a quarter-detector offset. In this work, the software is described and evaluated for reconstruction speed, image quality, and accuracy. Speed was evaluated based on acquisitions of the ACR CT accreditation phantom under four different flying focal spot configurations. Image quality was assessed using the same phantom by evaluating CT number accuracy, uniformity, and contrast-to-noise ratio (CNR). Finally, reconstructed mass-attenuation coefficient accuracy was evaluated using a simulated scan of a FORBILD thorax phantom and comparing reconstructed values to the known phantom values. The average reconstruction time evaluated under all flying focal spot configurations was found to be 17.4 ± 1.0 s for a 512 row × 512 column × 32 slice volume. Reconstructions of the ACR phantom were found to meet all CT Accreditation Program criteria including the CT number, CNR, and uniformity tests. Finally, reconstructed mass-attenuation coefficient values of water within the FORBILD thorax phantom agreed with the original phantom values to within 0.0001 mm²/g (0.01%). FreeCT_wFBP is a fast, highly configurable reconstruction package for third-generation CT available under the GNU GPL. It shows good performance with both clinical and simulated data.
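The filter-then-backproject pipeline at the heart of wFBP can be sketched in its simplest parallel-beam form (this is NOT FreeCT_wFBP's helical fan-beam weighting; geometry, phantom, and sizes below are invented for illustration):

```python
import numpy as np

n = 64
angles = np.linspace(0, np.pi, 90, endpoint=False)

# Sinogram of a point object at the isocenter: every projection is a spike.
sino = np.zeros((len(angles), n))
sino[:, n // 2] = 1.0

# Step 1: ramp (Ram-Lak) filtering of each projection in Fourier space.
freqs = np.fft.fftfreq(n)
filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * np.abs(freqs), axis=1))

# Step 2: backprojection, smearing each filtered projection across the image.
xs = np.arange(n) - n // 2
X, Y = np.meshgrid(xs, xs)
image = np.zeros((n, n))
for a, proj in zip(angles, filtered):
    t = X * np.cos(a) + Y * np.sin(a)            # detector coordinate
    idx = np.clip(np.round(t).astype(int) + n // 2, 0, n - 1)
    image += proj[idx]
image *= np.pi / len(angles)
```

The reconstructed point lands at the image center; helical wFBP adds per-ray weights and a z-interpolation on top of the same two steps.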

  6. Reconstruction of Sensory Stimuli Encoded with Integrate-and-Fire Neurons with Random Thresholds

    PubMed Central

    Lazar, Aurel A.; Pnevmatikakis, Eftychios A.

    2013-01-01

    We present a general approach to the reconstruction of sensory stimuli encoded with leaky integrate-and-fire neurons with random thresholds. The stimuli are modeled as elements of a Reproducing Kernel Hilbert Space. The reconstruction is based on finding a stimulus that minimizes a regularized quadratic optimality criterion. We discuss in detail the reconstruction of sensory stimuli modeled as absolutely continuous functions as well as stimuli with absolutely continuous first-order derivatives. Reconstruction results are presented for stimuli encoded with single as well as a population of neurons. Examples are given that demonstrate the performance of the reconstruction algorithms as a function of threshold variability. PMID:24077610
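On the encoding side, a leaky integrate-and-fire neuron with a random threshold maps a stimulus into spike times, which the paper's RKHS machinery then inverts. A minimal encoder sketch (stimulus, time constant, and threshold statistics are all invented):

```python
import numpy as np

rng = np.random.default_rng(1)
dt, tau = 1e-3, 20e-3
t = np.arange(0, 1, dt)
stimulus = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)    # made-up input signal

v, spikes = 0.0, []
threshold = 1.0 + 0.05 * rng.standard_normal()       # random threshold
for ti, s in zip(t, stimulus):
    v += dt * (s - v) / tau                          # leaky integration
    if v >= threshold:
        spikes.append(ti)                            # emit a spike time
        v = 0.0                                      # reset membrane
        threshold = 1.0 + 0.05 * rng.standard_normal()
```

The reconstruction problem the paper solves is to recover `stimulus` from `spikes` alone, which is why threshold variability degrades performance.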

  7. Certification Examination Cases of Candidates for Certification by the American Board of Plastic Surgery: Trends in Practice Profiles Spanning a Decade (2000–2009)

    PubMed Central

    Chung, Kevin C.; Song, Jae W.; Shauver, Melissa J.; Cullison, Terry M.; Noone, R. Barrett

    2011-01-01

    Background To evaluate the case mix of plastic surgeons in their early years of practice by examining candidate case-logs submitted for the Oral Examination. Methods De-identified data from 2000–2009 consisting of case-logs submitted by young plastic surgery candidates for the Oral Examination were analyzed. Data consisted of exam year, CPT (Current Procedural Terminology) Codes and the designation of each CPT code as cosmetic or reconstructive by the candidate, and patient age and gender. Subgroup analyses for comprehensive, cosmetic, craniomaxillofacial, and hand surgery modules were performed by using the CPT code list designated by the American Board of Plastic Surgery Maintenance of Certification in Plastic Surgery ( ) module framework. Results We examined case-logs from a yearly average of 261 candidates over 10 years. Wider variations in yearly percent change in median cosmetic surgery case volumes (−62.5% to 30%) were observed when compared to the reconstructive surgery case volumes (−18.0% to 25.7%). Compared to cosmetic surgery cases per candidate, which varied significantly from year-to-year (p<0.0001), reconstructive surgery cases per candidate did not vary significantly (p=0.954). Subgroup analyses of proportions of types of surgical procedures based on CPT code categories, revealed hand surgery to be the least performed procedure relative to comprehensive, craniomaxillofacial, and cosmetic surgery procedures. Conclusions Graduates of plastic surgery training programs are committed to performing a broad spectrum of reconstructive and cosmetic surgical procedures in their first year of practice. However, hand surgery continues to have a small presence in the practice profiles of young plastic surgeons. PMID:21788850

  8. Funding analysis of bilateral autologous free-flap breast reconstructions in Australia.

    PubMed

    Sinha, Shiba; Ruskin, Olivia; McCombe, David; Morrison, Wayne; Webb, Angela

    2015-08-01

Bilateral breast reconstructions are being increasingly performed. Autologous free-flap reconstructions represent the gold standard for post-mastectomy breast reconstruction but are resource intensive. This study aims to investigate the difference between hospital reimbursement and true cost of bilateral autologous free-flap reconstructions. Retrospective analysis of patients who underwent bilateral autologous free-flap reconstructions at a single Australian tertiary referral centre was performed. Hospital reimbursement was determined from coding analysis. A true cost analysis was also performed. Comparisons were made considering the effect of timing, indication and complications of the procedure. Forty-six bilateral autologous free-flap procedures were performed (87 deep inferior epigastric perforators (DIEPs), four superficial inferior epigastric artery perforator flaps (SIEAs) and one muscle-sparing free transverse rectus abdominis myocutaneous flap (MS-TRAM)). The mean funding discrepancy between hospital reimbursement and actual cost was $12,137 ± $8539 (mean ± standard deviation (SD)) (n = 46). Twenty-four per cent (n = 11) of the cases had been coded inaccurately. If these cases were excluded from analysis, the mean funding discrepancy per case was $9168 ± $7453 (n = 35). Minor and major complications significantly increased the true cost and funding discrepancy (p = 0.02). Bilateral free-flap breast reconstructions performed in Australian public hospitals result in a funding discrepancy. Failure to be economically viable threatens the provision of this procedure in the public system. Plastic surgeons and hospital managers need to adopt measures in order to make these gold-standard procedures cost neutral.

  9. High dynamic range coding imaging system

    NASA Astrophysics Data System (ADS)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

We present a high dynamic range (HDR) imaging system design based on a coded aperture technique. This scheme helps us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns, then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. Existing algorithms are used to fuse those LDR images into an HDR image for display. We build an optical simulation model and generate simulation images to verify the novel system.

  10. Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.

    PubMed

    Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai

    2005-10-01

    A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.
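The Delaunay step of the coding can be sketched with an off-the-shelf triangulator (here SciPy's, on made-up vertex data; the paper's genetic-algorithm and simulated-annealing machinery is omitted):

```python
import numpy as np
from scipy.spatial import Delaunay

# Invented (x, y) vertex positions with height data attached; the Delaunay
# triangulation supplies the facets over which a candidate surface
# topography is evaluated during the optimization.
rng = np.random.default_rng(2)
pts = rng.random((30, 2))
heights = np.sin(3 * pts[:, 0]) + pts[:, 1]   # made-up height values
tri = Delaunay(pts)
facets = tri.simplices                        # each row indexes one triangle
```

Encoding a surface as vertex coordinates plus this triangulation is what makes the string representation compact enough for the GA to search.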

  11. Post-flight trajectory reconstruction of suborbital free-flyers using GPS raw data

    NASA Astrophysics Data System (ADS)

    Ivchenko, N.; Yuan, Y.; Linden, E.

    2017-08-01

This paper describes the reconstruction of post-flight trajectories of suborbital free-flying units from logged GPS raw data. We treated the reconstruction as a global least-squares optimization problem, using both the pseudo-range and Doppler observables, and solved it with the trust-region-reflective algorithm, which enabled navigational solutions of high accuracy. The code tracking was implemented with a large number of correlators and least-squares curve fitting, in order to improve the precision of the code start times, while a more conventional phase-locked loop was used for Doppler tracking. We propose a weighting scheme to account for fast signal-strength variation due to the fast rotation of the free-flyer, and a penalty on jerk to achieve a smooth solution. We applied these methods to flight data from two suborbital free-flying units launched on the REXUS 12 sounding rocket, reconstructing the trajectory, receiver clock error and wind-up rates. The trajectory is a parabola with apogee around 80 km, and the velocity profile shows the details of payload wobbling. The wind-up rates obtained match the measurements from onboard angular rate sensors.
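The pseudorange part of such a solution can be sketched as a single-epoch Gauss-Newton fix for position and clock bias (the paper optimizes the whole trajectory with a trust-region-reflective solver; the satellite positions, receiver state, and clock bias below are invented):

```python
import numpy as np

# Invented satellite positions (metres) and receiver state.
sats = np.array([[20e6, 0, 20e6], [0, 20e6, 20e6], [-20e6, 0, 20e6],
                 [0, -20e6, 20e6], [10e6, 10e6, 25e6]])
truth = np.array([1000.0, -2000.0, 500.0])
bias = 150.0                                        # clock bias in metres
rho = np.linalg.norm(sats - truth, axis=1) + bias   # noiseless pseudoranges

# Gauss-Newton iteration on the state (x, y, z, clock bias).
p = np.zeros(4)
for _ in range(10):
    d = np.linalg.norm(sats - p[:3], axis=1)        # predicted ranges
    r = d + p[3] - rho                              # pseudorange residuals
    J = np.hstack([(p[:3] - sats) / d[:, None],     # range gradients
                   np.ones((len(sats), 1))])        # clock-bias column
    p = p - np.linalg.lstsq(J, r, rcond=None)[0]
```

With noiseless, consistent measurements the iteration converges to the true position and clock bias; the full problem additionally stacks Doppler residuals and a jerk penalty across all epochs.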

  12. High-accuracy 3D measurement system based on multi-view and structured light

    NASA Astrophysics Data System (ADS)

    Li, Mingyue; Weng, Dongdong; Li, Yufeng; Zhang, Longbin; Zhou, Haiyun

    2013-12-01

3D surface reconstruction is one of the most important topics in Spatial Augmented Reality (SAR). Structured light is a simple and rapid way to reconstruct objects. To improve the precision of 3D reconstruction, we present a high-accuracy multi-view 3D measurement system based on Gray-code and phase-shift patterns. We use a camera and a light projector that casts structured light patterns onto the objects. In this system, only one camera is used, taking photos from the left and right sides of the object respectively. In addition, we use VisualSFM to recover the geometric relationships between the viewpoints, so camera calibration can be omitted and the camera positions are no longer constrained. We also set an appropriate exposure time to make the scenes covered by the Gray-code patterns more recognizable. All of the above makes the reconstruction more precise. We conducted experiments on different kinds of objects, and a large number of experimental results verify the feasibility and high accuracy of the system.
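In such systems the Gray-code patterns identify projector columns, so decoding a camera pixel reduces to a Gray-to-binary conversion. A minimal sketch (the 4-bit observation is invented):

```python
def binary_to_gray(n):
    return n ^ (n >> 1)

def gray_to_binary(g):
    # XOR in successively shifted copies to undo the Gray encoding.
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

# A camera pixel that observed bright/dark values 1,1,0,1 across four
# projected patterns:
bits = [1, 1, 0, 1]
code = int("".join(map(str, bits)), 2)    # observed Gray code word (13)
column = gray_to_binary(code)             # recovered projector column (9)
```

Adjacent columns differ in exactly one pattern bit, which is what makes Gray coding robust to decoding errors at stripe boundaries; the phase-shift patterns then refine the column estimate to sub-stripe accuracy.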

  13. Research on compression performance of ultrahigh-definition videos

    NASA Astrophysics Data System (ADS)

    Li, Xiangqun; He, Xiaohai; Qing, Linbo; Tao, Qingchuan; Wu, Di

    2017-11-01

With the popularization of high-definition (HD) images and videos (1920×1080 pixels and above), there are now 4K (3840×2160) television signals and 8K (8192×4320) ultrahigh-definition videos. The demand for HD images and videos is increasing continuously, along with the data volume. Storage and transmission cannot be handled merely by expanding hard-disk capacity and upgrading transmission devices. Making full use of the coding standard high-efficiency video coding (HEVC), super-resolution reconstruction technology, and the correlation between intra- and interprediction, we first put forward a "division-compensation"-based strategy to further improve the compression performance of single images and I-frames. Then, building on this idea and the HEVC encoder and decoder, a video compression coding framework is designed, with HEVC used inside the framework. Last, the reconstructed video quality is further improved with super-resolution reconstruction. Experiments show that the proposed compression method for single images (I-frames) and video sequences outperforms HEVC in low-bit-rate environments.

  14. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories.

    PubMed

    Hanson-Smith, Victor; Johnson, Alexander

    2016-07-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and "resurrect" (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server.

  15. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories

    PubMed Central

    Hanson-Smith, Victor; Johnson, Alexander

    2016-01-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and “resurrect” (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server. PMID:27472806

  16. Simultaneous EEG and MEG source reconstruction in sparse electromagnetic source imaging.

    PubMed

    Ding, Lei; Yuan, Han

    2013-04-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) have different sensitivities to differently configured brain activations, making them complementary in providing independent information for better detection and inverse reconstruction of brain sources. In the present study, we developed an integrative approach that combines a novel sparse electromagnetic source imaging method, variation-based cortical current density (VB-SCCD), with the combined use of EEG and MEG data in reconstructing complex brain activity. To perform simultaneous analysis of multimodal data, we proposed to normalize EEG and MEG signals according to their individual noise levels to create unit-free measures. Our Monte Carlo simulations demonstrated that this integrative approach is capable of reconstructing complex cortical brain activations (up to 10 simultaneously activated and randomly located sources). Results from experimental data showed that complex brain activations evoked in a face recognition task were successfully reconstructed using the integrative approach; these reconstructions were consistent with other research findings and were validated by independent data from functional magnetic resonance imaging using the same stimulus protocol. Reconstructed cortical brain activations from both simulations and experimental data provided precise source localizations as well as accurate spatial extents of localized sources. In comparison with studies using EEG or MEG alone, the performance of cortical source reconstructions using combined EEG and MEG was significantly improved. We demonstrated that this new sparse ESI methodology with integrated analysis of EEG and MEG data could accurately probe spatiotemporal processes of complex human brain activations. This is promising for noninvasively studying large-scale brain networks of high clinical and scientific significance. Copyright © 2011 Wiley Periodicals, Inc.
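
    The noise-level normalization described in this record can be sketched numerically. The construction below is ours, not the paper's code; the channel counts, noise levels, and the baseline-based noise estimator are all illustrative assumptions:

```python
import numpy as np

# Hypothetical sketch: EEG is recorded in volts and MEG in teslas, so the two
# cannot be stacked into one inverse problem directly. Scaling each modality
# by an estimate of its own noise level yields unit-free measures.
rng = np.random.default_rng(1)
eeg = rng.normal(0.0, 5e-6, size=(64, 200))    # 64 channels, volts
meg = rng.normal(0.0, 1e-13, size=(102, 200))  # 102 channels, teslas

def noise_normalize(data, baseline_cols=50):
    """Divide by the noise std estimated from a pre-stimulus baseline window."""
    sigma = data[:, :baseline_cols].std()
    return data / sigma

# After normalization both modalities have comparable, dimensionless scale,
# so they can be stacked into a single measurement matrix for the joint solve.
combined = np.vstack([noise_normalize(eeg), noise_normalize(meg)])
```

The same idea extends to channel-wise or covariance-based whitening when the noise is not uniform across sensors.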

  17. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  18. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  19. Monthly Mean Pressure Reconstructions for Europe (1780-1980) and North America (1858-1980) (1987) (NDP-025)

    DOE Data Explorer

    Jones, P D. [University of East Anglia, Norwich, United Kingdom; Wigley, T. M. L. [University of East Anglia, Norwich, United Kingdom; Briffa, K. R. [University of East Anglia, Norwich, United Kingdom

    2012-01-01

    Measured and reconstructed monthly mean pressure data have been compiled for Europe for 1780 through 1980 and for North America for 1858 through 1980. The reconstructions use early pressure, temperature, and precipitation data from a variety of sources including World Weather Records, meteorological and national archives, circulation maps, and daily chart series. Each record contains the year, the monthly mean pressures, a quality code, and the annual mean pressure. These reconstructed gridded monthly pressures provide a reliable historical record of mean sea-level pressures for Europe and North America. The data are in two files: pressure reconstructions for Europe (1.47 MB) and for North America (0.72 MB).

  20. [Hand reconstruction by microsurgical free toe transfer].

    PubMed

    Stamate, T; Budurcă, A R; Hermeziu, Oana

    2003-01-01

    Reconstruction of complex hand mutilations with multi-digital or thumb amputations are best treated with microsurgical toe transfers. We present the results of the first 15 cases operated by the first author, of which 12 are thumb reconstructions (6 great toe and 6 second toe transfers) and 3 long fingers reconstructions with combined second and third toe transfers. There were no microsurgical complications. Cortical integration and functional integration was achieved for all transferred toes, with discriminatory sensibility (m2PD between 5 and 13 mm) and active mobility range between 30 and 60 degrees.

  1. Partial fourier and parallel MR image reconstruction with integrated gradient nonlinearity correction.

    PubMed

    Tao, Shengzhen; Trzasko, Joshua D; Shu, Yunhong; Weavers, Paul T; Huston, John; Gray, Erin M; Bernstein, Matt A

    2016-06-01

    To describe how integrated gradient nonlinearity (GNL) correction can be used within noniterative partial Fourier (homodyne) and parallel (SENSE and GRAPPA) MR image reconstruction strategies, and demonstrate that performing GNL correction during, rather than after, these routines mitigates the image blurring and resolution loss caused by postreconstruction image-domain-based GNL correction. Starting from partial Fourier and parallel magnetic resonance imaging signal models that explicitly account for GNL, noniterative image reconstruction strategies for each accelerated acquisition technique are derived under the same core mathematical assumptions as their standard counterparts. A series of phantom and in vivo experiments on retrospectively undersampled data was performed to investigate the spatial resolution benefit of integrated GNL correction over conventional postreconstruction correction. Phantom and in vivo results demonstrate that the integrated GNL correction reduces the image blurring introduced by the conventional GNL correction, while still correcting GNL-induced coarse-scale geometrical distortion. Images generated from undersampled data using the proposed integrated GNL strategies offer superior depiction of fine image detail, for example, phantom resolution inserts and anatomical tissue boundaries. Noniterative partial Fourier and parallel imaging reconstruction methods with integrated GNL correction reduce the resolution loss that occurs during conventional postreconstruction GNL correction while preserving the computational efficiency of standard reconstruction techniques. Magn Reson Med 75:2534-2544, 2016. © 2015 Wiley Periodicals, Inc.

  2. TEM Video Compressive Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ TEM experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame times approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing methods [1, 2, 3, 4] to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical compressive sensing inversion. Our simulations show that it should be possible to increase the speed of any camera by at least an order of magnitude. Compressive sensing (CS) combines sensing and compression in one operation, and thus provides an approach that could further improve the temporal resolution while correspondingly reducing the electron dose rate. Because the signal is measured in a compressive manner, fewer total measurements are required. When applied to TEM video capture, compressive imaging could improve acquisition speed and reduce the electron dose rate. CS is a recent concept, and has come to the forefront due to the seminal work of Candès [5]. Since then, there has been enormous growth in the application of CS and the development of CS variants. For electron microscopy applications, the concept of CS has also been applied to electron tomography [6] and to the reduction of electron dose in scanning transmission electron microscopy (STEM) imaging [7].
    To demonstrate the applicability of coded aperture CS video reconstruction for atomic-level imaging, we simulate compressive sensing on observations of Pd nanoparticles and Ag nanoparticles during exposure to high temperatures and other environmental conditions. Figure 1 highlights the results from the Pd nanoparticle experiment. On the left, 10 frames are reconstructed from a single coded frame; the original frames are shown for comparison. On the right, a selection of three frames is shown from reconstructions at compression levels 10, 20, and 30. The reconstructions, which are not post-processed, are true to the original and degrade in a straightforward manner. The final choice of compression level will obviously depend on both the temporal and spatial resolution required for a specific imaging task, but the results indicate that an increase in speed of better than an order of magnitude should be possible for all experiments. References: [1] P Llull, X Liao, X Yuan et al. Optics Express 21(9) (2013), p. 10526. [2] J Yang, X Yuan, X Liao et al. Image Processing, IEEE Trans 23(11) (2014), p. 4863. [3] X Yuan, J Yang, P Llull et al. In ICIP 2013 (IEEE), p. 14. [4] X Yuan, P Llull, X Liao et al. In CVPR 2014, p. 3318. [5] EJ Candès, J Romberg and T Tao. Information Theory, IEEE Trans 52(2) (2006), p. 489. [6] P Binev, W Dahmen, R DeVore et al. In Modeling Nanoscale Imaging in Electron Microscopy, eds. T Vogt, W Dahmen and P Binev (Springer US), Nanostructure Science and Technology (2012), p. 73. [7] A Stevens, H Yang, L Carin et al. Microscopy 63(1) (2014), p. 41.
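
    The coded-aperture integration step described in this record admits a compact numerical sketch. Everything below (frame sizes, mask statistics, and the pixelwise Tikhonov solve) is our illustrative assumption; a real CS inversion would use a sparsity prior rather than the minimum-norm solve shown here:

```python
import numpy as np

# Forward model of coded-aperture compressive video: T sub-frames are each
# multiplied by a per-sub-frame binary mask and summed ("integrated") into
# one measured camera frame.
rng = np.random.default_rng(0)
H, W, T = 8, 8, 4                      # tiny frame size; temporal compression T
x = rng.random((T, H, W))              # true sub-frames (unknown to the camera)
masks = rng.integers(0, 2, size=(T, H, W)).astype(float)  # coded-aperture patterns

y = (masks * x).sum(axis=0)            # the single recorded camera frame

# Recovery: each pixel contributes one equation in T unknowns, so the system
# is underdetermined; CS methods resolve this with a sparsity prior. Here we
# show only a minimum-norm (Tikhonov) pixelwise solve to expose the linear
# structure of the inversion.
lam = 1e-3
x_hat = np.empty_like(x)
for i in range(H):
    for j in range(W):
        a = masks[:, i, j][None, :]    # 1 x T row of mask weights at this pixel
        rhs = np.array([y[i, j]])
        x_hat[:, i, j] = np.linalg.solve(a.T @ a + lam * np.eye(T), a.T @ rhs)

# The recovered sub-frames reproduce the measurement (up to the small lam).
assert np.allclose((masks * x_hat).sum(axis=0), y, atol=1e-2)
```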

  3. Compression of computer generated phase-shifting hologram sequence using AVC and HEVC

    NASA Astrophysics Data System (ADS)

    Xing, Yafei; Pesquet-Popescu, Béatrice; Dufaux, Frederic

    2013-09-01

    With the capability of achieving twice the compression ratio of Advanced Video Coding (AVC) at similar reconstruction quality, High Efficiency Video Coding (HEVC) is expected to become the new leading video coding technique. In order to reduce the storage and transmission burden of digital holograms, in this paper we propose to use HEVC for compressing phase-shifting digital hologram sequences (PSDHS). By simulating phase-shifting digital holography (PSDH) interferometry, interference patterns between illuminated three-dimensional (3D) virtual objects and the stepwise phase-changed reference wave are generated as digital holograms. The hologram sequences are obtained by the movement of the virtual objects and compressed by AVC and HEVC. The experimental results show that AVC and HEVC are efficient for compressing PSDHS, with HEVC giving better performance. Good compression rate and reconstruction quality can be obtained at bitrates above 15,000 kbps.

  4. Committed to the Honor Code: An Investment Model Analysis of Academic Integrity

    ERIC Educational Resources Information Center

    Dix, Emily L.; Emery, Lydia F.; Le, Benjamin

    2014-01-01

    Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…

  5. 3D equilibrium reconstruction with islands

    NASA Astrophysics Data System (ADS)

    Cianciosa, M.; Hirshman, S. P.; Seal, S. K.; Shafer, M. W.

    2018-04-01

    This paper presents the development of a 3D equilibrium reconstruction tool and the results of the first-ever reconstruction of an island equilibrium. The SIESTA non-nested equilibrium solver has been coupled to the V3FIT 3D equilibrium reconstruction code. Computed from a coupled VMEC and SIESTA model, synthetic signals are matched to measured signals by finding an optimal set of equilibrium parameters. By using the normalized pressure in place of normalized flux, non-equilibrium quantities needed by diagnostic signals can be efficiently mapped to the equilibrium. The effectiveness of this tool is demonstrated by reconstructing an island equilibrium of a DIII-D inner wall limited L-mode case with an n = 1 error field applied. Flat spots in Thomson and ECE temperature diagnostics show the reconstructed islands have the correct size and phase.
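
    The signal-matching step ("synthetic signals are matched to measured signals by finding an optimal set of equilibrium parameters") can be illustrated with a toy linear model. This sketch is ours, not V3FIT: the Jacobian, noise level, and parameter values are invented for illustration, and a real reconstruction iterates a nonlinear equilibrium solve.

```python
import numpy as np

# Toy signal matching: reconstruction seeks parameters p minimizing the
# chi-squared mismatch between synthetic diagnostic signals S(p) and the
# measured signals. With a linearized model S(p) = J p and uniform noise
# sigma, the weighted least-squares solution is available in closed form.
rng = np.random.default_rng(3)
n_signals, n_params = 20, 3
J = rng.normal(size=(n_signals, n_params))   # Jacobian of synthetic signals
p_true = np.array([0.8, -1.2, 0.5])          # "equilibrium parameters"
sigma = 0.01                                  # diagnostic noise level
measured = J @ p_true + rng.normal(0, sigma, n_signals)

# Weighted least squares: minimize sum(((J p - measured) / sigma)**2).
p_hat, *_ = np.linalg.lstsq(J / sigma, measured / sigma, rcond=None)
chi2 = np.sum(((J @ p_hat - measured) / sigma) ** 2)
```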

  6. Rapid Prototyping Integrated With Nondestructive Evaluation and Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.

    2001-01-01

    Most reverse engineering approaches involve imaging or digitizing an object then creating a computerized reconstruction that can be integrated, in three dimensions, into a particular design environment. Rapid prototyping (RP) refers to the practical ability to build high-quality physical prototypes directly from computer aided design (CAD) files. Using rapid prototyping, full-scale models or patterns can be built using a variety of materials in a fraction of the time required by more traditional prototyping techniques (refs. 1 and 2). Many software packages have been developed and are being designed to tackle the reverse engineering and rapid prototyping issues just mentioned. For example, image processing and three-dimensional reconstruction visualization software such as Velocity2 (ref. 3) are being used to carry out the construction process of three-dimensional volume models and the subsequent generation of a stereolithography file that is suitable for CAD applications. Producing three-dimensional models of objects from computed tomography (CT) scans is becoming a valuable nondestructive evaluation methodology (ref. 4). Real components can be rendered and subjected to temperature and stress tests using structural engineering software codes. For this to be achieved, accurate high-resolution images have to be obtained via CT scans and then processed, converted into a traditional file format, and translated into finite element models. Prototyping a three-dimensional volume of a composite structure by reading in a series of two-dimensional images generated via CT and by using and integrating commercial software (e.g. Velocity2, MSC/PATRAN (ref. 5), and Hypermesh (ref. 6)) is being applied successfully at the NASA Glenn Research Center. The building process from structural modeling to the analysis level is outlined in reference 7. 
Subsequently, a stress analysis of a composite cooling panel under combined thermomechanical loading conditions was performed to validate this process.

  7. A Qualitative Study of Breast Reconstruction Decision-Making among Asian Immigrant Women Living in the United States.

    PubMed

    Fu, Rose; Chang, Michelle Milee; Chen, Margaret; Rohde, Christine Hsu

    2017-02-01

    Despite research supporting improved psychosocial well-being, quality of life, and survival for patients undergoing postmastectomy breast reconstruction, Asian patients remain one-fifth as likely as Caucasians to choose reconstruction. This study investigates cultural factors, values, and perceptions held by Asian women that might impact breast reconstruction rates. The authors conducted semistructured interviews of immigrant East Asian women treated for breast cancer in the New York metropolitan area, investigating social structure, culture, attitudes toward surgery, and body image. Three investigators independently coded transcribed interviews, and then collectively evaluated them through axial coding of recurring themes. Thirty-five immigrant East Asian women who underwent surgical treatment for breast cancer were interviewed. Emerging themes include functionality, age, perceptions of plastic surgery, inconvenience, community/family, fear of implants, language, and information. Patients spoke about breasts as a function of their roles as a wife or mother, eliminating the need for breasts when these roles were fulfilled. Many addressed the fear of multiple operations. Quality and quantity of information, and communication with practitioners, impacted perceptions about treatment. Reconstructive surgery was often viewed as cosmetic. Community and family played a significant role in decision-making. Asian women are statistically less likely than Caucasians to pursue breast reconstruction. This is the first study to investigate culture-specific perceptions of breast reconstruction. Results from this study can be used to improve cultural competency in addressing patient concerns. Improving access to information regarding treatment options and surgical outcomes may improve informed decision-making among immigrant Asian women.

  8. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    NASA Astrophysics Data System (ADS)

    Koulouri, Alexandra; Brookes, Mike; Rimpiläinen, Ville

    2017-01-01

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
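
    A longitudinal measurement, as defined in this record, is the line integral of the field component parallel to the integration line. The following minimal numerical sketch is our construction; the fields and integration lines are toy examples, not the paper's EEG geometry:

```python
import numpy as np

# Longitudinal VT measurement: integrate field(x, y) . t_hat along a segment,
# approximated here with a midpoint-rule Riemann sum.
def longitudinal_measurement(field, p0, p1, n=1000):
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    direction = p1 - p0
    length = np.linalg.norm(direction)
    t_hat = direction / length                 # unit tangent of the line
    ts = (np.arange(n) + 0.5) / n              # midpoint sample positions
    pts = p0[None, :] + ts[:, None] * direction[None, :]
    values = np.array([field(x, y) for x, y in pts])   # (n, 2) field samples
    return (values @ t_hat).sum() * length / n

# Divergence-free example F = (-y, x): along the x-axis its parallel
# component is -y = 0, so the longitudinal measurement vanishes.
F = lambda x, y: np.array([-y, x])
m_rot = longitudinal_measurement(F, (0.0, 0.0), (1.0, 0.0))

# Gradient-field example F = (x, 0): the same measurement gives
# the integral of x over [0, 1], i.e. 1/2.
m_grad = longitudinal_measurement(lambda x, y: np.array([x, 0.0]),
                                  (0.0, 0.0), (1.0, 0.0))
```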

  9. An Inversion Method for Reconstructing Hall Thruster Plume Parameters from the Line Integrated Measurements (Preprint)

    DTIC Science & Technology

    2007-06-05

    Technical paper, 05-06-2007. Taylor S. Matlock et al. Abstract (fragment): "…dimensional estimate of the plume electron temperature using a published xenon collisional radiative model." Introduction (fragment): "The Hall thruster is a high…"

  10. A new ART code for tomographic interferometry

    NASA Technical Reports Server (NTRS)

    Tan, H.; Modarress, D.

    1987-01-01

    A new algebraic reconstruction technique (ART) code based on the iterative refinement method of least-squares solution for tomographic reconstruction is presented. The accuracy and convergence of the technique are evaluated through the application of numerically generated interferometric data. It was found that, in general, the accuracy of the results was superior to that of other reported techniques. The iterative method unconditionally converged to a solution for which the residual was minimum. The effects of increased data were studied. The inversion error was found to be a function of the input data error only. The convergence rate, on the other hand, was affected by all three parameters. Finally, the technique was applied to experimental data, and the results are reported.
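
    The record describes an iterative least-squares ART solver. The classic ART (Kaczmarz) sweep it builds on can be sketched as follows; this is the textbook algorithm, not the authors' iterative-refinement code, and the toy system is ours:

```python
import numpy as np

# Classic ART (Kaczmarz): each ray gives one linear equation a_i . x = b_i
# (the measured line integral); ART repeatedly projects the current image
# estimate onto each equation's hyperplane.
def art(A, b, n_sweeps=300, relax=1.0):
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            denom = a_i @ a_i
            if denom > 0:
                x += relax * (b_i - a_i @ x) / denom * a_i
    return x

# Consistent toy system: three "rays", each crossing two of three pixels.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
b = A @ x_true          # simulated line-integral data
x_rec = art(A, b)
```

For a consistent, full-rank system like this one, the sweeps converge to the exact solution; for noisy data the iterates approach a least-squares neighborhood, which is where iterative-refinement variants come in.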

  11. Feasibility and validation of virtual autopsy for dental identification using the Interpol dental codes.

    PubMed

    Franco, Ademir; Thevissen, Patrick; Coudyzer, Walter; Develter, Wim; Van de Voorde, Wim; Oyen, Raymond; Vandermeulen, Dirk; Jacobs, Reinhilde; Willems, Guy

    2013-05-01

    Virtual autopsy is a medical imaging technique, using full-body computed tomography (CT), allowing for noninvasive and permanent observation of all body parts. For dental identification, clinically and radiologically observed ante-mortem (AM) and post-mortem (PM) oral identifiers are compared. The study aimed to verify whether a PM dental charting can be performed on virtual reconstructions of full-body CTs using the Interpol dental codes. A sample of 103 PM full-body CTs was collected from the forensic autopsy files of the Department of Forensic Medicine, University Hospitals, KU Leuven, Belgium. For validation purposes, 3 of these bodies underwent a complete dental autopsy, a dental radiological examination, and a full-body CT examination. The bodies were scanned in a Siemens Definition Flash CT scanner (Siemens Medical Solutions, Germany). The images were examined at 8- and 12-bit screen resolution as three-dimensional (3D) reconstructions and as axial, coronal and sagittal slices. InSpace(®) (Siemens Medical Solutions, Germany) software was used for 3D reconstruction. The dental identifiers were charted on pink PM Interpol forms (F1, F2), using the related dental codes. Optimal dental charting was obtained by combining observations on 3D reconstructions and CT slices. It was not feasible to differentiate between different kinds of dental restoration materials. The 12-bit resolution enabled the collection of more detailed evidence, mainly related to positions within a tooth. Oral identifiers not implemented in the Interpol dental coding were observed. Amongst these, the observed (3D) morphological features of dental and maxillofacial structures are important identifiers. The latter can become particularly relevant in the future, not only because of the inherent spatial features, but also because of increasing preventive dental treatment and the decreasing application of dental restorations. In conclusion, PM full-body CT examinations need to be implemented in the PM dental charting protocols, and the Interpol dental codes should be adapted accordingly. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  12. SU-F-J-74: High Z Geometric Integrity and Beam Hardening Artifact Assessment Using a Retrospective Metal Artifact Reduction (MAR) Reconstruction Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, K; DiCostanzo, D; Gupta, N

    Purpose: To test the efficacy of a retrospective metal artifact reduction (MAR) reconstruction algorithm for a commercial computed tomography (CT) scanner for radiation therapy purposes. Methods: High Z geometric integrity and artifact reduction analysis was performed with three phantoms using General Electric's (GE) Discovery CT. The three phantoms included: a Computerized Imaging Reference Systems (CIRS) electron density phantom (Model 062) with a 6.5 mm diameter titanium rod insert, a custom spine phantom using Synthes Spine hardware submerged in water, and a dental phantom with various high Z fillings submerged in water. Each phantom was reconstructed using MAR and compared against the original scan. Furthermore, each scenario was tested using standard and extended Hounsfield unit (HU) ranges. High Z geometric integrity was assessed using the CIRS phantom, while artifact reduction was assessed using all three phantoms. Results: Geometric integrity of the 6.5 mm diameter rod was slightly overestimated for non-MAR scans for both standard and extended HU. With MAR reconstruction, the rod was underestimated for both standard and extended HU. For artifact reduction, the mean and standard deviation were compared in a volume of interest (VOI) in the surrounding material (water and water-equivalent material, ~0 HU). Overall, the mean value of the VOI was closer to 0 HU for the MAR reconstruction compared to the non-MAR scan for most phantoms. Additionally, the standard deviations for all phantoms were greatly reduced using MAR reconstruction. Conclusion: GE's MAR reconstruction algorithm improves image quality in the presence of high Z material with minimal degradation of its geometric integrity. High Z delineation can be carried out with proper contouring techniques. The effects of beam hardening artifacts are greatly reduced with MAR reconstruction. Tissue corrections due to these artifacts can be eliminated for simple high Z geometries and greatly reduced for more complex geometries.

  13. Graphical programming interface: A development environment for MRI methods.

    PubMed

    Zwart, Nicholas R; Pipe, James G

    2015-11-01

    To introduce a multiplatform, Python language-based, development environment called graphical programming interface for prototyping MRI techniques. The interface allows developers to interact with their scientific algorithm prototypes visually in an event-driven environment making tasks such as parameterization, algorithm testing, data manipulation, and visualization an integrated part of the work-flow. Algorithm developers extend the built-in functionality through simple code interfaces designed to facilitate rapid implementation. This article shows several examples of algorithms developed in graphical programming interface including the non-Cartesian MR reconstruction algorithms for PROPELLER and spiral as well as spin simulation and trajectory visualization of a FLORET example. The graphical programming interface framework is shown to be a versatile prototyping environment for developing numeric algorithms used in the latest MR techniques. © 2014 Wiley Periodicals, Inc.

  14. Integration of real-time 3D capture, reconstruction, and light-field display

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaoxing; Geng, Zheng; Li, Tuotuo; Pei, Renjing; Liu, Yongchun; Zhang, Xiao

    2015-03-01

    Effective integration of 3D acquisition, reconstruction (modeling), and display technologies into a seamless system provides an augmented experience of visualizing and analyzing real objects and scenes with realistic 3D sensation. Applications can be found in medical imaging, gaming, virtual or augmented reality, and hybrid simulations. Although 3D acquisition, reconstruction, and display technologies have gained significant momentum in recent years, there seems to be a lack of attention on synergistically combining these components into an "end-to-end" 3D visualization system. We designed, built, and tested an integrated 3D visualization system that is able to capture 3D light-field images in real time, perform 3D reconstruction to build a 3D model of the objects, and display the 3D model on a large autostereoscopic screen. In this article, we present our system architecture and component designs, hardware/software implementations, and experimental results. We elaborate on our recent progress on sparse-camera-array light-field 3D acquisition, real-time dense 3D reconstruction, and autostereoscopic multi-view 3D display. A prototype is finally presented with test results to illustrate the effectiveness of our proposed integrated 3D visualization system.

  15. Coded-Aperture X- or gamma-ray telescope with Least-squares image reconstruction. III. Data acquisition and analysis enhancements

    NASA Astrophysics Data System (ADS)

    Kohman, T. P.

    1995-05-01

    The design of a cosmic X- or gamma-ray telescope with least-squares image reconstruction and its simulated operation have been described (Rev. Sci. Instrum. 60, 3396 and 3410 (1989)). Use of an auxiliary open aperture ("limiter") ahead of the coded aperture limits the object field to fewer pixels than detector elements, permitting least-squares reconstruction with improved accuracy in the imaged field; it also yields a uniformly sensitive ("flat") central field. The design has been enhanced to provide for mask-antimask operation. This cancels and eliminates uncertainties in the detector background, and the simulated results have virtually the same statistical accuracy (pixel-by-pixel output-input RMSD) as with a single mask alone. The simulations have been made more realistic by incorporating instrumental blurring of sources. A second-stage least-squares procedure has been developed to determine the precise positions and total fluxes of point sources responsible for clusters of above-background pixels in the field resulting from the first-stage reconstruction. Another program converts source positions in the image plane to celestial coordinates and vice versa, the image being a gnomonic projection of a region of the sky.
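The least-squares step described above can be sketched in a few lines. This is a minimal illustration only, not the telescope code: the geometry (16 object-field pixels, 25 detector elements) and the random binary mask matrix are hypothetical stand-ins for the limiter-restricted field, which makes the system overdetermined.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical geometry: 16 object-field pixels viewed by 25 detector
# elements -- fewer pixels than detectors, as the "limiter" arranges,
# so the least-squares problem is overdetermined.
n_pix, n_det = 16, 25
A = rng.integers(0, 2, size=(n_det, n_pix)).astype(float)  # mask transmission

f_true = np.zeros(n_pix)
f_true[5] = 100.0                              # a single point source
d = A @ f_true + rng.normal(0.0, 1.0, n_det)   # noisy detector counts

# Least-squares image reconstruction of the object field
f_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
print(int(np.argmax(f_hat)))                   # brightest pixel: 5
```

Because there are more detector elements than field pixels, the solution is unique and the background-like noise stays small relative to the recovered source flux.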

  16. 3D equilibrium reconstruction with islands

    DOE PAGES

    Cianciosa, M.; Hirshman, S. P.; Seal, S. K.; ...

    2018-02-15

    This study presents the development of a 3D equilibrium reconstruction tool and the results of the first-ever reconstruction of an island equilibrium. The SIESTA non-nested equilibrium solver has been coupled to the V3FIT 3D equilibrium reconstruction code. Synthetic signals, computed from a coupled VMEC and SIESTA model, are matched to measured signals by finding an optimal set of equilibrium parameters. By using the normalized pressure in place of normalized flux, non-equilibrium quantities needed by diagnostic signals can be efficiently mapped to the equilibrium. The effectiveness of this tool is demonstrated by reconstructing an island equilibrium of a DIII-D inner-wall-limited L-mode case with an n = 1 error field applied. Finally, flat spots in Thomson and ECE temperature diagnostics show the reconstructed islands have the correct size and phase.
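The signal-matching step, finding equilibrium parameters that make synthetic signals agree with measurements, can be sketched as a small Gauss-Newton chi-square minimization. This is a miniature illustration, not V3FIT's API: the two-parameter toy `model` below merely stands in for the expensive VMEC/SIESTA synthetic-signal computation, and all names are made up.

```python
import numpy as np

def reconstruct(params0, model, measured, sigma, n_iter=20, eps=1e-6):
    """Gauss-Newton minimization of the weighted misfit between synthetic
    signals model(p) and measured signals. `model` stands in for the
    (expensive) equilibrium + synthetic-diagnostic computation."""
    p = np.array(params0, float)
    for _ in range(n_iter):
        r = (model(p) - measured) / sigma          # weighted residuals
        # finite-difference Jacobian of the residuals, one column per parameter
        J = np.column_stack([
            ((model(p + eps * e) - measured) / sigma - r) / eps
            for e in np.eye(p.size)
        ])
        p -= np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton step
    return p

# Toy "equilibrium": two parameters mapped nonlinearly to three signals
model = lambda p: np.array([p[0] + p[1], p[0] * p[1], p[0] ** 2])
measured = model(np.array([2.0, 3.0]))             # noiseless synthetic data
p_hat = reconstruct([1.0, 1.0], model, measured, sigma=1.0)
print(p_hat.round(4))                              # recovers [2. 3.]
```

For a zero-residual problem like this toy, Gauss-Newton converges quadratically once the iterate is close to the optimum.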

  17. 3D equilibrium reconstruction with islands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cianciosa, M.; Hirshman, S. P.; Seal, S. K.

    This study presents the development of a 3D equilibrium reconstruction tool and the results of the first-ever reconstruction of an island equilibrium. The SIESTA non-nested equilibrium solver has been coupled to the V3FIT 3D equilibrium reconstruction code. Synthetic signals, computed from a coupled VMEC and SIESTA model, are matched to measured signals by finding an optimal set of equilibrium parameters. By using the normalized pressure in place of normalized flux, non-equilibrium quantities needed by diagnostic signals can be efficiently mapped to the equilibrium. The effectiveness of this tool is demonstrated by reconstructing an island equilibrium of a DIII-D inner-wall-limited L-mode case with an n = 1 error field applied. Finally, flat spots in Thomson and ECE temperature diagnostics show the reconstructed islands have the correct size and phase.

  18. Chroma intra prediction based on inter-channel correlation for HEVC.

    PubMed

    Zhang, Xingyu; Gisquet, Christophe; François, Edouard; Zou, Feng; Au, Oscar C

    2014-01-01

    In this paper, we investigate a new inter-channel coding mode, called the LM mode, proposed for the next-generation video coding standard, High Efficiency Video Coding (HEVC). This mode exploits inter-channel correlation by using reconstructed luma to predict chroma linearly, with parameters derived from neighboring reconstructed luma and chroma pixels at both encoder and decoder to avoid overhead signaling. We analyze the LM mode and prove that the LM parameters for predicting original chroma and reconstructed chroma are statistically the same. We also analyze the error sensitivity of the LM parameters. We identify some situations in which the LM mode is problematic and propose three novel LM-like modes, called LMA, LML, and LMO, to address them. To limit the increase in complexity due to the LM-like modes, we propose fast algorithms with the help of new cost functions. We further identify some potentially problematic conditions in the parameter estimation (including the regression dilution problem) and introduce a novel model correction technique to detect and correct those conditions. Simulation results suggest that considerable BD-rate reduction can be achieved by the proposed LM-like modes and model correction technique. In addition, the performance gains of the two techniques appear to be essentially additive when combined.
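The core of the LM mode, a linear chroma-from-luma prediction with parameters fit to causal neighbors, can be sketched as below. This is an illustrative floating-point least-squares form; the actual HEVC proposal uses an integer approximation of the same regression, and the sample values here are made up.

```python
import numpy as np

def lm_params(neigh_luma, neigh_chroma):
    """Derive the linear-model parameters (alpha, beta) from neighboring
    reconstructed luma/chroma samples. Both encoder and decoder can run
    this on already-reconstructed pixels, so nothing is signaled."""
    L = np.asarray(neigh_luma, float)
    C = np.asarray(neigh_chroma, float)
    n = L.size
    denom = n * (L * L).sum() - L.sum() ** 2
    alpha = (n * (L * C).sum() - L.sum() * C.sum()) / denom
    beta = (C.sum() - alpha * L.sum()) / n
    return alpha, beta

def lm_predict(rec_luma_block, alpha, beta):
    """Predict the chroma block linearly from co-located reconstructed luma."""
    return alpha * np.asarray(rec_luma_block, float) + beta

# Toy neighborhood where chroma really is 0.5*luma + 10
alpha, beta = lm_params([40, 60, 80, 100], [30, 40, 50, 60])
print(lm_predict([50, 70], alpha, beta))   # -> [35. 45.]
```

The regression dilution problem mentioned in the abstract arises precisely because `neigh_luma` is itself a noisy reconstruction, which biases `alpha` toward zero.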

  19. An Inversion Method for Reconstructing Hall Thruster Plume Parameters from the Line Integrated Measurements (Postprint)

    DTIC Science & Technology

    2007-07-01

    An Inversion Method for Reconstructing Hall Thruster Plume Parameters from Line Integrated Measurements... A Hall thruster is a high specific impulse electric thruster that produces a highly ionized plasma inside an annular chamber through the use of high...

  20. Design of a multimodal fibers optic system for small animal optical imaging.

    PubMed

    Spinelli, Antonello E; Pagliazzi, Marco; Boschi, Federico

    2015-02-01

    Small animal optical imaging systems are widely used in pre-clinical research to image in vivo the bio-distribution of light-emitting probes using fluorescence or bioluminescence modalities. In this work we present a set of simulated results for a novel small animal optical imaging module based on a fiber optic matrix, coupled with a position-sensitive detector, devoted to acquiring bioluminescence and Cerenkov images. Simulations were performed using the GEANT4 code with the GAMOS architecture and the tissue optics plugin. Results showed that it is possible to image a 30 × 30 mm region of interest using a fiber optic array containing 100 optical fibers without compromising the quality of the reconstruction. The number of fibers necessary to cover an adequate portion of a small animal is thus quite modest. This design allows integrating the module with magnetic resonance (MR) in order to acquire optical and MR images at the same time. A detailed model of the mouse anatomy, obtained by segmentation of 3D MRI images, will improve the quality of the optical 3D reconstruction. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  1. Fast and accurate computation of system matrix for area integral model-based algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Zhang, Shunli; Zhang, Dinghua; Gong, Hao; Ghasemalizadeh, Omid; Wang, Ge; Cao, Guohua

    2014-11-01

    Iterative algorithms, such as the algebraic reconstruction technique (ART), are popular for image reconstruction. For iterative reconstruction, the area integral model (AIM) is more accurate, and gives better reconstruction quality, than the line integral model (LIM). However, the computation of the system matrix for AIM is more complex and time-consuming than that for LIM. Here, we propose a fast and accurate method to compute the system matrix for AIM. First, we calculate the intersection of each boundary line of a narrow fan-beam with the pixels in a recursive and efficient manner. Then, by grouping the beam-pixel intersection areas into six types according to the slopes of the two boundary lines, we analytically compute the intersection area of the narrow fan-beam with the pixels in a simple algebraic fashion. Overall, experimental results show that our method is about three times faster than the Siddon algorithm and about two times faster than the distance-driven model (DDM) in computation of the system matrix. The per-iteration reconstruction speed of our AIM-based ART is also faster than that of LIM-based ART using the Siddon algorithm and of DDM-based ART. The fast reconstruction speed of our method was accomplished without compromising image quality.
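The ART iteration itself is independent of how the system matrix is built: AIM, LIM, or DDM only change the entries of the matrix, while the row-by-row Kaczmarz update is the same. A minimal sketch, with a tiny hypothetical system matrix standing in for a real scanner geometry:

```python
import numpy as np

def art(A, b, n_sweeps=50, relax=1.0):
    """Algebraic reconstruction technique (cyclic Kaczmarz sweeps).
    A is the system matrix (rows = per-ray pixel weights), b the measured
    projections. The integral model (area vs. line) only changes how A's
    entries are computed, not this update rule."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Tiny consistent system standing in for a scanner geometry (hypothetical)
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([2.0, 3.0, 4.0])
x_hat = art(A, A @ x_true)
print(x_hat.round(3))
```

For a consistent system the sweeps converge to the true image; with noisy data a relaxation factor below 1 damps the oscillation around the least-squares solution.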

  2. Transverse Phase Space Reconstruction and Emittance Measurement of Intense Electron Beams using a Tomography Technique

    NASA Astrophysics Data System (ADS)

    Stratakis, D.; Kishek, R. A.; Li, H.; Bernal, S.; Walter, M.; Tobin, J.; Quinn, B.; Reiser, M.; O'Shea, P. G.

    2006-11-01

    Tomography is the technique of reconstructing an image from its projections. It is widely used in the medical community to observe the interior of the human body by processing multiple x-ray images taken at different angles. A few pioneering researchers have adapted tomography to reconstruct detailed phase space maps of charged particle beams. Some questions arise regarding the limitations of the tomography technique for space-charge-dominated beams. For instance, is the linear space charge force a valid approximation? Does tomography equally reproduce phase space for complex, experimentally observed initial particle distributions? Does tomography make any assumptions about the initial distribution? This study explores the use of accurate modeling with the particle-in-cell code WARP to address these questions, using a wide range of different initial distributions in the code. The study also includes a number of experimental results on tomographic phase space mapping performed on the University of Maryland Electron Ring (UMER).

  3. Format preferences of district attorneys for post-mortem medical imaging reports: understandability, cost effectiveness, and suitability for the courtroom: a questionnaire based study.

    PubMed

    Ampanozi, Garyfalia; Zimmermann, David; Hatch, Gary M; Ruder, Thomas D; Ross, Steffen; Flach, Patricia M; Thali, Michael J; Ebert, Lars C

    2012-05-01

    The objective of this study was to explore the perception of the legal authorities regarding different report types and visualization techniques for post-mortem radiological findings. A standardized digital questionnaire was developed, and the district attorneys in the catchment area of the affiliated Forensic Institute were requested to evaluate four different types of forensic imaging reports based on four case examples. Each case was described in four different report types (short written report only, gray-scale CT image with figure caption, color-coded CT image with figure caption, 3D reconstruction with figure caption). The survey participants were asked to evaluate those types of reports regarding understandability, cost effectiveness, and overall appropriateness for the courtroom. 3D reconstructions and color-coded CT images accompanied by a written report were preferred regarding understandability and cost effectiveness. 3D reconstructions of the forensic findings were rated as most appropriate for court. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Overview of the NASA Glenn Flux Reconstruction Based High-Order Unstructured Grid Code

    NASA Technical Reports Server (NTRS)

    Spiegel, Seth C.; DeBonis, James R.; Huynh, H. T.

    2016-01-01

    A computational fluid dynamics code based on the flux reconstruction (FR) method is currently being developed at NASA Glenn Research Center to ultimately provide a large-eddy simulation capability that is both accurate and efficient for complex aeropropulsion flows. The FR approach offers a simple and efficient method that is easy to implement and accurate to an arbitrary order on common grid cell geometries. The governing compressible Navier-Stokes equations are discretized in time using various explicit Runge-Kutta schemes, with the default being the 3-stage/3rd-order strong stability preserving scheme. The code is written in modern Fortran (i.e., Fortran 2008) and parallelization is attained through MPI for execution on distributed-memory high-performance computing systems. An h-refinement study of the isentropic Euler vortex problem is able to empirically demonstrate the capability of the FR method to achieve super-accuracy for inviscid flows. Additionally, the code is applied to the Taylor-Green vortex problem, performing numerous implicit large-eddy simulations across a range of grid resolutions and solution orders. The solution found by a pseudo-spectral code is commonly used as a reference solution to this problem, and the FR code is able to reproduce this solution using approximately the same grid resolution. Finally, an examination of the code's performance demonstrates good parallel scaling, as well as an implementation of the FR method with a computational cost/degree-of-freedom/time-step that is essentially independent of the solution order of accuracy for structured geometries.

  5. Neutronic calculation of fast reactors by the EUCLID/V1 integrated code

    NASA Astrophysics Data System (ADS)

    Koltashev, D. A.; Stakhanova, A. A.

    2017-01-01

    This article considers the neutronic calculation of a fast-neutron lead-cooled reactor, BREST-OD-300, by the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical, and thermohydraulic fast reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed at the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT, and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant, and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors: heat source distributions, control rod movement, reactivity changes, and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains some calculations implemented as part of EUCLID/V1 code validation: a BREST-OD-300 transient simulation (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation against MCU-FR code results. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
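The diffusion solve at the heart of such a neutronic module can be illustrated in one dimension with a single energy group. This toy sketch is not DN3D, and the material data are invented; it only shows the finite-difference structure of a fixed-source diffusion problem, -D phi'' + Sigma_a phi = S with vacuum (zero-flux) boundaries:

```python
import numpy as np

# Hypothetical homogeneous-medium data (one group, 1-D slab)
D, sigma_a, S = 1.0, 0.5, 1.0     # diffusion coeff., absorption, source
n, length = 100, 10.0             # interior mesh points, slab width
h = length / (n + 1)

# Finite-difference discretization gives a tridiagonal system
main = np.full(n, 2 * D / h**2 + sigma_a)
off = np.full(n - 1, -D / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
phi = np.linalg.solve(A, np.full(n, S))

# Several diffusion lengths from the boundaries the flux approaches
# the infinite-medium value S / Sigma_a = 2.0
print(phi[n // 2])
```

A multigroup solver repeats this structure per energy group and couples the groups through scattering and fission source terms.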

  6. Three-dimensional brain reconstruction of in vivo electrode tracks for neuroscience and neural prosthetic applications

    PubMed Central

    Markovitz, Craig D.; Tang, Tien T.; Edge, David P.; Lim, Hubert H.

    2012-01-01

    The brain is a densely interconnected network that relies on populations of neurons within and across multiple nuclei to code for features leading to perception and action. However, the neurophysiology field is still dominated by the characterization of individual neurons, rather than simultaneous recordings across multiple regions, without consistent spatial reconstruction of their locations for comparisons across studies. There are sophisticated histological and imaging techniques for performing brain reconstructions. However, what is needed is a method that is relatively easy and inexpensive to implement in a typical neurophysiology lab and provides consistent identification of electrode locations to make it widely used for pooling data across studies and research groups. This paper presents our initial development of such an approach for reconstructing electrode tracks and site locations within the guinea pig inferior colliculus (IC) to identify its functional organization for frequency coding relevant for a new auditory midbrain implant (AMI). Encouragingly, the spatial error associated with different individuals reconstructing electrode tracks for the same midbrain was less than 65 μm, corresponding to an error of ~1.5% relative to the entire IC structure (~4–5 mm diameter sphere). Furthermore, the reconstructed frequency laminae of the IC were consistently aligned across three sampled midbrains, demonstrating the ability to use our method to combine location data across animals. Hopefully, through further improvements in our reconstruction method, it can be used as a standard protocol across neurophysiology labs to characterize neural data not only within the IC but also within other brain regions to help bridge the gap between cellular activity and network function. Clinically, correlating function with location within and across multiple brain regions can guide optimal placement of electrodes for the growing field of neural prosthetics. PMID:22754502

  7. Metabolic Reconstruction of Setaria italica: A Systems Biology Approach for Integrating Tissue-Specific Omics and Pathway Analysis of Bioenergy Grasses.

    PubMed

    de Oliveira Dal'Molin, Cristiana G; Orellana, Camila; Gebbie, Leigh; Steen, Jennifer; Hodson, Mark P; Chrysanthopoulos, Panagiotis; Plan, Manuel R; McQualter, Richard; Palfreyman, Robin W; Nielsen, Lars K

    2016-01-01

    The urgent need for major gains in industrial crop productivity and in biofuel production from bioenergy grasses has reinforced attention on understanding C4 photosynthesis. Systems biology studies of C4 model plants may reveal important features of C4 metabolism. Here we chose foxtail millet (Setaria italica) as a C4 model plant and developed protocols to perform systems biology studies. As part of the systems approach, we developed and used a genome-scale metabolic reconstruction in combination with multi-omics technologies to gain more insights into the metabolism of S. italica. mRNA, protein, and metabolite abundances were measured in mature and immature stem/leaf phytomers, and the multi-omics data were integrated into the metabolic reconstruction framework to capture key metabolic features in different developmental stages of the plant. RNA-Seq reads were mapped to the S. italica genome, covering 83% of the protein-coding genes of S. italica. Besides revealing similarities and differences in the central metabolism of mature and immature tissues, transcriptome analysis indicates significant gene expression of two malic enzyme isoforms (NADP-ME and NAD-ME). Although much greater expression levels of NADP-ME genes are observed and confirmed by the corresponding protein abundances in the samples, the expression of multiple genes combined with the significant abundance of metabolites that participate in the C4 metabolism of the NAD-ME and NADP-ME subtypes suggests that S. italica may use mixed decarboxylation modes of the C4 photosynthetic pathways at different plant developmental stages. The overall analysis also indicates different levels of regulation in mature and immature tissues in carbon fixation, glycolysis, the TCA cycle, and amino acid, fatty acid, lignin, and cellulose syntheses. Altogether, the multi-omics analysis reveals different biological entities and their interrelation and regulation over plant development. With this study, we demonstrated that this systems approach is powerful enough to complement the functional metabolic annotation of bioenergy grasses.

  8. Metabolic Reconstruction of Setaria italica: A Systems Biology Approach for Integrating Tissue-Specific Omics and Pathway Analysis of Bioenergy Grasses

    PubMed Central

    de Oliveira Dal'Molin, Cristiana G.; Orellana, Camila; Gebbie, Leigh; Steen, Jennifer; Hodson, Mark P.; Chrysanthopoulos, Panagiotis; Plan, Manuel R.; McQualter, Richard; Palfreyman, Robin W.; Nielsen, Lars K.

    2016-01-01

    The urgent need for major gains in industrial crop productivity and in biofuel production from bioenergy grasses has reinforced attention on understanding C4 photosynthesis. Systems biology studies of C4 model plants may reveal important features of C4 metabolism. Here we chose foxtail millet (Setaria italica) as a C4 model plant and developed protocols to perform systems biology studies. As part of the systems approach, we developed and used a genome-scale metabolic reconstruction in combination with multi-omics technologies to gain more insights into the metabolism of S. italica. mRNA, protein, and metabolite abundances were measured in mature and immature stem/leaf phytomers, and the multi-omics data were integrated into the metabolic reconstruction framework to capture key metabolic features in different developmental stages of the plant. RNA-Seq reads were mapped to the S. italica genome, covering 83% of the protein-coding genes of S. italica. Besides revealing similarities and differences in the central metabolism of mature and immature tissues, transcriptome analysis indicates significant gene expression of two malic enzyme isoforms (NADP-ME and NAD-ME). Although much greater expression levels of NADP-ME genes are observed and confirmed by the corresponding protein abundances in the samples, the expression of multiple genes combined with the significant abundance of metabolites that participate in the C4 metabolism of the NAD-ME and NADP-ME subtypes suggests that S. italica may use mixed decarboxylation modes of the C4 photosynthetic pathways at different plant developmental stages. The overall analysis also indicates different levels of regulation in mature and immature tissues in carbon fixation, glycolysis, the TCA cycle, and amino acid, fatty acid, lignin, and cellulose syntheses. Altogether, the multi-omics analysis reveals different biological entities and their interrelation and regulation over plant development. With this study, we demonstrated that this systems approach is powerful enough to complement the functional metabolic annotation of bioenergy grasses. PMID:27559337

  9. Joint Chroma Subsampling and Distortion-Minimization-Based Luma Modification for RGB Color Images With Application.

    PubMed

    Chung, Kuo-Liang; Hsu, Tsu-Chun; Huang, Chi-Chao

    2017-10-01

    In this paper, we propose a novel and effective hybrid method, which joins conventional chroma subsampling and distortion-minimization-based luma modification, to improve the quality of the reconstructed RGB full-color image. Assume the input RGB full-color image has been transformed to a YUV image prior to compression. For each 2×2 UV block, 4:2:0 subsampling is applied to determine the subsampled U and V components, U_s and V_s. Based on U_s, V_s, and the corresponding 2×2 original RGB block, a main theorem is provided to determine the ideally modified 2×2 luma block in constant time, such that the color peak signal-to-noise ratio (CPSNR) distortion between the original 2×2 RGB block and the reconstructed 2×2 RGB block is minimized in a globally optimal sense. Furthermore, the proposed hybrid method and the delivered theorem are adjusted to handle digital time delay integration images and Bayer mosaic images, whose Bayer CFA structure is widely used in modern commercial digital cameras. Based on the IMAX, Kodak, and screen content test image sets, the experimental results demonstrate that, in high efficiency video coding, the proposed hybrid method achieves substantial quality improvement of the reconstructed RGB images, in terms of CPSNR, visual effect, CPSNR-bitrate trade-off, and Bjøntegaard delta PSNR performance, when compared with existing chroma subsampling schemes.
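The 4:2:0 subsampling step that the method builds on can be sketched as below, using simple 2×2 block averaging, which is one common choice of subsampler. The paper's actual contribution, the closed-form luma modification that minimizes CPSNR, is not reproduced here, and the sample values are made up.

```python
import numpy as np

def subsample_420_avg(U, V):
    """One common 4:2:0 choice: average each non-overlapping 2x2 chroma
    block down to a single (U_s, V_s) sample. The paper additionally
    modifies the co-located 2x2 luma block so that the reconstructed
    RGB block's CPSNR distortion is minimized (not shown here)."""
    U = np.asarray(U, float)
    V = np.asarray(V, float)
    Us = U.reshape(U.shape[0] // 2, 2, U.shape[1] // 2, 2).mean(axis=(1, 3))
    Vs = V.reshape(V.shape[0] // 2, 2, V.shape[1] // 2, 2).mean(axis=(1, 3))
    return Us, Vs

U = np.array([[10, 20], [30, 40]])
V = np.array([[0, 0], [100, 100]])
Us, Vs = subsample_420_avg(U, V)
print(Us, Vs)    # -> [[25.]] [[50.]]
```

The quality loss this averaging introduces at chroma edges (note the V block above) is exactly what the luma-modification theorem compensates for on the RGB side.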

  10. Reconstruction and downscaling of Eastern Mediterranean OSCAR satellite surface current data using DINEOF

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Andreas; Stylianou, Stavros; Georgiou, Georgios; Hajimitsis, Diofantos; Gravanis, Elias; Akylas, Evangelos

    2015-04-01

    During the last decade, Rixen (2005) and Alvera-Azkarate (2010) presented DINEOF (Data Interpolating Empirical Orthogonal Functions), an EOF-based technique to reconstruct missing data in satellite images. The DINEOF method has proved relatively successful in various experimental trials (Wang and Liu, 2013; Nikolaidis et al., 2013; 2014) and tends to be an effective and computationally affordable solution to the problem of reconstructing missing data in geophysical fields derived from satellite observations, such as chlorophyll-a, sea surface temperature, or salinity. Implementing this method in a GIS provides a more complete, integrated approach, expanding its applicability over various aspects. This may be especially useful in studies where various data of different kinds have to be examined. For this purpose, in this study we have implemented and present a GIS toolbox that automates the usage of the algorithm, incorporating the DINEOF codes provided by GHER (the GeoHydrodynamics and Environment Research group of the University of Liege) into ArcGIS®. ArcGIS® is a well-known standard in Geographical Information Systems, used over the years for various remote sensing procedures in sea and land environments alike. A case study of filling missing satellite-derived current data in the Eastern Mediterranean Sea area over a monthly period is analyzed as an example of the effectiveness and simplicity of using this toolbox. The study focuses on OSCAR satellite data (http://www.oscar.noaa.gov/) collected by the NOAA/NESDIS Operational Surface Current Processing and Data Center, from the respective products of the OSCAR Project Office of the Earth and Space Research organization, which provides free online access at unfiltered (1/3 degree) resolution. All the 5-day mean product data were successfully reconstructed.
KEY WORDS: Remote Sensing, Cyprus, Mediterranean, DINEOF, ArcGIS, data reconstruction.
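The core DINEOF idea, initializing gaps and iterating a truncated EOF (SVD) reconstruction, can be sketched as follows. This is a minimal illustration on a synthetic rank-1 field, not the GHER code: the full method also cross-validates to pick the optimal number of modes, which is omitted here.

```python
import numpy as np

def dineof_fill(X, mask, n_modes=2, n_iter=100):
    """Minimal DINEOF-style gap filling: initialize gaps with the field
    mean, then repeatedly take a truncated SVD (the leading EOFs) and
    copy its reconstruction back into the gap positions."""
    X = np.array(X, float)
    X[~mask] = X[mask].mean()           # first guess for missing points
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        X[~mask] = recon[~mask]         # update only the gaps
    return X

# Synthetic rank-1 "surface current" field with ~20% of points missing
rng = np.random.default_rng(1)
t = np.sin(np.linspace(0, 6, 40))[:, None]   # temporal mode
field = t @ rng.random((1, 30))              # 40 times x 30 locations
mask = rng.random(field.shape) > 0.2         # True where data are observed
filled = dineof_fill(field, mask, n_modes=1)
print(np.abs(filled - field).max())          # residual on the gaps is tiny
```

Because the synthetic field is exactly rank 1 and most entries are observed, the iteration converges to the true field; on real satellite data the retained modes act as a smooth, data-driven interpolator.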

  11. Integrated modeling of plasma ramp-up in DIII-D ITER-like and high bootstrap current scenario discharges

    NASA Astrophysics Data System (ADS)

    Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; Cfetr Physics Team

    2018-04-01

    Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyro-kinetic simulation. This model is demonstrated to accurately model low-Ip discharges from the EAST tokamak. The time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from equilibrium reconstruction with Motional Stark effect constraints. The predicted evolution of βN, li, and βP also agrees well with the experiments. For the ITER-like cases, the predicted electron and ion temperature profiles using TGLF_SAT0 agree closely with the experimentally measured profiles, and are demonstrably better than those from other proposed transport models. For the high bootstrap current case, the electron and ion temperature profiles are better predicted by the VX model. It is found that the SAT0 model works well at high Ip (>0.76 MA), while the VX model covers a wider range of plasma current (Ip > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.

  12. Light field reconstruction robust to signal dependent noise

    NASA Astrophysics Data System (ADS)

    Ren, Kun; Bian, Liheng; Suo, Jinli; Dai, Qionghai

    2014-11-01

    Capturing four-dimensional light field data sequentially using a coded aperture camera is an effective approach but suffers from a low signal-to-noise ratio. Although multiplexing can help raise the acquisition quality, noise is still a big issue, especially for fast acquisition. To address this problem, this paper proposes a noise-robust light field reconstruction method. First, a scene-dependent noise model is studied and incorporated into the light field reconstruction framework. Then, we derive an optimization algorithm for the final reconstruction. We build a prototype by hacking an off-the-shelf camera to capture data and prove the concept. The effectiveness of this method is validated with experiments on the real captured data.

  13. Differential Binary Encoding Method for Calibrating Image Sensors Based on IOFBs

    PubMed Central

    Fernández, Pedro R.; Lázaro-Galilea, José Luis; Gardel, Alfredo; Espinosa, Felipe; Bravo, Ignacio; Cano, Ángel

    2012-01-01

    Image transmission using incoherent optical fiber bundles (IOFBs) requires prior calibration to obtain the spatial in-out fiber correspondence necessary to reconstruct the image captured by the pseudo-sensor. This information is recorded in a look-up table called the Reconstruction Table (RT), used later for reordering the fiber positions and reconstructing the original image. This paper presents a very fast method, based on scanning images of regions encoded by a weighted binary code, to obtain the in-out correspondence. The results demonstrate that this technique yields a remarkable reduction in processing time, and the image reconstruction quality is very good compared to previous techniques based on spot or line scanning, for example. PMID:22666023
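The speed-up of binary-coded scanning over spot scanning comes from encoding N input positions in only ceil(log2 N) patterns: each output fiber's observed on/off sequence across the patterns directly spells out its input position. A toy sketch with an 8-fiber "bundle" (the permutation is a made-up stand-in for a real IOFB's scrambling; the actual method works on captured 2-D images and weighted codes):

```python
import math

def binary_patterns(n_positions):
    """Generate the scan patterns: pattern k lights every input position
    whose k-th bit is 1, so ceil(log2 N) captures suffice instead of the
    N captures a spot scan would need."""
    n_bits = max(1, math.ceil(math.log2(n_positions)))
    return [[(p >> k) & 1 for p in range(n_positions)] for k in range(n_bits)]

def decode_position(bit_sequence):
    """Recover a fiber's input position from the on/off values it showed
    across the patterns (least-significant bit first)."""
    return sum(bit << k for k, bit in enumerate(bit_sequence))

# Simulate calibrating a scrambled 8-fiber bundle
permutation = [3, 0, 6, 2, 7, 1, 5, 4]          # unknown in-out mapping
patterns = binary_patterns(8)
observed = [[pat[permutation[f]] for pat in patterns] for f in range(8)]
recovered = [decode_position(bits) for bits in observed]
print(recovered)    # the reconstruction table: equals the permutation
```

The list `recovered` plays the role of the Reconstruction Table: indexing the scrambled output by it restores the original spatial order.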

  14. NIMROD modeling of quiescent H-mode: reconstruction considerations and saturation mechanism

    NASA Astrophysics Data System (ADS)

    King, J. R.; Burrell, K. H.; Garofalo, A. M.; Groebner, R. J.; Kruger, S. E.; Pankin, A. Y.; Snyder, P. B.

    2017-02-01

    The extended-MHD NIMROD code (Sovinec and King 2010 J. Comput. Phys. 229 5803) models broadband-MHD activity from a reconstruction of a quiescent H-mode shot on the DIII-D tokamak (Luxon 2002 Nucl. Fusion 42 614). Computations with the reconstructed toroidal and poloidal ion flows exhibit low-n_φ perturbations (n_φ ≃ 1-5) that grow and saturate into a turbulent-like MHD state. The workflow used to project the reconstructed state onto the NIMROD basis functions re-solves the Grad-Shafranov equation and extrapolates profiles to include scrape-off-layer currents. Evaluation of the transport from the turbulent-like MHD state leads to a relaxation of the density and temperature profiles.

  15. Proposing national identification number on dental prostheses as universal personal identification code - A revolution in forensic odontology

    PubMed Central

    Baad, Rajendra K.; Belgaumi, Uzma; Vibhute, Nupura; Kadashetti, Vidya; Chandrappa, Pramod Redder; Gugwad, Sushma

    2015-01-01

    The proper identification of a decedent is not only important for humanitarian and emotional reasons, but also for legal and administrative purposes. During the reconstructive identification process, all necessary information is gathered from the unknown body of the victim and hence that an objective reconstructed profile can be established. Denture marking systems are being used in various situations, and a number of direct and indirect methods are reported. We propose that national identification numbers be incorporated in all removable and fixed prostheses, so as to adopt a single and definitive universal personal identification code with the aim of achieving a uniform, standardized, easy, and fast identification method worldwide for forensic identification. PMID:26005294

  16. Temporal compressive imaging for video

    NASA Astrophysics Data System (ADS)

    Zhou, Qun; Zhang, Linxia; Ke, Jun

    2018-01-01

Many situations require imagers with higher imaging speed, such as gunpowder blasting analysis and the observation of high-speed biological phenomena. However, measuring high-speed video is a challenge for camera design, especially in the infrared spectrum. In this paper, we reconstruct a high-frame-rate video from compressive video measurements using temporal compressive imaging (TCI) with a temporal compression ratio T=8. This means that 8 unique high-speed temporal frames are obtained from a single compressive frame using a reconstruction algorithm; equivalently, the video frame rate is increased by a factor of 8. Two methods, the two-step iterative shrinkage/thresholding (TwIST) algorithm and the Gaussian mixture model (GMM) method, are used for reconstruction. To reduce reconstruction time and memory usage, each frame of size 256×256 is divided into patches of size 8×8. The influence of different coded masks on reconstruction is discussed, and the reconstruction qualities obtained with TwIST and GMM are compared.
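The TCI forward model described above can be sketched in a few lines of numpy: T = 8 masked frames are summed into one coded snapshot, so each pixel contributes one equation in 8 unknowns. This is a minimal illustration (not the paper's code; the mask statistics and patch handling are assumptions), and it shows why a prior such as TwIST's regularizer or a patch GMM is needed: the plain minimum-norm solution matches the measurement but not the video.

```python
import numpy as np

rng = np.random.default_rng(0)
T, P = 8, 8                       # temporal compression ratio and patch size

# T high-speed frames of one 8x8 patch, and one random binary mask per frame
frames = rng.random((T, P, P))
masks = rng.integers(0, 2, size=(T, P, P)).astype(float)

# forward model: one coded measurement integrates all T masked frames
measurement = (masks * frames).sum(axis=0)          # shape (8, 8)

# stack the model as y = A x: 64 equations, 8 * 64 = 512 unknowns
A = np.zeros((P * P, T * P * P))
for t in range(T):
    A[np.arange(P * P), t * P * P + np.arange(P * P)] = masks[t].ravel()
y = measurement.ravel()

# the system is underdetermined, so the minimum-norm solution reproduces
# the measurement exactly but not the frames; a prior (TwIST's regularizer
# or a GMM over patches) is what makes actual recovery possible
x_min_norm = np.linalg.lstsq(A, y, rcond=None)[0]
print(np.allclose(A @ x_min_norm, y))               # consistent with the data
print(np.allclose(x_min_norm, frames.ravel()))      # but not the true video
```

The patch size of 8×8 mirrors the abstract; dividing a 256×256 frame into such patches simply repeats this small solve 1024 times.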

  17. More IMPATIENT: A Gridding-Accelerated Toeplitz-based Strategy for Non-Cartesian High-Resolution 3D MRI on GPUs

    PubMed Central

    Gai, Jiading; Obeid, Nady; Holtrop, Joseph L.; Wu, Xiao-Long; Lam, Fan; Fu, Maojing; Haldar, Justin P.; Hwu, Wen-mei W.; Liang, Zhi-Pei; Sutton, Bradley P.

    2013-01-01

Several recent methods have been proposed to obtain significant speed-ups in MRI image reconstruction by leveraging the computational power of GPUs. Previously, we implemented a GPU-based image reconstruction technique called the Illinois Massively Parallel Acquisition Toolkit for Image reconstruction with ENhanced Throughput in MRI (IMPATIENT MRI) for reconstructing data collected along arbitrary 3D trajectories. In this paper, we improve IMPATIENT by removing computational bottlenecks with a gridding approach that accelerates the computation of various data structures needed by the previous routine. Further, we enhance the routine with capabilities for off-resonance correction and multi-sensor parallel imaging reconstruction. By incorporating optimized gridding into our iterative reconstruction scheme, the improved GPU implementation achieves speed-ups of more than a factor of 200 over the previous accelerated GPU code. PMID:23682203
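The Toeplitz-based strategy named in the title rests on a simple identity that a 1D toy makes concrete: for non-Cartesian samples and a Cartesian image grid, the normal operator AᴴA depends only on index differences, so it is a Toeplitz matrix that can be embedded in a circulant of twice the size and applied with two FFTs. The sketch below (dimensions and sampling pattern invented for illustration; the real toolkit works in 3D with field correction) verifies the fast apply against the direct product.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 64, 200                       # grid size, number of non-Cartesian samples

freqs = rng.uniform(-0.5, 0.5, M)    # arbitrary (non-Cartesian) k-space locations
n = np.arange(N)
A = np.exp(-2j * np.pi * freqs[:, None] * n[None, :])   # forward model matrix

# (A^H A)_{mn} = sum_k exp(2j*pi*f_k*(m - n)) depends only on m - n: Toeplitz
t = np.exp(2j * np.pi * freqs[:, None]
           * np.arange(-(N - 1), N)[None, :]).sum(axis=0)

# embed the Toeplitz kernel in a circulant of length 2N and apply it via FFT
c = np.concatenate([t[N - 1:], np.zeros(1), t[:N - 1]])
kernel_fft = np.fft.fft(c)

def toeplitz_AHA(x):
    """Apply A^H A with two FFTs instead of an O(M*N) matrix product."""
    xp = np.concatenate([x, np.zeros(N)])
    return np.fft.ifft(kernel_fft * np.fft.fft(xp))[:N]

x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
direct = A.conj().T @ (A @ x)
fast = toeplitz_AHA(x)
print(np.allclose(direct, fast))     # identical up to round-off
```

The gridding step in the paper accelerates the one-time computation of the kernel t itself; once t is in hand, every CG iteration costs only FFTs.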

  18. An Interactive and Comprehensive Working Environment for High-Energy Physics Software with Python and Jupyter Notebooks

    NASA Astrophysics Data System (ADS)

    Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.

    2017-10-01

Today’s analyses for high-energy physics (HEP) experiments involve processing a large amount of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. In recent years, interactive programming environments such as Jupyter have become popular. Jupyter makes it possible to develop Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches are the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and only requires a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This makes it possible to develop code in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via JupyterHub with Docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.

  19. From Panoramic Photos to a Low-Cost Photogrammetric Workflow for Cultural Heritage 3d Documentation

    NASA Astrophysics Data System (ADS)

    D'Annibale, E.; Tassetti, A. N.; Malinverni, E. S.

    2013-07-01

The research aims to optimize a workflow for architecture documentation: starting from panoramic photos, it draws on available instruments and technologies to propose an integrated, quick and low-cost solution for Virtual Architecture. The broader research background shows how to use spherical panoramic images for architectural metric survey. The input data (oriented panoramic photos), the level of reliability and Image-based Modeling methods constitute an integrated and flexible 3D reconstruction approach: from the professional survey of cultural heritage to its communication in a virtual museum. The proposed work results from the integration and implementation of different techniques (Multi-Image Spherical Photogrammetry, Structure from Motion, Image-based Modeling) with the aim of achieving high metric accuracy and photorealistic performance. Different documentation options are possible within the proposed workflow: from the virtual navigation of spherical panoramas to complex solutions of simulation and virtual reconstruction. VR tools allow the integration of different technologies and the development of new solutions for virtual navigation. Image-based Modeling techniques allow 3D model reconstruction with photorealistic, high-resolution texture. The high resolution of the panoramic photos and the algorithms for panorama orientation and photogrammetric restitution guarantee high accuracy and high-resolution texture. Automated techniques and their subsequent integration are the subject of this research. The data, suitably processed and integrated, provide different levels of analysis and virtual reconstruction, joining photogrammetric accuracy to the photorealistic performance of the shaped surfaces. Lastly, a new solution for virtual navigation is tested: inside a single environment, it offers the ability to interact with a high-resolution oriented spherical panorama and the 3D reconstructed model at once.

  20. An Energy-Efficient Compressive Image Coding for Green Internet of Things (IoT).

    PubMed

    Li, Ran; Duan, Xiaomeng; Li, Xu; He, Wei; Li, Yanling

    2018-04-17

Aimed at the low energy consumption required by the Green Internet of Things (IoT), this paper presents an energy-efficient compressive image coding scheme, which provides a compressive encoder and a real-time decoder according to Compressive Sensing (CS) theory. The compressive encoder adaptively measures each image block based on the block-based gradient field, which models the distribution of block sparse degree, and the real-time decoder linearly reconstructs each image block through a projection matrix, which is learned by the Minimum Mean Square Error (MMSE) criterion. Both the encoder and decoder have a low computational complexity, so they consume only a small amount of energy. Experimental results show that the proposed scheme not only has a low encoding and decoding complexity compared with traditional methods, but also provides good objective and subjective reconstruction qualities. In particular, it presents better time-distortion performance than JPEG. Therefore, the proposed compressive image coding is a potential energy-efficient scheme for Green IoT.
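The key to the real-time decoder above is that reconstruction is a single matrix product with a projection matrix fitted offline by a least-squares (empirical MMSE) criterion. The following numpy sketch illustrates the idea on synthetic data; the subspace model, dimensions, and measurement matrix are all invented stand-ins for the structured image blocks and learned operators of the actual scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
d, m, r, N = 64, 16, 8, 500   # signal dim, measurements, latent dim, samples

# training signals confined to an r-dimensional subspace (a stand-in for
# the structure that real image blocks exhibit)
basis = rng.standard_normal((d, r))
X = rng.standard_normal((N, r)) @ basis.T          # N x d training signals
Phi = rng.standard_normal((m, d)) / np.sqrt(m)     # CS measurement matrix
Y = X @ Phi.T                                      # N x m measurements

# empirical MMSE decoder: W minimizes ||X - Y W^T||_F^2 over linear maps
W = np.linalg.lstsq(Y, X, rcond=None)[0].T         # d x m projection matrix

# decoding is one matrix-vector product, cheap enough for an
# energy-constrained IoT node
x = rng.standard_normal(r) @ basis.T
x_hat = W @ (Phi @ x)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))   # essentially zero here
```

Because the training signals lie exactly in an 8-dimensional subspace and 16 measurements are taken, the linear decoder is exact in this toy; on real image blocks the same fit trades a small bias for a decoder with no iterative optimization at all.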

  1. Evolutionary Construction of Block-Based Neural Networks in Consideration of Failure

    NASA Astrophysics Data System (ADS)

    Takamori, Masahito; Koakutsu, Seiichi; Hamagami, Tomoki; Hirata, Hironori

In this paper we propose a modified gene coding and an evolutionary construction method that accounts for failure in the evolutionary construction of Block-Based Neural Networks (BBNNs). In the modified gene coding, we arrange the weight genes on a chromosome according to the positional relation between the weight and structure genes. The modified gene coding increases the efficiency of the search performed by crossover, which is expected to improve the convergence rate of construction and shorten construction time. In the failure-aware evolutionary construction, a structure adapted to the failure is built in the state where the failure has occurred, so that a BBNN can be reconstructed in a short time when a failure occurs. To evaluate the proposed method, we apply it to pattern classification and autonomous mobile robot control problems. The computational experiments indicate that the proposed method improves the convergence rate of construction and shortens construction and reconstruction times.

  2. Abstract ID: 176 Geant4 implementation of inter-atomic interference effect in small-angle coherent X-ray scattering for materials of medical interest.

    PubMed

    Paternò, Gianfranco; Cardarelli, Paolo; Contillo, Adriano; Gambaccini, Mauro; Taibi, Angelo

    2018-01-01

Advanced applications of digital mammography such as dual-energy and tomosynthesis require multiple exposures and thus deliver a higher dose compared to standard mammograms. A straightforward way to reduce patient dose without affecting image quality would be the removal of the anti-scatter grid, provided that the involved reconstruction algorithms are able to take the scatter distribution into account [1]. Monte Carlo simulations are very well suited for the calculation of X-ray scatter distributions and can be used to integrate such information within the reconstruction software. Geant4 is an open source C++ particle tracking code widely used in several fields of physics, including medical physics [2,3]. However, the coherent scattering cross section used by the standard Geant4 code does not take into account the influence of molecular interference. According to the independent atomic scattering approximation (the so-called free-atom model), coherent radiation is indistinguishable from primary radiation because its angular distribution is peaked in the forward direction. Since interference effects occur between X-rays scattered by neighbouring atoms in matter, it was shown experimentally that the scatter distribution is affected by the molecular structure of the target, even in amorphous materials. The most important consequence is that the coherent scatter distribution is not peaked in the forward direction, and the position of the maximum is strongly material-dependent [4]. In this contribution, we present the implementation of a method to take into account inter-atomic interference in small-angle coherent scattering in Geant4, including a dedicated data set of suitable molecular form factor values for several materials of clinical interest. Furthermore, we present scatter images of simple geometric phantoms in which the Rayleigh contribution is rigorously evaluated. Copyright © 2017.

  3. Integrated Approach to Reconstruction of Microbial Regulatory Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodionov, Dmitry A; Novichkov, Pavel S

    2013-11-04

This project had the goal of developing an integrated bioinformatics platform for genome-scale inference and visualization of transcriptional regulatory networks (TRNs) in bacterial genomes. The work was done at the Sanford-Burnham Medical Research Institute (SBMRI, P.I. D.A. Rodionov) and Lawrence Berkeley National Laboratory (LBNL, co-P.I. P.S. Novichkov). The developed computational resources include: (1) the RegPredict web platform for TRN inference and regulon reconstruction in microbial genomes, and (2) the RegPrecise database for collection, visualization and comparative analysis of transcriptional regulons reconstructed by comparative genomics. These analytical resources were selected as key components in the DOE Systems Biology KnowledgeBase (SBKB). The high-quality data accumulated in RegPrecise will provide essential datasets of reference regulons in diverse microbes to enable automatic reconstruction of draft TRNs in newly sequenced genomes. We outline our progress toward the three aims of this grant proposal, which were: develop an integrated platform for genome-scale regulon reconstruction; infer regulatory annotations in several groups of bacteria and build reference collections of microbial regulons; and develop a KnowledgeBase on microbial transcriptional regulation.

  4. Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, Kimiko o

    2010-01-01

The calculus of residues is applied to evaluate certain integrals over the range (-∞, ∞) using the Maple symbolic code. These integrals are of the form ∫_{-∞}^{∞} cos(x)/[(x² + a²)(x² + b²)(x² + c²)] dx and similar extensions. The Maple code is also applied to expressions in maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
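For the quoted integral, closing the contour in the upper half-plane picks up the poles of e^{iz}/[(z² + a²)(z² + b²)(z² + c²)] at ia, ib, ic, giving (for distinct positive a, b, c)

∫_{-∞}^{∞} cos(x)/[(x² + a²)(x² + b²)(x² + c²)] dx = π [ e^{-a}/(a(b² - a²)(c² - a²)) + e^{-b}/(b(a² - b²)(c² - b²)) + e^{-c}/(c(a² - c²)(b² - c²)) ].

A quick numerical cross-check of this closed form in Python (standing in for the Maple code):

```python
import numpy as np

def residue_value(a, b, c):
    """Closed form of I = ∫ cos(x)/[(x²+a²)(x²+b²)(x²+c²)] dx over (-∞, ∞),
    obtained from the residues of e^{iz}/∏(z²+k²) at the poles ia, ib, ic."""
    return np.pi * (np.exp(-a) / (a * (b**2 - a**2) * (c**2 - a**2))
                  + np.exp(-b) / (b * (a**2 - b**2) * (c**2 - b**2))
                  + np.exp(-c) / (c * (a**2 - c**2) * (b**2 - c**2)))

# numerical cross-check: the integrand decays like x**-6, so truncating
# the tails at |x| = 50 introduces an error far below the tolerance
a, b, c = 1.0, 2.0, 3.0
x = np.linspace(-50.0, 50.0, 100_001)
f = np.cos(x) / ((x**2 + a**2) * (x**2 + b**2) * (x**2 + c**2))
numeric = (f[:-1] + f[1:]).sum() * (x[1] - x[0]) / 2   # trapezoid rule
print(residue_value(a, b, c), numeric)                 # the two values agree
```

This is exactly the kind of spot-check the abstract describes: the symbolic result is compact here, but for the likelihood-moment expressions it can be extremely long, making a numerical comparison at sample parameter values worthwhile.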

  5. Maximum entropy reconstruction of poloidal magnetic field and radial electric field profiles in tokamaks

    NASA Astrophysics Data System (ADS)

    Chen, Yihang; Xiao, Chijie; Yang, Xiaoyi; Wang, Tianbo; Xu, Tianchao; Yu, Yi; Xu, Min; Wang, Long; Lin, Chen; Wang, Xiaogang

    2017-10-01

The Laser-driven Ion beam trace probe (LITP) is a new diagnostic method for measuring the poloidal magnetic field (Bp) and radial electric field (Er) in tokamaks. LITP injects a laser-driven ion beam into the tokamak, and Bp and Er profiles can be reconstructed using tomography methods. A reconstruction code has been developed to validate the LITP theory, and both 2D reconstruction of Bp and simultaneous reconstruction of Bp and Er have been attained. To reconstruct from experimental data with noise, Maximum Entropy and Gaussian-Bayesian tomography methods were applied and improved according to the characteristics of the LITP problem. With these improved methods, a reconstruction error level below 15% has been attained with a data noise level of 10%. These methods will be further tested and applied in forthcoming LITP experiments. Supported by the ITER-CHINA program 2015GB120001, CHINA MOST under 2012YQ030142 and the National Natural Science Foundation of China under 11575014 and 11375053.
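The Gaussian-Bayesian branch of the tomography mentioned above reduces, in its simplest form, to ridge (Tikhonov) regularized inversion of chord-integrated data. The toy below is only a schematic of that idea: the geometry matrix, profile shape, and noise level are all invented, and the real LITP reconstruction uses beam-trace geometry and iterative Maximum Entropy updates.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_chords = 20, 30

# geometry matrix: entry (i, j) plays the role of the length of chord i
# inside cell j (random positive weights stand in for real beam traces)
G = rng.uniform(0.0, 1.0, (n_chords, n_cells))

# a smooth "true" radial profile and its chord-integrated measurements
r = np.linspace(0.0, 1.0, n_cells)
profile = np.exp(-(r / 0.5) ** 2)
data = G @ profile + 0.01 * rng.standard_normal(n_chords)   # noisy data

# Gaussian-Bayesian (ridge/Tikhonov) inversion: the Gaussian prior keeps
# the ill-conditioned inversion from amplifying the measurement noise
lam = 1e-2
recon = np.linalg.solve(G.T @ G + lam * np.eye(n_cells), G.T @ data)

rel_err = np.linalg.norm(recon - profile) / np.linalg.norm(profile)
print(rel_err)    # small relative error despite the noise
```

Raising lam strengthens the prior and suppresses noise at the cost of bias; tuning that trade-off against the diagnostic's noise level is what the abstract's reported 15% error at 10% noise reflects.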

  6. An international comparison of reimbursement for DIEAP flap breast reconstruction.

    PubMed

    Reid, A W N; Szpalski, C; Sheppard, N N; Morrison, C M; Blondeel, P N

    2015-11-01

The deep inferior epigastric artery perforator (DIEAP) flap is currently considered the gold standard for autologous breast reconstruction. With the current economic climate and health cutbacks, we decided to survey reimbursement for DIEAP flaps performed at the main international centres in order to assess whether they are funded consistently. Data were collected confidentially from the main international centres by an anonymous questionnaire. Our results illustrate the wide disparity in international DIEAP flap breast reconstruction reimbursement: a unilateral DIEAP flap performed in New York, USA, attracts €20,759, whereas the same operation in Madrid, Spain, will only be reimbursed for €300. Only 35.7% of the surgeons can set up their own fee. Moreover, 85.7% of the participants estimated that the current fees are insufficient, and most of them feel that we are evolving towards an even lower reimbursement rate. In 55.8% of the countries represented, there is no DIEAP-specific coding; in comparison, 74.4% of the represented countries have a specific coding for transverse rectus abdominis (TRAM) flaps. Finally, despite the fact that DIEAP flaps have become the gold standard for breast reconstruction, they comprise only a small percentage of all the total number of breast reconstruction procedures performed (7-15%), with the only exception being Belgium (40%). Our results demonstrate that DIEAP flap breast reconstruction is inconsistently funded. Unfortunately though, it appears that the current reimbursement offered by many countries may dissuade institutions and surgeons from offering this procedure. However, substantial evidence exists supporting the cost-effectiveness of perforator flaps for breast reconstruction, and, in our opinion, the long-term clinical benefits for our patients are so important that this investment of time and money is absolutely essential. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. Local Laplacian Coding From Theoretical Analysis of Local Coding Schemes for Locally Linear Classification.

    PubMed

    Pang, Junbiao; Qin, Lei; Zhang, Chunjie; Zhang, Weigang; Huang, Qingming; Yin, Baocai

    2015-12-01

Local coordinate coding (LCC) is a framework for approximating a Lipschitz smooth function by combining linear functions into a nonlinear one. For locally linear classification, LCC requires a coding scheme that heavily determines the nonlinear approximation ability, posing two main challenges: 1) locality, so that faraway anchors have less influence on the current datum, and 2) flexibility, balancing the reconstruction of the current datum against locality. In this paper, we approach the problem through a theoretical analysis of the simplest local coding schemes, i.e., local Gaussian coding and local Student coding, and propose local Laplacian coding (LPC) to achieve both locality and flexibility. We apply LPC to locally linear classifiers to solve diverse classification tasks. Performance comparable to or exceeding that of state-of-the-art methods demonstrates the effectiveness of the proposed method.
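The contrast between the coding schemes analyzed above can be made concrete with normalized kernel weights: a Gaussian kernel decays with squared distance, while a Laplacian kernel's heavier tail lets faraway anchors retain slightly more influence. This is only an illustrative sketch of the weighting idea (the anchors, data, and bandwidth are invented; the paper's LPC additionally solves a coding objective rather than just evaluating kernels).

```python
import numpy as np

rng = np.random.default_rng(4)
anchors = rng.standard_normal((16, 2))     # anchor points (codebook)
x = rng.standard_normal(2)                 # datum to encode
sigma = 1.0

dist = np.linalg.norm(anchors - x, axis=1)

# local Gaussian coding: weights fall off with squared distance
w_gauss = np.exp(-dist**2 / (2 * sigma**2))
w_gauss /= w_gauss.sum()

# local Laplacian coding: heavier tail, so faraway anchors keep a little
# more influence while nearby anchors still dominate
w_lap = np.exp(-dist / sigma)
w_lap /= w_lap.sum()

# either weighting yields a locally linear reconstruction of x
x_rec = w_lap @ anchors
print(np.argmax(w_gauss) == np.argmin(dist))   # nearest anchor dominates
print(np.argmax(w_lap) == np.argmin(dist))     # under both schemes
```

Both weight vectors sum to one by construction, so the reconstruction stays an affine combination of anchors, which is the property locally linear classifiers exploit.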

  8. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. The main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
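The geometric core of linear vertex fitting, stripped of track uncertainties, is a small least-squares problem: find the point minimizing the summed squared distances to a set of straight tracks. The sketch below shows that closed-form solve; it is a pedagogical illustration of the ingredient behind toolkits like RAVE, not RAVE's actual API (a Kalman vertex fit additionally weights each track by its covariance).

```python
import numpy as np

def fit_vertex(points, directions):
    """Least-squares vertex: the point minimizing the summed squared
    distances to straight tracks given as (point, unit direction) pairs."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the track
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# three non-parallel tracks that all pass through (1, 2, 3)
true_vertex = np.array([1.0, 2.0, 3.0])
dirs = [np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.0]),
        np.array([1.0, 1.0, 1.0])]
pts = [true_vertex + 2.5 * d for d in dirs]    # any point on each track
vertex = fit_vertex(pts, dirs)
print(vertex)    # recovers (1, 2, 3)
```

The normal matrix A is invertible as long as the tracks are not all parallel; robust estimation methods replace the squared distance with a down-weighting loss so that outlier tracks do not pull the vertex.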

  9. Real-time diamagnetic flux measurements on ASDEX Upgrade.

    PubMed

    Giannone, L; Geiger, B; Bilato, R; Maraschek, M; Odstrčil, T; Fischer, R; Fuchs, J C; McCarthy, P J; Mertens, V; Schuhbeck, K H

    2016-05-01

    Real-time diamagnetic flux measurements are now available on ASDEX Upgrade. In contrast to the majority of diamagnetic flux measurements on other tokamaks, no analog summation of signals is necessary for measuring the change in toroidal flux or for removing contributions arising from unwanted coupling to the plasma and poloidal field coil currents. To achieve the highest possible sensitivity, the diamagnetic measurement and compensation coil integrators are triggered shortly before plasma initiation when the toroidal field coil current is close to its maximum. In this way, the integration time can be chosen to measure only the small changes in flux due to the presence of plasma. Two identical plasma discharges with positive and negative magnetic field have shown that the alignment error with respect to the plasma current is negligible. The measured diamagnetic flux is compared to that predicted by TRANSP simulations. The poloidal beta inferred from the diamagnetic flux measurement is compared to the values calculated from magnetic equilibrium reconstruction codes. The diamagnetic flux measurement and TRANSP simulation can be used together to estimate the coupled power in discharges with dominant ion cyclotron resonance heating.

  10. Sparse coded image super-resolution using K-SVD trained dictionary based on regularized orthogonal matching pursuit.

    PubMed

    Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook

    2015-01-01

Image super-resolution (SR) plays a vital role in medical imaging, allowing a more efficient and effective diagnosis process. Diagnosis is usually difficult and inaccurate from low-resolution (LR) and noisy images, and resolution enhancement through conventional interpolation methods strongly affects the precision of subsequent processing steps, such as segmentation and registration. Therefore, we propose an efficient sparse coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of the sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhances the dictionary learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting OMP with ROMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization step yields a high-quality, high-resolution output. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is comparatively better than that of other state-of-the-art schemes.
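The ROMP step can be sketched compactly: per iteration, take the k largest correlations with the residual, split them into groups of comparable magnitude (within a factor of two, the regularization that gives ROMP its robustness), add the most energetic group to the support, and re-fit by least squares. This is a simplified illustration following Needell and Vershynin's description, not the paper's implementation, demonstrated on a toy orthonormal dictionary where recovery is exact.

```python
import numpy as np

def romp(D, y, k, tol=1e-10):
    """Regularized OMP (sketch): greedy sparse coding of y over dictionary D."""
    n = D.shape[1]
    support, x = [], np.zeros(n)
    r = y.astype(float).copy()
    for _ in range(k):
        mags = np.abs(D.T @ r)
        cand = [i for i in np.argsort(mags)[::-1][:k] if mags[i] > tol]
        if not cand:
            break
        # regularization: among descending runs of candidates, keep the
        # most energetic group whose magnitudes lie within a factor of 2
        best, best_energy = [], -1.0
        for s in range(len(cand)):
            group = [i for i in cand[s:] if mags[i] >= mags[cand[s]] / 2]
            energy = float(np.sum(mags[group] ** 2))
            if energy > best_energy:
                best, best_energy = group, energy
        support = sorted(set(support) | set(best))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # re-fit
        x = np.zeros(n)
        x[support] = coef
        r = y - D @ x
        if np.linalg.norm(r) < tol:
            break
    return x

# toy check on an orthonormal dictionary, where recovery is exact
rng = np.random.default_rng(5)
D = np.linalg.qr(rng.standard_normal((32, 32)))[0]
x_true = np.zeros(32)
x_true[[3, 11, 27]] = [1.5, -2.0, 0.7]
x_hat = romp(D, D @ x_true, k=3)
print(np.allclose(x_hat, x_true))    # True
```

In the K-SVD pipeline of the abstract, this routine replaces the OMP call in the sparse coding stage; the dictionary would be overcomplete and learned from image patches rather than orthonormal.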

  11. Limitations of bootstrap current models

    DOE PAGES

    Belli, Emily A.; Candy, Jefferey M.; Meneghini, Orso; ...

    2014-03-27

We assess the accuracy and limitations of two analytic models of the tokamak bootstrap current: (1) the well-known Sauter model and (2) a recent modification of the Sauter model by Koh et al. For this study, we use simulations from the first-principles kinetic code NEO as the baseline to which the models are compared. Tests are performed using both theoretical parameter scans as well as core-to-edge scans of real DIII-D and NSTX plasma profiles. The effects of extreme aspect ratio, large impurity fraction, energetic particles, and high collisionality are studied. In particular, the error in neglecting cross-species collisional coupling, an approximation inherent to both analytic models, is quantified. Moreover, the implications of the corrections from kinetic NEO simulations on MHD equilibrium reconstructions are studied via integrated modeling with kinetic EFIT.

  12. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, X., E-mail: xzm0005@auburn.edu; Maurer, D. A.; Knowlton, S. F.

    2015-12-15

Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  13. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Maurer, D. A.; Knowlton, S. F.; ArchMiller, M. C.; Cianciosa, M. R.; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Hebert, J. D.; Herfindal, J. L.; Pandya, M. D.; Roberds, N. A.; Traverso, P. J.

    2015-12-01

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  14. 3D tomographic reconstruction using geometrical models

    NASA Astrophysics Data System (ADS)

    Battle, Xavier L.; Cunningham, Gregory S.; Hanson, Kenneth M.

    1997-04-01

    We address the issue of reconstructing an object of constant interior density in the context of 3D tomography where there is prior knowledge about the unknown shape. We explore the direct estimation of the parameters of a chosen geometrical model from a set of radiographic measurements, rather than performing operations (segmentation for example) on a reconstructed volume. The inverse problem is posed in the Bayesian framework. A triangulated surface describes the unknown shape and the reconstruction is computed with a maximum a posteriori (MAP) estimate. The adjoint differentiation technique computes the derivatives needed for the optimization of the model parameters. We demonstrate the usefulness of the approach and emphasize the techniques of designing forward and adjoint codes. We use the system response of the University of Arizona Fast SPECT imager to illustrate this method by reconstructing the shape of a heart phantom.

  15. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE PAGES

    Ma, X.; Maurer, D. A.; Knowlton, Stephen F.; ...

    2015-12-22

Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. Lastly, the inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  16. Comparison of Two Coronal Magnetic Field Models to Reconstruct a Sigmoidal Solar Active Region with Coronal Loops

    NASA Astrophysics Data System (ADS)

    Duan, Aiying; Jiang, Chaowei; Hu, Qiang; Zhang, Huai; Gary, G. Allen; Wu, S. T.; Cao, Jinbin

    2017-06-01

Magnetic field extrapolation is an important tool to study the three-dimensional (3D) solar coronal magnetic field, which is difficult to directly measure. Various analytic models and numerical codes exist, but their results often drastically differ. Thus, a critical comparison of the modeled magnetic field lines with the observed coronal loops is strongly required to establish the credibility of the model. Here we compare two different non-potential extrapolation codes, a nonlinear force-free field code (CESE-MHD-NLFFF) and a non-force-free field (NFFF) code, in modeling a solar active region (AR) that has a sigmoidal configuration just before a major flare erupted from the region. A 2D coronal-loop tracing and fitting method is employed to study the 3D misalignment angles between the extrapolated magnetic field lines and the EUV loops as imaged by SDO/AIA. It is found that the CESE-MHD-NLFFF code with preprocessed magnetogram performs the best, outputting a field that matches the coronal loops in the AR core imaged in AIA 94 Å with a misalignment angle of ˜10°. This suggests that the CESE-MHD-NLFFF code, even without using the information of the coronal loops in constraining the magnetic field, performs as well as some coronal-loop forward-fitting models. For the loops as imaged by AIA 171 Å in the outskirts of the AR, all the codes including the potential field give comparable results of the mean misalignment angle (˜30°). Thus, further improvement of the codes is needed for a better reconstruction of the long loops enveloping the core region.

  17. Comparison of Two Coronal Magnetic Field Models to Reconstruct a Sigmoidal Solar Active Region with Coronal Loops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Aiying; Zhang, Huai; Jiang, Chaowei

Magnetic field extrapolation is an important tool to study the three-dimensional (3D) solar coronal magnetic field, which is difficult to directly measure. Various analytic models and numerical codes exist, but their results often drastically differ. Thus, a critical comparison of the modeled magnetic field lines with the observed coronal loops is strongly required to establish the credibility of the model. Here we compare two different non-potential extrapolation codes, a nonlinear force-free field code (CESE–MHD–NLFFF) and a non-force-free field (NFFF) code, in modeling a solar active region (AR) that has a sigmoidal configuration just before a major flare erupted from the region. A 2D coronal-loop tracing and fitting method is employed to study the 3D misalignment angles between the extrapolated magnetic field lines and the EUV loops as imaged by SDO/AIA. It is found that the CESE–MHD–NLFFF code with preprocessed magnetogram performs the best, outputting a field that matches the coronal loops in the AR core imaged in AIA 94 Å with a misalignment angle of ∼10°. This suggests that the CESE–MHD–NLFFF code, even without using the information of the coronal loops in constraining the magnetic field, performs as well as some coronal-loop forward-fitting models. For the loops as imaged by AIA 171 Å in the outskirts of the AR, all the codes including the potential field give comparable results of the mean misalignment angle (∼30°). Thus, further improvement of the codes is needed for a better reconstruction of the long loops enveloping the core region.

  18. 3D Equilibrium Effects Due to RMP Application on DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S. Lazerson, E. Lazarus, S. Hudson, N. Pablant and D. Gates

    2012-06-20

The mitigation and suppression of edge localized modes (ELMs) through application of resonant magnetic perturbations (RMPs) in Tokamak plasmas is a well documented phenomenon [1]. Vacuum calculations suggest the formation of edge islands and stochastic regions when RMPs are applied to the axisymmetric equilibria. Self-consistent calculations of the plasma equilibrium with the VMEC [2] and SPEC [3] codes have been performed for an up-down symmetric shot (142603) in DIII-D. In these codes, a self-consistent calculation of the plasma response due to the RMP coils is calculated. The VMEC code globally enforces the constraints of ideal MHD; consequently, a continuously nested family of flux surfaces is enforced throughout the plasma domain. This approach necessarily precludes the observation of islands or field-line chaos. The SPEC code relaxes the constraints of ideal MHD locally, and allows for islands and field line chaos at or near the rational surfaces. Equilibria with finite pressure gradients are approximated by a set of discrete "ideal-interfaces" at the most irrational flux surfaces and where the strongest pressure gradients are observed. Both the VMEC and SPEC calculations are initialized from EFIT reconstructions of the plasma that are consistent with the experimental pressure and current profiles. A 3D reconstruction using the STELLOPT code, which fits VMEC equilibria to experimental measurements, has also been performed. Comparisons between the equilibria generated by the 3D codes and between STELLOPT and EFIT are presented.

  20. An Overview of the Greyscales Lethality Assessment Methodology

    DTIC Science & Technology

    2011-01-01

    The code is capable of being incorporated into a variety of simulations, and has already been integrated into the Weapon Systems Division MECA and DUEL missile engagement simulations.

  1. A Detailed History of Intron-rich Eukaryotic Ancestors Inferred from a Global Survey of 100 Complete Genomes

    PubMed Central

    Csuros, Miklos; Rogozin, Igor B.; Koonin, Eugene V.

    2011-01-01

    Protein-coding genes in eukaryotes are interrupted by introns, but intron densities widely differ between eukaryotic lineages. Vertebrates, some invertebrates and green plants have intron-rich genes, with 6–7 introns per kilobase of coding sequence, whereas most of the other eukaryotes have intron-poor genes. We reconstructed the history of intron gain and loss using a probabilistic Markov model (Markov Chain Monte Carlo, MCMC) on 245 orthologous genes from 99 genomes representing three of the five supergroups of eukaryotes for which multiple genome sequences are available. Intron-rich ancestors are confidently reconstructed for each major group, with 53 to 74% of the human intron density inferred with 95% confidence for the Last Eukaryotic Common Ancestor (LECA). The results of the MCMC reconstruction are compared with the reconstructions obtained using Maximum Likelihood (ML) and Dollo parsimony methods. Excellent agreement between the MCMC and ML inferences is demonstrated, whereas Dollo parsimony introduces a noticeable bias into the estimates, typically yielding lower ancestral intron densities than MCMC and ML. Evolution of eukaryotic genes was dominated by intron loss, with substantial gain only at the bases of several major branches including plants and animals. The highest intron density, 120 to 130% of the human value, is inferred for the last common ancestor of animals. The reconstruction shows that the entire line of descent from LECA to mammals was intron-rich, a state conducive to the evolution of alternative splicing. PMID:21935348
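    The bias of Dollo parsimony mentioned here follows from its core constraint: each intron is gained exactly once, at the last common ancestor of the leaves that carry it, so every absence below that ancestor must be explained by a loss. A minimal sketch of that counting rule (the tree shape and species names are invented for illustration):

```python
def dollo_losses(tree, present):
    """Minimal number of losses under Dollo parsimony. `tree` is a nested
    tuple of leaf names; `present` is the set of leaves carrying the intron.
    The single gain is placed at the LCA of all carriers."""
    total = len(present)
    answer = [0]
    def walk(node):
        if isinstance(node, str):                       # leaf
            return (1 if node in present else 0), 0
        counts = [walk(child) for child in node]
        carriers = sum(c for c, _ in counts)
        if carriers == 0:
            return 0, 0
        # each carrier-free child subtree is one independent loss event
        losses = sum(loss if c > 0 else 1 for c, loss in counts)
        if carriers == total and all(c < total for c, _ in counts):
            answer[0] = losses                           # this node is the LCA
        return carriers, losses
    walk(tree)
    return answer[0]
```

    For example, with the intron in human, mouse and worm on the tree ((human, mouse), (fly, (worm, yeast))), the gain sits at the root and two losses (fly, yeast) are required.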

  2. Event Reconstruction in the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano

    2012-12-01

    The PANDA experiment will study the collisions of beams of anti-protons, with momenta ranging from 2 to 15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle identification algorithms are currently implemented using a Bayesian approach and compared to multivariate analysis methods. Moreover, the PANDA data acquisition foresees a triggerless operation in which events are not defined by a hardware first-level trigger decision; instead, all signals are stored with time stamps, requiring a deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the PANDA spectrometer will be reported, focusing on the performance of the tracking system and the results for the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.
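    The triggerless, time-ordered processing described above can be illustrated with a toy event builder that groups a time-stamped hit stream by coincidence gaps (the function and the window value are illustrative, not PandaRoot code):

```python
def build_events(timestamps, gap_ns=100.0):
    """Group a time-ordered stream of hit timestamps into event candidates:
    a gap larger than `gap_ns` between consecutive hits starts a new event."""
    events, current = [], [timestamps[0]]
    for t in timestamps[1:]:
        if t - current[-1] > gap_ns:
            events.append(current)
            current = [t]
        else:
            current.append(t)
    events.append(current)
    return events
```

    A real time-based reconstruction must additionally handle overlapping events and detector-dependent signal shapes, which is what makes the redesign challenging.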

  3. Recovering star formation histories: Integrated-light analyses vs. stellar colour-magnitude diagrams

    NASA Astrophysics Data System (ADS)

    Ruiz-Lara, T.; Pérez, I.; Gallart, C.; Alloin, D.; Monelli, M.; Koleva, M.; Pompei, E.; Beasley, M.; Sánchez-Blázquez, P.; Florido, E.; Aparicio, A.; Fleurence, E.; Hardy, E.; Hidalgo, S.; Raimann, D.

    2015-11-01

    Context. Accurate star formation histories (SFHs) of galaxies are fundamental for understanding the build-up of their stellar content. However, the most accurate SFHs - those obtained from colour-magnitude diagrams (CMDs) of resolved stars reaching the oldest main-sequence turnoffs (oMSTO) - are presently limited to a few systems in the Local Group. It is therefore crucial to determine the reliability and range of applicability of SFHs derived from integrated light spectroscopy, as this affects our understanding of unresolved galaxies from low to high redshift. Aims: We evaluate the reliability of current full spectral fitting techniques in deriving SFHs from integrated light spectroscopy by comparing SFHs from integrated spectra to those obtained from deep CMDs of resolved stars. Methods: We have obtained a high signal-to-noise (S/N ~ 36.3 per Å) integrated spectrum of a field in the bar of the Large Magellanic Cloud (LMC) using EFOSC2 at the 3.6-metre telescope at La Silla Observatory. For this same field, resolved stellar data reaching the oMSTO are available. We have compared the star formation rate (SFR) as a function of time and the age-metallicity relation (AMR) obtained from the integrated spectrum using STECKMAP, and the CMD using the IAC-star/MinnIAC/IAC-pop set of routines. For the sake of completeness we also use and discuss other synthesis codes (STARLIGHT and ULySS) to derive the SFR and AMR from the integrated LMC spectrum. Results: We find very good agreement (average differences ~4.1%) between the SFR (t) and the AMR obtained using STECKMAP on the integrated light spectrum, and the CMD analysis. STECKMAP minimizes the impact of the age-metallicity degeneracy and has the advantage of preferring smooth solutions to recover complex SFHs by means of a penalized χ2. 
We find that the use of single stellar populations (SSPs) to recover the stellar content, using for instance the STARLIGHT or ULySS codes, hampers the reconstruction of the SFR (t) and AMR shapes, yielding larger discrepancies with respect to the CMD results. These discrepancies can be reduced if spectral templates based on known and complex SFHs are employed rather than SSPs. Based on observations obtained at the 3.6 m ESO telescope on La Silla (Chile) and with the Hubble Space Telescope, operated by NASA. Appendices are available in electronic form at http://www.aanda.org
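    The full-spectral-fitting idea being compared here can be reduced to a toy linear model: the observed spectrum is a weighted sum of template spectra, and fitting recovers the weights. Real codes such as STECKMAP add regularization, non-negativity and kinematic broadening; all numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n_wave, n_ssp = 200, 4
ssp_spectra = rng.random((n_wave, n_ssp))    # one template spectrum per age bin
w_true = np.array([0.1, 0.4, 0.2, 0.3])      # fractional contribution of each SSP
observed = ssp_spectra @ w_true              # noiseless composite spectrum

# Full spectral fitting, reduced to its core: find the population weights
# whose combined template spectrum best matches the observed one.
w_fit, *_ = np.linalg.lstsq(ssp_spectra, observed, rcond=None)
```

    In the noiseless, well-conditioned case the weights are recovered exactly; the age-metallicity degeneracy discussed above arises because real SSP templates are far from orthogonal.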

  4. Tomographic reconstruction of tracer gas concentration profiles in a room with the use of a single OP-FTIR and two iterative algorithms: ART and PWLS.

    PubMed

    Park, D Y; Fessler, J A; Yost, M G; Levine, S P

    2000-03-01

    Computed tomographic (CT) reconstructions of air contaminant concentration fields were conducted in a room-sized chamber employing a single open-path Fourier transform infrared (OP-FTIR) instrument and a combination of 52 flat mirrors and 4 retroreflectors. A total of 56 beam path data were repeatedly collected for around 1 hr while maintaining a stable concentration gradient. The plane of the room was divided into 195 pixels (13 × 15) for reconstruction. The algebraic reconstruction technique (ART) failed to reconstruct the original concentration gradient patterns for most cases. These poor results were caused by the "highly underdetermined condition" in which the number of unknown values (195 pixels) exceeds that of known data (56 path integral concentrations) in the experimental setting. A new CT algorithm, called the penalized weighted least-squares (PWLS), was applied to remedy this condition. The peak locations were correctly positioned in the PWLS-CT reconstructions. A notable feature of the PWLS-CT reconstructions was a significant reduction of highly irregular noise peaks found in the ART-CT reconstructions. However, the peak heights were slightly reduced in the PWLS-CT reconstructions due to the nature of the PWLS algorithm. PWLS could converge on the original concentration gradient even when a fairly high error was embedded into some experimentally measured path integral concentrations. It was also found in the simulation tests that the PWLS algorithm was very robust with respect to random errors in the path integral concentrations. This beam geometry, used with a single scanning OP-FTIR system and the PWLS algorithm, is applicable to both environmental and industrial settings.

  6. NIMROD modeling of quiescent H-mode: Reconstruction considerations and saturation mechanism

    DOE PAGES

    King, Jacob R.; Burrell, Keith H.; Garofalo, Andrea M.; ...

    2016-09-30

    The extended-MHD NIMROD code (Sovinec and King 2010 J. Comput. Phys. 229 5803) models broadband-MHD activity from a reconstruction of a quiescent H-mode shot on the DIII-D tokamak (Luxon 2002 Nucl. Fusion 42 614). Computations with the reconstructed toroidal and poloidal ion flows exhibit low-nφ perturbations (nφ ≃ 1–5) that grow and saturate into a turbulent-like MHD state. The workflow used to project the reconstructed state onto the NIMROD basis functions re-solves the Grad–Shafranov equation and extrapolates profiles to include scrape-off-layer currents. In conclusion, evaluation of the transport from the turbulent-like MHD state leads to a relaxation of the density and temperature profiles.

  8. Path integration guided with a quality map for shape reconstruction in the fringe reflection technique

    NASA Astrophysics Data System (ADS)

    Jing, Xiaoli; Cheng, Haobo; Wen, Yongfu

    2018-04-01

    A new local integration algorithm called quality map path integration (QMPI) is reported for shape reconstruction in the fringe reflection technique. A quality map is proposed to evaluate the quality of the gradient data locally, and serves as a guide for the integration path. The presented method can be employed in wavefront estimation from its slopes over a generally shaped surface, with slope noise equivalent to that in practical measurements. Moreover, QMPI is much better at handling slope data with local noise, which may be caused by the irregular shapes of the surface under test. The performance of QMPI is assessed by simulations and experiment. It is shown that QMPI not only improves the accuracy of local integration, but can also be implemented easily with no iteration, in contrast to Southwell zonal reconstruction (SZR). From an engineering point of view, the proposed method may also provide an efficient and stable approach for different shapes with demanding precision requirements.
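    The idea of letting a quality map steer the integration path can be sketched with a greedy, heap-driven propagation: heights grow outward from the highest-quality pixel, always expanding the best-quality frontier pixel next. This is a simplified stand-in for QMPI; the trapezoidal update and the quality ordering are our assumptions:

```python
import heapq
import numpy as np

def integrate_slopes(gx, gy, quality):
    """Quality-guided integration of slope data on a regular grid.
    gx, gy: slope maps; quality: per-pixel reliability. Returns heights
    (defined up to a constant, anchored at the best-quality pixel)."""
    h, w = quality.shape
    z = np.full((h, w), np.nan)
    start = np.unravel_index(np.argmax(quality), quality.shape)
    z[start] = 0.0
    heap = [(-quality[start], start)]
    while heap:
        _, (i, j) = heapq.heappop(heap)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and np.isnan(z[ni, nj]):
                if di:   # vertical step: trapezoidal rule on the y-slopes
                    z[ni, nj] = z[i, j] + di * 0.5 * (gy[i, j] + gy[ni, nj])
                else:    # horizontal step: trapezoidal rule on the x-slopes
                    z[ni, nj] = z[i, j] + dj * 0.5 * (gx[i, j] + gx[ni, nj])
                heapq.heappush(heap, (-quality[ni, nj], (ni, nj)))
    return z
```

    Low-quality (noisy) pixels are visited last, so their errors cannot contaminate the paths through reliable regions, which is the intuition behind quality-guided integration.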

  9. Automatic alignment for three-dimensional tomographic reconstruction

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Tristan; Maretzke, Simon; Joost Batenburg, K.

    2018-02-01

    In tomographic reconstruction, the goal is to reconstruct an unknown object from a collection of line integrals. Given a complete sampling of such line integrals for various angles and directions, explicit inverse formulas exist to reconstruct the object. Given noisy and incomplete measurements, the inverse problem is typically solved through a regularized least-squares approach. A challenge for both approaches is that in practice the exact directions and offsets of the x-rays are only known approximately due to, e.g. calibration errors. Such errors lead to artifacts in the reconstructed image. In the case of sufficient sampling and geometrically simple misalignment, the measurements can be corrected by exploiting so-called consistency conditions. In other cases, such conditions may not apply and we have to solve an additional inverse problem to retrieve the angles and shifts. In this paper we propose a general algorithmic framework for retrieving these parameters in conjunction with an algebraic reconstruction technique. The proposed approach is illustrated by numerical examples for both simulated data and an electron tomography dataset.
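    The parameter-retrieval loop proposed here alternates between reconstructing the object and re-fitting the acquisition geometry. A 1D toy with unknown per-view offsets shows the structure (the shift model, grid search, and gauge fixing are drastic simplifications of the paper's framework):

```python
import numpy as np

n = 32
x_true = np.exp(-0.5 * ((np.arange(n) - 12.0) / 3.0) ** 2)   # unknown 1D object
true_shifts = [0, 3, -2, 6]                                   # unknown calibration offsets
views = [np.roll(x_true, s) for s in true_shifts]             # misaligned measurements

# Alternate between (1) reconstructing the object with the current shift
# estimates (averaging stands in for an algebraic reconstruction step) and
# (2) re-estimating each shift by grid search against the reconstruction.
shifts = [0] * len(views)
for outer in range(10):
    x_rec = np.mean([np.roll(v, -s) for v, s in zip(views, shifts)], axis=0)
    shifts = [min(range(-8, 9), key=lambda s: np.sum((np.roll(v, -s) - x_rec) ** 2))
              for v in views]
    shifts = [s - shifts[0] for s in shifts]   # fix the global-shift gauge to view 0
```

    Only relative offsets are identifiable (a global shift moves the whole reconstruction), hence the gauge-fixing step; the analogous degeneracies in tomography are the global rotations and translations discussed in the paper.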

  10. Pre-Service Teachers' Perception of Quick Response (QR) Code Integration in Classroom Activities

    ERIC Educational Resources Information Center

    Ali, Nagla; Santos, Ieda M.; Areepattamannil, Shaljan

    2017-01-01

    Quick Response (QR) codes have been discussed in the literature as adding value to teaching and learning. Despite their potential in education, more research is needed to inform practice and advance knowledge in this field. This paper investigated the integration of the QR code in classroom activities and the perceptions of the integration by…

  11. High Resolution Spatiotemporal Climate Reconstruction and Variability in East Asia during Little Ice Age

    NASA Astrophysics Data System (ADS)

    Lin, K. H. E.; Wang, P. K.; Lee, S. Y.; Liao, Y. C.; Fan, I. C.; Liao, H. M.

    2017-12-01

    The Little Ice Age (LIA) is one of the most prominent epochs in paleoclimate reconstruction of the Common Era. While the signals of the LIA were generally discovered across hemispheres, wide arrays of regional variability were found, and the reconstructed anomalies were sometimes inconsistent across studies using various proxy data or historical records. This inconsistency is mainly attributed to limited data coverage at fine resolution that can support high-resolution climate reconstruction of continuous spatiotemporal trends. The Qing dynasty of China (1644-1911 CE) existed during the coldest period of the LIA. Owing to a long-standing tradition that required local officials to record unusual social and meteorological events, thousands of local chronicles were left. Zhang eds. (2004) took two decades to compile all these meteorological records in a compendium, which we then digitized and coded into our REACHS database system for reconstructing climate. There were in total 1,435 points (sites) in our database for over 80,000 events in the period. After implementing a two-round coding check for data quality control (accuracy rate 87.2%), multiple indexes were retrieved for reconstructing annually and seasonally resolved temperature and precipitation series for North, Central, and South China. The reconstruction methods include frequency counting and grading, with multiple regression models used to test sensitivity and to calculate correlations among several reconstructed series. Validation was also conducted through comparison with instrumental data and with other reconstructed series from previous studies. Major research results reveal interannual (3-5 years), decadal (8-12 years), and interdecadal (≈30 years) variabilities with strong regional expressions across East China. The cooling effect was not homogeneously distributed in space and time. Flood and drought conditions repeated frequently, but the spatiotemporal pattern varied, indicating likely different climate regimes that can be linked to the dynamics of large-scale atmospheric circulation and the East Asian monsoon. Spatiotemporal analysis of extreme events such as typhoons and extreme droughts also indicated similar patterns. More detailed analyses are under way to explain the physical mechanisms that drive these changes.

  12. Three-dimensional image acquisition and reconstruction system on a mobile device based on computer-generated integral imaging.

    PubMed

    Erdenebat, Munkh-Uchral; Kim, Byeong-Jun; Piao, Yan-Ling; Park, Seo-Yeon; Kwon, Ki-Chul; Piao, Mei-Lan; Yoo, Kwan-Hee; Kim, Nam

    2017-10-01

    A mobile three-dimensional image acquisition and reconstruction system using a computer-generated integral imaging technique is proposed. A depth camera connected to the mobile device acquires the color and depth data of a real object simultaneously, and an elemental image array is generated from the object's original three-dimensional information, with the lens array specifications input into the mobile device. The three-dimensional visualization of the real object is reconstructed on the mobile display through optical or digital reconstruction methods. The proposed system is implemented successfully, and the experimental results confirm that the system is an effective and practical method of displaying real three-dimensional content on a mobile device.

  13. Does the graft-tunnel friction influence knee joint kinematics and biomechanics after anterior cruciate ligament reconstruction? A finite element study.

    PubMed

    Wan, Chao; Hao, Zhixiu

    2018-02-01

    Graft tissues within bone tunnels remain mobile for a long time after anterior cruciate ligament (ACL) reconstruction. However, whether graft-tunnel friction affects finite element (FE) simulation of the ACL reconstruction is still unclear. Four friction coefficients (from 0 to 0.3) were simulated in the ACL-reconstructed joint model, as well as two loading levels of anterior tibial drawer. The graft-tunnel friction did not affect joint kinematics or the maximal principal strain of the graft. By contrast, both the relative graft-tunnel motion and the equivalent strain in the bone tunnels were altered, which correspond to different processes of graft-tunnel integration and bone remodeling, respectively. This implies that the graft-tunnel friction should be defined properly when studying graft-tunnel integration or bone remodeling after ACL reconstruction using numerical simulation.

  14. SU-C-201-03: Coded Aperture Gamma-Ray Imaging Using Pixelated Semiconductor Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, S; Kaye, W; Jaworski, J

    2015-06-15

    Purpose: Improved localization of gamma-ray emissions from radiotracers is essential to the progress of nuclear medicine. Polaris is a portable, room-temperature-operated gamma-ray imaging spectrometer composed of two 3×3 arrays of thick CdZnTe (CZT) detectors, which detect gammas between 30keV and 3MeV with an energy resolution of <1% FWHM at 662keV. Compton imaging is used to map out source distributions in 4-pi space; however, it is only effective above 300keV, where Compton scattering is dominant. This work extends imaging to photoelectric energies (<300keV) using coded aperture imaging (CAI), which is essential for localization of Tc-99m (140keV). Methods: CAI, similar to the pinhole camera, relies on an attenuating mask, with open/closed elements, placed between the source and position-sensitive detectors. Partial attenuation of the source results in a "shadow" or count distribution that closely matches a portion of the mask pattern. Ideally, each source direction corresponds to a unique count distribution. Using backprojection reconstruction, the source direction is determined within the field of view. Knowledge of the 3D position of interaction results in improved image quality. Results: Using a single array of detectors, a coded aperture mask, and multiple Co-57 (122keV) point sources, image reconstruction is performed in real time, on an event-by-event basis, resulting in images with an angular resolution of ∼6 degrees. Although material nonuniformities contribute to image degradation, the superposition of images from individual detectors results in improved SNR. CAI was integrated with Compton imaging for a seamless transition between energy regimes. Conclusion: For the first time, CAI has been applied to thick, 3D position-sensitive CZT detectors. Real-time, combined CAI and Compton imaging is performed using two 3×3 detector arrays, resulting in a source distribution in space. This system has been commercialized by H3D, Inc. and is being acquired for various applications worldwide, including proton therapy imaging R&D.
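    The shadow-matching backprojection described under Methods reduces, in one dimension, to correlating the recorded counts with shifted copies of the mask. This is a toy model, not the Polaris reconstruction; the mask is a low-sidelobe Sidon-set pattern we chose for illustration:

```python
import numpy as np

n = 64
mask = np.zeros(n)
mask[[0, 1, 3, 7, 12, 20, 30, 44]] = 1.0    # open elements in a Sidon-set pattern
true_offset = 17                             # source direction -> shadow shift
counts = np.roll(mask, true_offset)          # ideal noiseless shadow on the detector

# Backprojection: correlate the recorded shadow with every candidate shift of
# the mask pattern; the best match marks the source direction.
scores = np.array([counts @ np.roll(mask, s) for s in range(n)])
estimate = int(np.argmax(scores))
```

    With a 2D mask and many detector pixels the same correlation becomes an image over source directions; the 3D interaction depth mentioned above further sharpens the effective mask pattern.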

  15. Comprehensive Identification of Long Non-coding RNAs in Purified Cell Types from the Brain Reveals Functional LncRNA in OPC Fate Determination.

    PubMed

    Dong, Xiaomin; Chen, Kenian; Cuevas-Diaz Duran, Raquel; You, Yanan; Sloan, Steven A; Zhang, Ye; Zong, Shan; Cao, Qilin; Barres, Ben A; Wu, Jia Qian

    2015-12-01

    Long non-coding RNAs (lncRNAs) (> 200 bp) play crucial roles in transcriptional regulation during numerous biological processes. However, it is challenging to comprehensively identify lncRNAs, because they are often expressed at low levels and with more cell-type specificity than are protein-coding genes. In the present study, we performed ab initio transcriptome reconstruction using eight purified cell populations from mouse cortex and detected more than 5000 lncRNAs. Predicting the functions of lncRNAs using cell-type specific data revealed their potential functional roles in Central Nervous System (CNS) development. We performed motif searches in ENCODE DNase I digital footprint data and Mouse ENCODE promoters to infer transcription factor (TF) occupancy. By integrating TF binding and cell-type specific transcriptomic data, we constructed a novel framework that is useful for systematically identifying lncRNAs that are potentially essential for brain cell fate determination. Based on this integrative analysis, we identified lncRNAs that are regulated during Oligodendrocyte Precursor Cell (OPC) differentiation from Neural Stem Cells (NSCs) and that are likely to be involved in oligodendrogenesis. The top candidate, lnc-OPC, shows highly specific expression in OPCs and remarkable sequence conservation among placental mammals. Interestingly, lnc-OPC is significantly up-regulated in glial progenitors from experimental autoimmune encephalomyelitis (EAE) mouse models compared to wild-type mice. OLIG2-binding sites in the upstream regulatory region of lnc-OPC were identified by ChIP (chromatin immunoprecipitation)-Sequencing and validated by luciferase assays. Loss-of-function experiments confirmed that lnc-OPC plays a functional role in OPC genesis. Overall, our results substantiated the role of lncRNA in OPC fate determination and provided an unprecedented data source for future functional investigations in CNS cell types. 
We present our datasets and analysis results via the interactive genome browser at our laboratory website that is freely accessible to the research community. This is the first lncRNA expression database of collective populations of glia, vascular cells, and neurons. We anticipate that these studies will advance the knowledge of this major class of non-coding genes and their potential roles in neurological development and diseases.

  16. Integral imaging based light field display with enhanced viewing resolution using holographic diffuser

    NASA Astrophysics Data System (ADS)

    Yan, Zhiqiang; Yan, Xingpeng; Jiang, Xiaoyu; Gao, Hui; Wen, Jun

    2017-11-01

    An integral imaging based light field display method is proposed by use of holographic diffuser, and enhanced viewing resolution is gained over conventional integral imaging systems. The holographic diffuser is fabricated with controlled diffusion characteristics, which interpolates the discrete light field of the reconstructed points to approximate the original light field. The viewing resolution can thus be improved and independent of the limitation imposed by Nyquist sampling frequency. An integral imaging system with low Nyquist sampling frequency is constructed, and reconstructed scenes of high viewing resolution using holographic diffuser are demonstrated, verifying the feasibility of the method.

  17. 42 CFR 73.3 - HHS select agents and toxins.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... replication competent forms of the 1918 pandemic influenza virus containing any portion of the coding regions of all eight gene segments (Reconstructed 1918 Influenza virus) Ricin Rickettsia prowazekii SARS...

  18. 42 CFR 73.3 - HHS select agents and toxins.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... replication competent forms of the 1918 pandemic influenza virus containing any portion of the coding regions of all eight gene segments (Reconstructed 1918 Influenza virus) Ricin Rickettsia prowazekii SARS...

  19. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures.

    PubMed

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability.
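    The reconstructions being accelerated here are iterative sparse solvers. A minimal CPU sketch of one such algorithm, ISTA (a gradient step on an l2 data term followed by soft-thresholding, the proximal operator of the l1 penalty), is shown below; the dimensions and penalty are invented, and real MRI reconstructions use Fourier-domain sampling and wavelet sparsity rather than this toy setup:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 128, 64, 5                      # signal length, measurements, sparsity
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.uniform(2.0, 4.0, size=k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                  # undersampled, noiseless data

# ISTA: gradient step on 0.5*||y - Ax||^2, then soft-threshold to promote
# sparsity (penalty lam*||x||_1).
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for it in range(20000):
    x = x + step * (A.T @ (y - A @ x))
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)
```

    Every iteration is dominated by two dense matrix-vector products, which is exactly the memory-bandwidth-bound workload that GPUs and wide-SIMD processors accelerate so effectively.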

  20. Microsurgical Chest Wall Reconstruction After Oncologic Resections

    PubMed Central

    Sauerbier, Michael; Dittler, S.; Kreutzer, C.

    2011-01-01

    Defect reconstruction after radical oncologic resection of malignant chest wall tumors requires adequate soft tissue reconstruction with function, stability, integrity, and an aesthetically acceptable result of the chest wall. The purpose of this article is to describe possible reconstructive microsurgical pathways after full-thickness oncologic resections of the chest wall. Several reliable free flaps are described, and morbidity and mortality rates of patients are discussed. PMID:22294944

  1. Fully-Implicit Reconstructed Discontinuous Galerkin Method for Stiff Multiphysics Problems

    NASA Astrophysics Data System (ADS)

    Nourgaliev, Robert

    2015-11-01

    A new reconstructed Discontinuous Galerkin (rDG) method, based on orthogonal basis/test functions, is developed for fluid flows on unstructured meshes. Orthogonality of the basis functions is essential for enabling robust and efficient fully-implicit Newton-Krylov-based time integration. The method is designed for generic partial differential equations, including transient, hyperbolic, parabolic or elliptic operators, which arise in many multiphysics problems. We demonstrate the method's capabilities for solving compressible fluid-solid systems (in the low Mach number limit), with phase change (melting/solidification), as motivated by applications in Additive Manufacturing. We focus on the method's accuracy (in both space and time), as well as robustness and solvability of the system of linear equations involved in the linearization steps of Newton-based methods. The performance of the developed method is investigated for highly-stiff problems with melting/solidification, emphasizing the advantages of tight coupling of the mass, momentum and energy conservation equations, as well as orthogonality of the basis functions, which leads to better conditioning of the underlying (approximate) Jacobian matrices and rapid convergence of the Krylov-based linear solver. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, and funded by the LDRD at LLNL under project tracking code 13-SI-002.
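    The value of fully-implicit time integration for stiff operators can be seen in a minimal backward-Euler step solved with Newton's method (a generic stiff ODE, not the rDG discretization itself; for large systems the linear solve would be replaced by a Krylov method):

```python
import numpy as np

# Backward (implicit) Euler on a stiff linear ODE y' = A y, solving the
# nonlinear residual R(y_new) = y_new - y_old - dt*f(y_new) = 0 with Newton.
A = np.array([[-1000.0, 0.0], [0.0, -1.0]])   # widely separated time scales

def f(y):
    return A @ y

def implicit_euler_step(y_old, dt):
    y = y_old.copy()
    for _ in range(20):                        # Newton iterations
        residual = y - y_old - dt * f(y)
        jacobian = np.eye(2) - dt * A          # dR/dy
        delta = np.linalg.solve(jacobian, -residual)
        y = y + delta
        if np.linalg.norm(delta) < 1e-12:
            break
    return y

y = np.array([1.0, 1.0])
dt = 0.1                                       # far beyond the explicit stability limit
for _ in range(50):
    y = implicit_euler_step(y, dt)
```

    An explicit scheme at this step size would blow up on the fast mode; the implicit solve damps both modes stably, at the cost of the Jacobian solve whose conditioning the orthogonal basis is designed to improve.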

  2. Mitochondrial analysis of a Byzantine population reveals the differential impact of multiple historical events in South Anatolia

    PubMed Central

    Ottoni, Claudio; Ricaut, François-X; Vanderheyden, Nancy; Brucato, Nicolas; Waelkens, Marc; Decorte, Ronny

    2011-01-01

    The archaeological site of Sagalassos is located in Southwest Turkey, in the western part of the Taurus mountain range. Human occupation of its territory is attested from the late 12th millennium BP up to the 13th century AD. By analysing the mtDNA variation in 85 skeletons from Sagalassos dated to the 11th–13th century AD, this study attempts to reconstruct the genetic signature potentially left in this region of Anatolia by the many civilizations that succeeded one another over the centuries until the mid-Byzantine period (13th century AD). Authentic ancient DNA data were determined from the control region and some SNPs in the coding region of the mtDNA in 53 individuals. Comparative analyses with up to 157 modern populations allowed us to reconstruct the origin of the mid-Byzantine people still dwelling in dispersed hamlets in Sagalassos, and to detect the maternal contribution of their potential ancestors. By integrating the genetic data with historical and archaeological information, we were able to attest in Sagalassos a significant maternal genetic signature of Balkan/Greek populations, as well as of ancient Persians and populations from the Italian peninsula. Some contribution from the Levant has also been detected, whereas no contribution from Central Asian populations could be ascertained. PMID:21224890

  3. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results When compared with the best individual prediction source, late data integration leads to improvements in predictive power (e.g., the overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength differs strongly between medical specialties, for both ICD-9-CM diagnostic and procedural codes. Discussion Structured data provide complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This is captured most effectively by the proposed late data integration approach. Conclusions We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
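    The late-integration scheme described above can be sketched in a few lines of scikit-learn: one base model per source, with a meta-learner stacked on their predicted probabilities. The data here are synthetic stand-ins, not the paper's clinical records, and a production version would train the meta-learner on out-of-fold base predictions to avoid leakage.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Two synthetic "sources" carrying weak, complementary signal about y.
    rng = np.random.default_rng(0)
    n = 600
    y = rng.integers(0, 2, n)
    X_struct = y[:, None] + rng.normal(0, 2.0, (n, 5))    # "structured" source
    X_text = y[:, None] + rng.normal(0, 2.0, (n, 20))     # "textual" source

    (Xs_tr, Xs_te, Xt_tr, Xt_te,
     y_tr, y_te) = train_test_split(X_struct, X_text, y, random_state=0)

    # Stage 1: a separate model per data source.
    base_s = LogisticRegression().fit(Xs_tr, y_tr)
    base_t = LogisticRegression().fit(Xt_tr, y_tr)

    # Stage 2: meta-learner on the base models' probability outputs.
    meta_tr = np.column_stack([base_s.predict_proba(Xs_tr)[:, 1],
                               base_t.predict_proba(Xt_tr)[:, 1]])
    meta = LogisticRegression().fit(meta_tr, y_tr)

    meta_te = np.column_stack([base_s.predict_proba(Xs_te)[:, 1],
                               base_t.predict_proba(Xt_te)[:, 1]])
    print("stacked accuracy:", meta.score(meta_te, y_te))
    ```

    Early integration would instead concatenate the two feature blocks and fit a single model on the combined matrix.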

  4. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code, TALINK (Transient Analysis code LINKage program), used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition, the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' loss-of-offsite-power fault transient.

  5. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    PubMed

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been proposed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the images reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.

  6. Pediatric primary care psychologists' reported level of integration, billing practices, and reimbursement frequency.

    PubMed

    Riley, Andrew R; Grennan, Allison; Menousek, Kathryn; Hoffses, Kathryn W

    2018-03-01

    Integration of psychological services into pediatric primary care is increasingly common, but models of integration vary with regard to their level of coordination, colocation, and integration. High-integration models may provide some distinct advantages, such as preventative care and brief consultation for subclinical behavior concerns; however, psychologists face barriers to seeking reimbursement for these services. Alternatives to traditional psychotherapy and psychological testing codes, specifically Health & Behavior (H&B) codes, have been proposed as 1 method for supporting integrated care. The aim of this study was to investigate the relationships between psychologists' reported billing practices, reimbursement rates, and model of integration in pediatric primary care. As part of a larger survey study, 55 psychologists working in pediatric primary care reported on characteristics of their practice's model of integration, billing practices, and frequency of reimbursement for consultative services. Compared with those who categorized their integrated care model as colocated, psychologists who endorsed working in integrated models reported a significantly higher usage of H&B codes and more frequent reimbursement for consultations. Overall, use of H&B codes was associated with higher reported levels of coordination and integration. Survey results showed a clear pattern of higher integration being associated with greater utilization of H&B codes and better reimbursement for consultation activities. These results underscore the importance of establishing and maintaining billing and reimbursement systems that adequately support integrated care. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Reconstruction of Bulk Operators within the Entanglement Wedge in Gauge-Gravity Duality

    NASA Astrophysics Data System (ADS)

    Dong, Xi; Harlow, Daniel; Wall, Aron C.

    2016-07-01

    In this Letter we prove a simple theorem in quantum information theory, which implies that bulk operators in the anti-de Sitter/conformal field theory (AdS/CFT) correspondence can be reconstructed as CFT operators in a spatial subregion A , provided that they lie in its entanglement wedge. This is an improvement on existing reconstruction methods, which have at most succeeded in the smaller causal wedge. The proof is a combination of the recent work of Jafferis, Lewkowycz, Maldacena, and Suh on the quantum relative entropy of a CFT subregion with earlier ideas interpreting the correspondence as a quantum error correcting code.

  8. Reconstruction of Bulk Operators within the Entanglement Wedge in Gauge-Gravity Duality.

    PubMed

    Dong, Xi; Harlow, Daniel; Wall, Aron C

    2016-07-08

    In this Letter we prove a simple theorem in quantum information theory, which implies that bulk operators in the anti-de Sitter/conformal field theory (AdS/CFT) correspondence can be reconstructed as CFT operators in a spatial subregion A, provided that they lie in its entanglement wedge. This is an improvement on existing reconstruction methods, which have at most succeeded in the smaller causal wedge. The proof is a combination of the recent work of Jafferis, Lewkowycz, Maldacena, and Suh on the quantum relative entropy of a CFT subregion with earlier ideas interpreting the correspondence as a quantum error correcting code.

  9. Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.

    1994-02-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses that individuals may have received from operations at the Hanford Site since 1944. This report deals specifically with the atmospheric transport model, Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET). RATCHET is a major rework of the MESOILT2 model used in the first phase of the HEDR Project; only the bookkeeping framework escaped major changes. Changes to the code include (1) significant changes in the representation of atmospheric processes and (2) incorporation of Monte Carlo methods for representing uncertainty in input data, model parameters, and coefficients. To a large extent, the revisions to the model are based on recommendations of a peer working group that met in March 1991. Technical bases for other portions of the atmospheric transport model are addressed in two other documents. This report has three major sections: a description of the model, a user's guide, and a programmer's guide. These sections discuss RATCHET from three different perspectives. The first provides a technical description of the code with emphasis on details such as the representation of the model domain, the data required by the model, and the equations used to make the model calculations. The technical description is followed by a user's guide to the model with emphasis on running the code. The user's guide contains information about the model input and output. The third section is a programmer's guide to the code. It discusses the hardware and software required to run the code. The programmer's guide also discusses program structure and each of the program elements.

  10. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE PAGES

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    2018-02-21

    Reconstruction is an inverse process in which a parameter space is searched to locate the set of parameters with the highest probability of describing the experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters carries some associated uncertainty, which in turn leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole-shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole-shot reconstruction results over a time interval are used to validate the propagated uncertainty from a single time slice.
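    The signal-to-parameter propagation and its validation against random variation can be illustrated on a toy linear least-squares fit (plain NumPy, not V3FIT's physics): the propagated covariance (G^T Sigma^-1 G)^-1 should match the spread of parameters refit over many noisy realizations.

    ```python
    import numpy as np

    # Toy "diagnostic" model: signals d = G @ p plus Gaussian noise.
    rng = np.random.default_rng(1)
    G = rng.normal(size=(40, 3))        # signal response matrix (assumed)
    p_true = np.array([1.0, -0.5, 2.0])
    sigma_d = 0.05                      # per-signal measurement uncertainty

    # Propagated parameter covariance: C_p = (G^T Sigma^-1 G)^-1.
    C_p = np.linalg.inv(G.T @ G / sigma_d**2)

    # Monte Carlo check: refit many noisy realizations, compare spreads.
    fits = []
    for _ in range(4000):
        d = G @ p_true + rng.normal(0, sigma_d, 40)
        fits.append(np.linalg.lstsq(G, d, rcond=None)[0])
    sample_cov = np.cov(np.array(fits).T)

    print(np.allclose(np.sqrt(np.diag(C_p)),
                      np.sqrt(np.diag(sample_cov)), rtol=0.1))
    ```

    The abstract's whole-shot validation plays the same role as the Monte Carlo loop here: repeated reconstructions stand in for random sampling of the measurement noise.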

  11. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    Reconstruction is an inverse process in which a parameter space is searched to locate the set of parameters with the highest probability of describing the experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters carries some associated uncertainty, which in turn leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole-shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole-shot reconstruction results over a time interval are used to validate the propagated uncertainty from a single time slice.

  12. 140 GHz EC waves propagation and absorption for normal/oblique injection on FTU tokamak

    NASA Astrophysics Data System (ADS)

    Nowak, S.; Airoldi, A.; Bruschi, A.; Buratti, P.; Cirant, S.; Gandini, F.; Granucci, G.; Lazzaro, E.; Panaccione, L.; Ramponi, G.; Simonetto, A.; Sozzi, C.; Tudisco, O.; Zerbini, M.

    1999-09-01

    Most of the interest in ECRH experiments is linked to the high localization of EC wave absorption in well-known portions of the plasma volume. In order to take full advantage of this capability, a reliable code has been developed for beam tracing and absorption calculations. The code is particularly important for oblique (poloidal and toroidal) injection, when the absorbing layer does not simply depend on the position of the EC resonance alone. An experimental estimate of the local heating power density is given by the jump in the time derivative of the local electron pressure at switch-on of the gyrotron power. The evolution of the temperature profile increase (from the ECE polychromator) during the nearly adiabatic phase is also considered for ECRH profile reconstruction. An indirect estimate of the optical thickness and of the overall absorption coefficient is obtained by measuring the residual e.m. power at the tokamak walls. Beam tracing code predictions of the power deposition profile are compared with experimental estimates. The impact of the finite spatial resolution of the temperature diagnostic on profile reconstruction is also discussed.
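    The switch-on estimate mentioned above reduces, in the nearly adiabatic phase, to the absorbed power density following the jump in the electron-pressure time derivative, p_abs ≈ (3/2) n_e k dTe/dt. A back-of-the-envelope sketch with illustrative (not FTU) plasma values:

    ```python
    # Local absorbed ECRH power density from the jump in d/dt of the
    # electron pressure at gyrotron switch-on. Both numbers below are
    # assumed, order-of-magnitude inputs, not measured FTU data.
    e = 1.602e-19        # J per eV
    n_e = 8e19           # electron density [m^-3], assumed
    dTe_dt = 5.0e3       # electron temperature rise at switch-on [eV/s], assumed

    p_abs = 1.5 * n_e * e * dTe_dt
    print(f"local absorbed power density ~ {p_abs:.2e} W/m^3")
    ```

    With these inputs the estimate is of order 10^5 W/m^3; the beam-tracing code's deposition profile is what this single-point estimate is compared against.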

  13. Sparse Coding for N-Gram Feature Extraction and Training for File Fragment Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Felix; Quach, Tu-Thach; Wheeler, Jason

    File fragment classification is an important step in the task of file carving in digital forensics. In file carving, files must be reconstructed based on their content as a result of their fragmented storage on disk or in memory. Existing methods for classification of file fragments typically use hand-engineered features such as byte histograms or entropy measures. In this paper, we propose an approach using sparse coding that enables automated feature extraction. Sparse coding, or sparse dictionary learning, is an unsupervised learning algorithm, and is capable of extracting features based simply on how well those features can be used to reconstruct the original data. With respect to file fragments, we learn sparse dictionaries for n-grams, continuous sequences of bytes, of different sizes. These dictionaries may then be used to estimate n-gram frequencies for a given file fragment, but for significantly larger n-gram sizes than are typically found in existing methods, which suffer from combinatorial explosion. To demonstrate the capability of our sparse coding approach, we used the resulting features to train standard classifiers such as support vector machines (SVMs) over multiple file types. Experimentally, we achieved significantly better classification results with respect to existing methods, especially when the features were used to supplement existing hand-engineered features.
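    A minimal sketch of this pipeline, with synthetic bytes and toy sizes standing in for real fragments: slide an n-byte window over a fragment, learn a sparse dictionary of n-gram "atoms", and use the code activations as classifier features. The specific estimator and parameters below are illustrative choices, not the paper's implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.default_rng(0)
    n, n_atoms = 8, 16                 # toy n-gram size and dictionary size

    def ngrams(fragment, n):
        # Overlapping n-byte windows, scaled to [0, 1].
        a = np.frombuffer(fragment, dtype=np.uint8).astype(float) / 255.0
        return np.lib.stride_tricks.sliding_window_view(a, n)

    # "Training corpus" of bytes; real data would come from labeled files.
    train_bytes = rng.integers(0, 256, 4096, dtype=np.uint8).tobytes()
    dico = MiniBatchDictionaryLearning(n_components=n_atoms,
                                       transform_algorithm="omp",
                                       transform_n_nonzero_coefs=3,
                                       random_state=0)
    dico.fit(ngrams(train_bytes, n))

    # Feature vector for one fragment: mean absolute activation per atom.
    frag = rng.integers(0, 256, 512, dtype=np.uint8).tobytes()
    codes = dico.transform(ngrams(frag, n))
    features = np.abs(codes).mean(axis=0)
    print(features.shape)              # one feature per dictionary atom
    ```

    The fixed-length feature vector is what would be fed to an SVM, alone or concatenated with hand-engineered features such as byte histograms.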

  14. Toward enhancing the distributed video coder under a multiview video codec framework

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua

    2016-11-01

    The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance the side information (SI) and improve the MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images; (2) the block transform coefficient properties, i.e., DCs and ACs, were exploited to design the priority rate control for the turbo code, such that DVC decoding can be carried out with the fewest parity bits. The proposed COMPETE method demonstrated lower time complexity while yielding better reconstructed video quality. Simulations show that COMPETE reduces the time complexity of MVME by a factor of 1.29 to 2.56 compared with previous hybrid MVME methods, while improving the peak signal-to-noise ratio (PSNR) of decoded video by 0.2 to 3.5 dB compared with H.264/AVC intracoding.

  15. Sparse Coding for N-Gram Feature Extraction and Training for File Fragment Classification

    DOE PAGES

    Wang, Felix; Quach, Tu-Thach; Wheeler, Jason; ...

    2018-04-05

    File fragment classification is an important step in the task of file carving in digital forensics. In file carving, files must be reconstructed based on their content as a result of their fragmented storage on disk or in memory. Existing methods for classification of file fragments typically use hand-engineered features such as byte histograms or entropy measures. In this paper, we propose an approach using sparse coding that enables automated feature extraction. Sparse coding, or sparse dictionary learning, is an unsupervised learning algorithm, and is capable of extracting features based simply on how well those features can be used to reconstruct the original data. With respect to file fragments, we learn sparse dictionaries for n-grams, continuous sequences of bytes, of different sizes. These dictionaries may then be used to estimate n-gram frequencies for a given file fragment, but for significantly larger n-gram sizes than are typically found in existing methods, which suffer from combinatorial explosion. To demonstrate the capability of our sparse coding approach, we used the resulting features to train standard classifiers such as support vector machines (SVMs) over multiple file types. Experimentally, we achieved significantly better classification results with respect to existing methods, especially when the features were used to supplement existing hand-engineered features.

  16. Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency.

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-05-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on a Riemannian manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary drawn from the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, an improvement of the initial result is achieved by calculating reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms the state-of-the-art methods in terms of precision, recall, and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
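    The first-stage idea, reconstruction error against a background dictionary as saliency, can be sketched with plain feature vectors standing in for the paper's Log-Euclidean-kernelized covariance descriptors: regions similar to the border-derived dictionary reconstruct well and score low, while an outlier region scores high.

    ```python
    import numpy as np
    from sklearn.decomposition import SparseCoder

    rng = np.random.default_rng(0)
    # "Background" features sampled from image-border regions (synthetic here),
    # row-normalized to serve as dictionary atoms for OMP.
    background = rng.normal(0, 1, (20, 10))
    D = background / np.linalg.norm(background, axis=1, keepdims=True)

    # Five background-like regions plus one strongly dissimilar ("salient") one.
    regions = np.vstack([background[:5] + rng.normal(0, 0.05, (5, 10)),
                         10 + rng.normal(0, 0.05, (1, 10))])

    coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5)
    codes = coder.transform(regions)
    saliency = np.linalg.norm(regions - codes @ D, axis=1)
    print(saliency.argmax())   # the dissimilar region gets the largest error
    ```

    The second stage repeats the same coding step against a foreground dictionary harvested from the high-saliency regions of the first-stage map.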

  17. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  18. Calculation of Eddy Currents In the CTH Vacuum Vessel and Coil Frame

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Zolfaghari, A. Brooks, A. Michaels, J. Hanson, and G. Hartwell

    2012-09-25

    Knowledge of eddy currents in the vacuum vessel walls and nearby conducting support structures can significantly contribute to the accuracy of magnetohydrodynamic (MHD) equilibrium reconstruction in toroidal plasmas. Moreover, the magnetic fields produced by the eddy currents could generate error fields that may give rise to islands at rational surfaces or cause field lines to become chaotic. In the Compact Toroidal Hybrid (CTH) device (R0 = 0.75 m, a = 0.29 m, B ≤ 0.7 T), the primary driver of the eddy currents during the plasma discharge is the changing flux of the ohmic heating transformer. Electromagnetic simulations are used to calculate eddy current paths and profiles in the vacuum vessel and in the coil frame pieces with known time-dependent currents in the ohmic heating coils. The MAXWELL and SPARK codes were used for the electromagnetic modeling and simulation. The MAXWELL code was used for detailed 3D finite-element analysis of the eddy currents in the structures. The SPARK code was used to calculate the eddy currents in the structures as modeled with shell/surface elements, with each element representing a current loop. In both cases, current filaments representing the eddy currents were prepared for input into the VMEC code for MHD equilibrium reconstruction of the plasma discharge.

  19. Investigating the Use of the Intel Xeon Phi for Event Reconstruction

    NASA Astrophysics Data System (ADS)

    Sherman, Keegan; Gilfoyle, Gerard

    2014-09-01

    The physics goal of Jefferson Lab is to understand how quarks and gluons form nuclei, and it is being upgraded to a higher, 12-GeV beam energy. The new CLAS12 detector in Hall B will collect 5-10 terabytes of data per day and will require considerable computing resources. We are investigating tools, such as the Intel Xeon Phi, to speed up the event reconstruction. The Kalman Filter is one of the methods being studied. It is a linear algebra algorithm that estimates the state of a system by combining existing data and predictions of those measurements. The tools required to apply this technique (i.e., matrix multiplication, matrix inversion) are being written using C++ intrinsics for Intel's Xeon Phi Coprocessor, which uses the Many Integrated Cores (MIC) architecture. The Intel MIC is a new high-performance chip that connects to a host machine through the PCIe bus and is built to run highly vectorized and parallelized code, making it a well-suited device for applications such as the Kalman Filter. Our tests of the MIC-optimized algorithms needed for the filter show significant increases in speed. For example, multiplication of 5x5 matrices on the MIC ran up to 69 times faster than on the host core. Work supported by the University of Richmond and the US Department of Energy.
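    The abstract's target code is C++ intrinsics, but the Kalman filter's predict/update structure, and the small matrix products being vectorized, can be shown in a minimal NumPy sketch. The 1-D constant-velocity track below is a toy stand-in for CLAS12 track fitting, not the experiment's actual state model.

    ```python
    import numpy as np

    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])               # we measure position only
    Q = 1e-4 * np.eye(2)                     # process noise covariance
    R = np.array([[0.25]])                   # measurement noise covariance

    rng = np.random.default_rng(0)
    truth = 0.5 * np.arange(50)              # true positions, velocity = 0.5
    z = truth + rng.normal(0, 0.5, 50)       # noisy position measurements

    x, P = np.zeros(2), np.eye(2)            # initial state and covariance
    for zk in z:
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        S = H @ P @ H.T + R                           # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
        x = x + K @ (np.array([zk]) - H @ x)          # update state
        P = (np.eye(2) - K @ H) @ P                   # update covariance

    print("estimated velocity:", x[1])       # converges toward the true 0.5
    ```

    Every step is a handful of small dense matrix products and one inversion, which is exactly why batched 5x5 multiply/invert kernels dominate the MIC benchmarks quoted above.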

  20. Matrix factorization-based data fusion for the prediction of lncRNA-disease associations.

    PubMed

    Fu, Guangyuan; Wang, Jun; Domeniconi, Carlotta; Yu, Guoxian

    2018-05-01

    Long non-coding RNAs (lncRNAs) play crucial roles in complex disease diagnosis, prognosis, prevention and treatment, but only a small portion of lncRNA-disease associations have been experimentally verified. Various computational models have been proposed to identify lncRNA-disease associations by integrating heterogeneous data sources. However, existing models generally ignore the intrinsic structure of data sources or treat them as equally relevant, while they may not be. To accurately identify lncRNA-disease associations, we propose a Matrix Factorization based LncRNA-Disease Association prediction model (MFLDA in short). MFLDA decomposes data matrices of heterogeneous data sources into low-rank matrices via matrix tri-factorization to explore and exploit their intrinsic and shared structure. MFLDA can select and integrate the data sources by assigning different weights to them. An iterative solution is further introduced to simultaneously optimize the weights and low-rank matrices. Next, MFLDA uses the optimized low-rank matrices to reconstruct the lncRNA-disease association matrix and thus to identify potential associations. In 5-fold cross validation experiments to identify verified lncRNA-disease associations, MFLDA achieves an area under the receiver operating characteristic curve (AUC) of 0.7408, at least 3% higher than those given by state-of-the-art data fusion based computational models. An empirical study on identifying masked lncRNA-disease associations again shows that MFLDA can identify potential associations more accurately than competing models. A case study on identifying lncRNAs associated with breast, lung and stomach cancers shows that 38 out of 45 (84%) associations predicted by MFLDA are supported by recent biomedical literature, further proving the capability of MFLDA to identify novel lncRNA-disease associations. MFLDA is a general data fusion framework, and as such it can be adopted to predict associations between other biological entities. The source code for MFLDA is available at http://mlda.swu.edu.cn/codes.php?name=MFLDA. Contact: gxyu@swu.edu.cn. Supplementary data are available at Bioinformatics online.
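    The core intuition, recovering a masked association from a low-rank reconstruction, can be illustrated without MFLDA's weighted multi-source tri-factorization. The single-matrix hard-impute loop below (truncated SVD with re-imputation) only shows why a hidden entry of a structured, low-rank association matrix is recoverable; it is not the paper's algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # A synthetic rank-3 "lncRNA x disease" association matrix.
    A = rng.random((30, 3)) @ rng.random((3, 20))

    mask = np.ones_like(A, dtype=bool)
    mask[4, 7] = False                          # hide one known association
    X = np.where(mask, A, 0.0)

    for _ in range(50):
        u, s, vt = np.linalg.svd(X)
        A_hat = (u[:, :3] * s[:3]) @ vt[:3]     # rank-3 reconstruction
        X = np.where(mask, X, A_hat)            # keep observed, re-impute masked

    print("true:", round(A[4, 7], 3), "recovered:", round(A_hat[4, 7], 3))
    ```

    MFLDA extends this completion idea by jointly factorizing several data sources and learning a weight per source, so that informative sources dominate the reconstruction.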

  1. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Changho; Yang, Won Sik

    This paper presents the methods and performance of the MC2 -3 code, which is a multigroup cross-section generation code for fast reactor analysis, developed to improve the resonance self-shielding and spectrum calculation methods of MC2 -2 and to simplify the current multistep schemes generating region-dependent broad-group cross sections. Using the basic neutron data from ENDF/B data files, MC2 -3 solves the consistent P1 multigroup transport equation to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved in ultrafine (2082) or hyperfine (~400more » 000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified temperatures. The pointwise cross sections are directly used in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for a two-dimensional whole-core problem to generate region-dependent broad-group cross sections. Verification tests have been performed using the benchmark problems for various fast critical experiments including Los Alamos National Laboratory critical assemblies; Zero-Power Reactor, Zero-Power Physics Reactor, and Bundesamt für Strahlenschutz experiments; Monju start-up core; and Advanced Burner Test Reactor. 
Verification and validation results with ENDF/B-VII.0 data indicated that eigenvalues from MC2-3/DIF3D agreed with Monte Carlo N-Particle (MCNP5) and VIM Monte Carlo solutions to within 200 pcm, and that region-wise one-group fluxes were in good agreement with the Monte Carlo solutions.
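    The narrow-resonance self-shielding step described above can be illustrated with a toy calculation (all numbers are invented for illustration; this is not MC2-3's implementation): the pointwise absorption cross section is averaged over a group with an NR weighting flux phi(E) ~ 1/(sigma_t(E)·E), and compared with the infinite-dilution average.

```python
import numpy as np

# Hypothetical pointwise data: one resonance on a flat background,
# on a uniform energy grid (all values illustrative)
E = np.linspace(1.0, 100.0, 20001)                        # energy grid (eV)
sigma_a = 1.0 + 5000.0 / (1.0 + ((E - 50.0) / 0.5) ** 2)  # absorber, resonance at 50 eV
sigma_t = sigma_a + 50.0                                  # total incl. background dilution

group = (E >= 40.0) & (E <= 60.0)                         # one "ultrafine group"

# Narrow-resonance weighting flux phi(E) ~ 1 / (sigma_t(E) * E):
# the flux dips where the total cross section peaks
phi_nr = 1.0 / (sigma_t * E)
sigma_g = (sigma_a[group] * phi_nr[group]).sum() / phi_nr[group].sum()

# Infinite-dilution average (smooth 1/E flux, no self-shielding)
phi_inf = 1.0 / E
sigma_inf = (sigma_a[group] * phi_inf[group]).sum() / phi_inf[group].sum()

print(sigma_g < sigma_inf)  # self-shielding lowers the group average: True
```

    The flux depression at the resonance deweights the peak, so the self-shielded group constant falls below its infinite-dilution value, which is the qualitative effect the NR approximation captures.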

  2. Enhancing the performance of the light field microscope using wavefront coding

    PubMed Central

    Cohen, Noy; Yang, Samuel; Andalman, Aaron; Broxton, Michael; Grosenick, Logan; Deisseroth, Karl; Horowitz, Mark; Levoy, Marc

    2014-01-01

    Light field microscopy has been proposed as a new high-speed volumetric computational imaging method that enables reconstruction of 3-D volumes from captured projections of the 4-D light field. Recently, a detailed physical optics model of the light field microscope has been derived, which led to the development of a deconvolution algorithm that reconstructs 3-D volumes with high spatial resolution. However, the spatial resolution of the reconstructions has been shown to be non-uniform across depth, with some z planes showing high resolution and others, particularly at the center of the imaged volume, showing very low resolution. In this paper, we enhance the performance of the light field microscope using wavefront coding techniques. By including phase masks in the optical path of the microscope we are able to address this non-uniform resolution limitation. We have also found that superior control over the performance of the light field microscope can be achieved by using two phase masks rather than one, placed at the objective’s back focal plane and at the microscope’s native image plane. We present an extended optical model for our wavefront coded light field microscope and develop a performance metric based on Fisher information, which we use to choose adequate phase mask parameters. We validate our approach using both simulated data and experimental resolution measurements of a USAF 1951 resolution target; and demonstrate the utility for biological applications with in vivo volumetric calcium imaging of larval zebrafish brain. PMID:25322056

  3. Enhancing the performance of the light field microscope using wavefront coding.

    PubMed

    Cohen, Noy; Yang, Samuel; Andalman, Aaron; Broxton, Michael; Grosenick, Logan; Deisseroth, Karl; Horowitz, Mark; Levoy, Marc

    2014-10-06

    Light field microscopy has been proposed as a new high-speed volumetric computational imaging method that enables reconstruction of 3-D volumes from captured projections of the 4-D light field. Recently, a detailed physical optics model of the light field microscope has been derived, which led to the development of a deconvolution algorithm that reconstructs 3-D volumes with high spatial resolution. However, the spatial resolution of the reconstructions has been shown to be non-uniform across depth, with some z planes showing high resolution and others, particularly at the center of the imaged volume, showing very low resolution. In this paper, we enhance the performance of the light field microscope using wavefront coding techniques. By including phase masks in the optical path of the microscope we are able to address this non-uniform resolution limitation. We have also found that superior control over the performance of the light field microscope can be achieved by using two phase masks rather than one, placed at the objective's back focal plane and at the microscope's native image plane. We present an extended optical model for our wavefront coded light field microscope and develop a performance metric based on Fisher information, which we use to choose adequate phase mask parameters. We validate our approach using both simulated data and experimental resolution measurements of a USAF 1951 resolution target; and demonstrate the utility for biological applications with in vivo volumetric calcium imaging of larval zebrafish brain.

  4. Nested Dissection Interface Reconstruction in Pececillo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jibben, Zechariah Joel; Carlson, Neil N.; Francois, Marianne M.

    A nested dissection method for interface reconstruction in a volume tracking framework has been implemented in Pececillo, a mini-app for Truchas, which is the ASC code for casting and additive manufacturing. This method provides a significant improvement over the traditional onion-skin method, which does not appropriately handle T-shaped multimaterial intersections and dynamic contact lines present in additive manufacturing simulations. The resulting implementation lays the groundwork for further research in contact angle estimates and surface tension calculations.

  5. Study on the key technology of optical encryption based on compressive ghost imaging with double random-phase encoding

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Pan, Zilan; Liang, Dong; Ma, Xiuhua; Zhang, Dawei

    2015-12-01

    An optical encryption method based on compressive ghost imaging (CGI) with double random-phase encoding (DRPE), named DRPE-CGI, is proposed. The information is first encrypted by the sender with DRPE; the DRPE-coded image is then encrypted by the computational ghost imaging system with a secret key. The key of N random-phase vectors is generated by the sender and shared with the receiver, the authorized user. The receiver decrypts the DRPE-coded image with the key, with the aid of CGI and a compressive sensing technique, and then reconstructs the original information by DRPE decoding. The experiments suggest that cryptanalysts cannot obtain any useful information about the original image even if they eavesdrop on 60% of the key at a given time, so the security of DRPE-CGI is higher than that of conventional ghost imaging. Furthermore, this method can reduce the information quantity by 40% compared with ghost imaging while the quality of the reconstructed information is the same. It can also improve the quality of the reconstructed plaintext information compared with DRPE-GI at the same number of sampling times. This technique can be immediately applied to encryption and data storage, with the advantages of high security, fast transmission, and high quality of reconstructed information.
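    The DRPE stage of the scheme can be sketched with a minimal NumPy model (a textbook double random-phase encoder, not the authors' code; the CGI and compressive sensing stages are omitted): the plaintext is multiplied by a random phase in the input plane and a second random phase in the Fourier plane, and decryption reverses the two steps with the key.

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, phi1, phi2):
    # Random phase in the input plane, then a second random phase
    # in the Fourier plane; the cipher is a noise-like complex field
    field = img * np.exp(2j * np.pi * phi1)
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * phi2))

def drpe_decrypt(cipher, phi1, phi2):
    # Undo the Fourier-plane phase, then the input-plane phase
    field = np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * phi2))
    return np.abs(field * np.exp(-2j * np.pi * phi1))

img = rng.random((64, 64))    # stand-in for the plaintext image
phi1 = rng.random((64, 64))   # key: input-plane random phase
phi2 = rng.random((64, 64))   # key: Fourier-plane random phase

cipher = drpe_encrypt(img, phi1, phi2)
recovered = drpe_decrypt(cipher, phi1, phi2)
print(np.allclose(recovered, img))  # True
```

    Without both phase keys the cipher is statistically white, which is the property the DRPE-CGI scheme builds on before the ghost-imaging layer is added.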

  6. Optimal bit allocation for hybrid scalable/multiple-description video transmission over wireless channels

    NASA Astrophysics Data System (ADS)

    Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.

    2006-01-01

    In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding, and product code Reed-Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed for discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Also, comparisons with classical scalable coding show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.
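    Allocating the bitrate over discrete sets of source and channel coding rates amounts to an exhaustive search minimizing expected distortion under a rate budget. A minimal single-layer sketch (all operational distortion and loss-probability numbers are invented for illustration, not taken from the paper):

```python
import itertools

# Hypothetical operational curves.
# Source coding: distortion (MSE) achieved at each source bitrate (kbps)
distortion = {100: 60.0, 200: 35.0, 300: 22.0, 400: 15.0}
# Channel coding: RCPC code rates and the resulting loss probability
# on the assumed fading channel
loss_prob = {1/3: 0.01, 1/2: 0.05, 2/3: 0.15, 4/5: 0.30}

D_LOSS = 120.0   # distortion if the layer is lost (concealment only)
BUDGET = 500.0   # total transmission-rate budget (kbps)

best = None
for r_s, r_c in itertools.product(distortion, loss_prob):
    if r_s / r_c > BUDGET:       # source bits expand by 1/r_c after channel coding
        continue
    p = loss_prob[r_c]
    expected_d = (1 - p) * distortion[r_s] + p * D_LOSS
    if best is None or expected_d < best[0]:
        best = (expected_d, r_s, r_c)

print(best)
```

    The search trades source fidelity against protection: spending more of the budget on channel coding lowers the loss probability but forces a coarser source rate, and the optimum balances the two.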

  7. Towards explaining spatial touch perception: Weighted integration of multiple location codes

    PubMed Central

    Badde, Stephanie; Heed, Tobias

    2016-01-01

    ABSTRACT Touch is bound to the skin – that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex just mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information. PMID:27327353

  8. RootJS: Node.js Bindings for ROOT 6

    NASA Astrophysics Data System (ADS)

    Beffart, Theo; Früh, Maximilian; Haas, Christoph; Rajgopal, Sachin; Schwabe, Jonas; Wolff, Christoph; Szuba, Marek

    2017-10-01

    We present rootJS, an interface making it possible to seamlessly integrate ROOT 6 into applications written for Node.js, the JavaScript runtime platform increasingly used to create high-performance Web applications. ROOT features can be called both directly from Node.js code and by JIT-compiling C++ macros. All rootJS methods are invoked asynchronously and support callback functions, allowing non-blocking operation of Node.js applications using them. Last but not least, our bindings have been designed to be platform-independent and should therefore work on all systems supporting both ROOT 6 and Node.js. Thanks to rootJS it is now possible to create ROOT-aware Web applications taking full advantage of the high performance and extensive capabilities of Node.js. Examples include platforms for the quality assurance of acquired, reconstructed or simulated data, book-keeping and e-log systems, and even Web browser-based data visualisation and analysis.

  9. The reconstructive microsurgery ladder in orthopaedics.

    PubMed

    Tintle, Scott M; Levin, L Scott

    2013-03-01

    Since the advent of the operating microscope, introduced by Julius Jacobson in 1960, reconstructive microsurgery has become an integral part of extremity reconstruction and orthopaedics. During World War I, with the influx of severe extremity trauma, Harold Gillies introduced the concept of the reconstructive ladder for wound closure, which progresses from simple to complex means of attaining wound closure. Over the last half century, microsurgery has continued to evolve and progress, and we now have a microsurgical reconstructive ladder. This ladder is based upon the early work on revascularization and replantation, extending through the procedures described in this article. Copyright © 2013. Published by Elsevier Ltd.

  10. Health-related quality-of-life assessment and surgical outcomes for auricular reconstruction using autologous costal cartilage.

    PubMed

    Soukup, Benjamin; Mashhadi, Syed A; Bulstrode, Neil W

    2012-03-01

    This study aims to assess the health-related quality-of-life benefit following auricular reconstruction using autologous costal cartilage in children. In addition, key aspects of the surgical reconstruction are assessed. After auricular reconstruction, patients completed two questionnaires. The first was a postinterventional health-related quality-of-life assessment tool, the Glasgow Benefit Inventory. A score of 0 signifies no change in health-related quality of life, +100 indicates maximal improvement, and -100 indicates maximal negative impact. The second questionnaire assessed surgical outcomes in auricular reconstruction across three areas: facial integration, aesthetic auricular units, and costal reconstruction. These were recorded on a five-point ordinal scale and are presented as mean scores out of a total of 5. The mean total Glasgow Benefit Inventory score was 48.1; significant improvements were seen in all three Glasgow Benefit Inventory subscales (p < 0.0001). A mean integration score of 3.8 and a mean aesthetic auricular unit reconstruction score of 3.4 were recorded. Skin color matching (4.3) of the ear was most successfully reconstructed, and auricular cartilage reconstruction scored lowest (3.5). Of the aesthetic units, the helix scored highest (3.6) and the tragus/antitragus scored lowest (3.3). Donor-site reconstruction scored 3.9. Correlation analysis revealed that higher reconstruction scores are associated with a greater health-related quality-of-life gain (r = 0.5). Ninety-six percent of patients would recommend the procedure to a friend. Auricular reconstruction with autologous cartilage results in significant improvements in health-related quality of life. In addition, better surgical outcomes lead to a greater improvement in health-related quality of life. Comparatively poorly reconstructed areas of the ear were identified so that surgical techniques may be improved. Therapeutic, IV.

  11. Reconstructing genome-wide regulatory network of E. coli using transcriptome data and predicted transcription factor activities

    PubMed Central

    2011-01-01

    Background Gene regulatory networks play essential roles in living organisms to control growth, keep internal metabolism running and respond to external environmental changes. Understanding the connections and the activity levels of regulators is important for the research of gene regulatory networks. While relevance score based algorithms that reconstruct gene regulatory networks from transcriptome data can infer genome-wide gene regulatory networks, they are unfortunately prone to false positive results. Transcription factor activities (TFAs) quantitatively reflect the ability of the transcription factor to regulate target genes. However, classic relevance score based gene regulatory network reconstruction algorithms use models that do not include the TFA layer, thus missing a key regulatory element. Results This work integrates TFA prediction algorithms with relevance score based network reconstruction algorithms to reconstruct gene regulatory networks with improved accuracy over classic relevance score based algorithms. This method is called Gene expression and Transcription factor activity based Relevance Network (GTRNetwork). Different combinations of TFA prediction algorithms and relevance score functions have been applied to find the most efficient combination. When the integrated GTRNetwork method was applied to E. coli data, the reconstructed genome-wide gene regulatory network predicted 381 new regulatory links. The reconstructed gene regulatory network, including the predicted new regulatory links, shows promising biological significance. Many of the new links are verified by known TF binding site information, and many others can be verified from the literature and databases such as EcoCyc. The reconstructed gene regulatory network is applied to a recent transcriptome analysis of E. coli during isobutanol stress.
In addition to the 16 significantly changed TFAs detected in the original paper, another 7 significantly changed TFAs have been detected by using our reconstructed network. Conclusions The GTRNetwork algorithm introduces the hidden TFA layer into classic relevance score based gene regulatory network reconstruction processes. Integrating this biological information with regulatory network reconstruction algorithms significantly improves the detection of new links and reduces the rate of false positives. The application of GTRNetwork to E. coli gene transcriptome data gives a set of potential regulatory links with promising biological significance for isobutanol stress and other conditions. PMID:21668997
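    A minimal sketch of the GTRNetwork idea on synthetic data, with simplifications (least-squares TFA estimation from a known connectivity prior, and absolute Pearson correlation as the relevance score; the paper evaluates several such combinations, and these choices are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: expression of G genes across S samples, driven by T hidden TFAs
T, G, S = 3, 40, 30
tfa_true = rng.normal(size=(T, S))                               # hidden TF activities
W_true = np.where(rng.random((G, T)) < 0.2, rng.normal(size=(G, T)), 0.0)
expr = W_true @ tfa_true + 0.1 * rng.normal(size=(G, S))

# Step 1 (TFA prediction): with a known connectivity prior C (here the
# sparsity pattern of W_true, as a stand-in for prior TF-target knowledge),
# estimate TFAs by least squares:  expr ~= C @ TFA
C = (W_true != 0).astype(float)
tfa_est, *_ = np.linalg.lstsq(C, expr, rcond=None)               # shape (T, S)

# Step 2 (relevance scoring): absolute Pearson correlation between each
# estimated TFA profile and each gene's expression profile
def zscore(a):
    return (a - a.mean(axis=1, keepdims=True)) / a.std(axis=1, keepdims=True)

score = np.abs(zscore(expr) @ zscore(tfa_est).T) / S             # (G, T) relevance matrix
links = score > 0.5                                              # thresholded TF-gene links
print(links.sum(), "predicted TF-gene links")
```

    The point of the extra layer is that relevance is computed against estimated regulator activities rather than against the regulator's own transcript level, which is what reduces spurious links in the full method.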

  12. Equilibrium Spline Interface (ESI) for magnetic confinement codes

    NASA Astrophysics Data System (ADS)

    Li, Xujing; Zakharov, Leonid E.

    2017-12-01

    A compact and comprehensive interface between magnetohydrodynamic (MHD) equilibrium codes and gyro-kinetic, particle orbit, MHD stability, and transport codes is presented. Its irreducible set of equilibrium data consists of three functions of coordinates (in the 2-D case; occasionally one extra in the 3-D case) and four 1-D radial profiles, together with their first and mixed derivatives. The C reconstruction routines, accessible also from FORTRAN, allow the calculation of basis functions and their first derivatives at any position inside the plasma and in its vicinity. After this, all vector fields and geometric coefficients required for the above-mentioned types of codes can be calculated using only algebraic operations, with no further interpolation or differentiation.
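    The kind of service such an interface provides, values and first derivatives of smooth equilibrium quantities at arbitrary points, can be illustrated with SciPy's bivariate spline (a stand-in only; ESI's C routines, basis functions, and data layout differ, and the equilibrium function here is invented):

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Toy "equilibrium" quantity psi(a, theta) on a radial/poloidal grid
a = np.linspace(0.0, 1.0, 65)
theta = np.linspace(0.0, 2.0 * np.pi, 129)
A, TH = np.meshgrid(a, theta, indexing="ij")
psi = (A ** 2) * (1.0 + 0.1 * np.cos(TH))

spl = RectBivariateSpline(a, theta, psi)   # cubic spline by default

# Values and first derivatives at an arbitrary interior point
pt = (0.37, 1.2)
val = spl(*pt)[0, 0]
d_da = spl(*pt, dx=1)[0, 0]      # d(psi)/da
d_dth = spl(*pt, dy=1)[0, 0]     # d(psi)/d(theta)
print(val, d_da, d_dth)
```

    Once values and first derivatives are available everywhere, downstream codes can build vector fields and geometric coefficients by algebra alone, which is the division of labour the abstract describes.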

  13. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

    The performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD for the study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  14. The EUCLID/V1 Integrated Code for Safety Assessment of Liquid Metal Cooled Fast Reactors. Part 1: Basic Models

    NASA Astrophysics Data System (ADS)

    Mosunova, N. A.

    2018-05-01

    The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, and fuel rod analysis modules, and the burnup and decay heat calculation modules), the main calculated quantities, and also the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, using which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.

  15. Tomography reconstruction methods for damage diagnosis of wood structure in construction field

    NASA Astrophysics Data System (ADS)

    Qiu, Qiwen; Lau, Denvid

    2018-03-01

    The structural integrity of wood building elements plays a critical role in public safety, which requires effective methods for the diagnosis of internal damage inside the wood body. Conventionally, non-destructive testing (NDT) methods such as X-ray computed tomography, thermography, radar imaging reconstruction, ultrasonic tomography, nuclear magnetic imaging, and sonic tomography have been used to obtain information about the internal structure of wood. In this paper, the applications, advantages, and disadvantages of these traditional tomography methods are reviewed. Additionally, the present article gives an overview of a recently developed tomography approach that relies on the use of mechanical and electromagnetic waves for assessing the structural integrity of wood buildings. This developed tomography reconstruction method is believed to provide a more accurate, reliable, and comprehensive assessment of wood structural integrity.

  16. Reconstruction of a Three-Dimensional Transonic Rotor Flow Field from Holographical Interferogram Data.

    DTIC Science & Technology

    1985-03-01

    Holographic interferometry and computer-assisted tomography (CAT) are used to determine the transonic velocity field of a model rotor. After extracting fringe-order functions from the interferograms, the data are transferred to a CAT code, which then calculates the perturbation velocity in several planes above the blade surface.

  17. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines.

    PubMed

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common among professional organizations in the social sciences (82%), mental health (71%), and sciences (61%), other organizations had either no such statements (construction trades, fraternal social organizations, real estate) or only a few (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.

  18. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for post-flight reporting to project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process draws on several methodologies, including multiple software programs, to model integrated propulsion system performance through Space Shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tank, feed system, rocket engine, and pressurization system performance throughout ascent, based on flight pressure and temperature data. The latest revision incorporates new methods, based on main engine power balance model updates, to model higher mixture-ratio operation at lower engine power levels.

  19. Computational microscopy: illumination coding and nonlinear optimization enables gigapixel 3D phase imaging

    NASA Astrophysics Data System (ADS)

    Tian, Lei; Waller, Laura

    2017-05-01

    Microscope lenses can have either a large field of view (FOV) or high resolution, not both. Computational microscopy based on illumination coding circumvents this limit by fusing images from different illumination angles using nonlinear optimization algorithms. The result is a Gigapixel-scale image having both wide FOV and high resolution. We demonstrate an experimentally robust reconstruction algorithm based on a second-order quasi-Newton method, combined with a novel phase initialization scheme. To further extend the Gigapixel imaging capability to 3D, we develop a reconstruction method to process the 4D light field measurements from sequential illumination scanning. The algorithm is based on a 'multislice' forward model that incorporates both 3D phase and diffraction effects, as well as multiple forward scatterings. To solve the inverse problem, an iterative update procedure that combines both phase retrieval and 'error back-propagation' is developed. To avoid local-minimum solutions, we further develop a novel physical model-based initialization technique that accounts for both the geometric-optics and first-order phase effects. The result is robust reconstruction of Gigapixel 3D phase images having both wide FOV and super-resolution in all three dimensions. Experimental results from an LED array microscope are presented.

  20. lpNet: a linear programming approach to reconstruct signal transduction networks.

    PubMed

    Matos, Marta R A; Knapp, Bettina; Kaderali, Lars

    2015-10-01

    With the widespread availability of high-throughput experimental technologies it has become possible to study hundreds to thousands of cellular factors simultaneously, such as coding or non-coding mRNA or protein concentrations. Still, extracting information about the underlying regulatory or signaling interactions from these data remains a difficult challenge. We present a flexible approach towards network inference based on linear programming. Our method reconstructs the interactions of factors from a combination of perturbation/non-perturbation and steady-state/time-series data. We show both on simulated and real data that our methods are able to reconstruct the underlying networks fast and efficiently, thus shedding new light on biological processes and, in particular, into disease mechanisms of action. We have implemented the approach as an R package available through Bioconductor. This R package is freely available under the GNU General Public License (GPL-3) from bioconductor.org (http://bioconductor.org/packages/release/bioc/html/lpNet.html) and is compatible with most operating systems (Windows, Linux, Mac OS) and hardware architectures. bettina.knapp@helmholtz-muenchen.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
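    The LP formulation behind this style of network inference can be sketched as follows (a simplified linear steady-state model with an L1 objective, written with scipy.optimize.linprog rather than the lpNet R package; the network and perturbations are invented). Each steady state satisfies x = Wx + p, and the sparsest weight matrix fitting all experiments is found by minimizing the sum of absolute weights subject to those equalities:

```python
import numpy as np
from scipy.optimize import linprog

# Toy ground truth: a sparse 3-node network (hypothetical weights)
n = 3
W_true = np.zeros((n, n))
W_true[1, 0] = 0.5     # node 0 activates node 1
W_true[2, 1] = -0.4    # node 1 inhibits node 2

# Steady-state model x = W x + p; simulate one perturbation per node
P = np.eye(n)                                  # perturbation k hits node k
X = np.linalg.solve(np.eye(n) - W_true, P)     # columns are steady states

# LP: minimise sum |W_ij| subject to fitting every steady state exactly.
# Split W = U - V with U, V >= 0 so the objective becomes linear.
N = n * n
c = np.ones(2 * N)
A_eq = np.zeros((n * n, 2 * N))                # one row per (node i, experiment k)
b_eq = np.zeros(n * n)
row = 0
for i in range(n):
    for k in range(n):
        for j in range(n):
            A_eq[row, i * n + j] = X[j, k]         # U part
            A_eq[row, N + i * n + j] = -X[j, k]    # V part
        b_eq[row] = X[i, k] - P[i, k]              # (W x)_i = x_i - p_i
        row += 1

res = linprog(c, A_eq=A_eq, b_eq=b_eq, method="highs")
W_est = (res.x[:N] - res.x[N:]).reshape(n, n)
print(np.round(W_est, 3))
```

    With noisy data the equalities would be relaxed to penalized slack variables, but the core device, an L1 objective made linear by variable splitting, is the same.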

  1. 3D Representative Volume Element Reconstruction of Fiber Composites via Orientation Tensor and Substructure Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yi; Chen, Wei; Xu, Hongyi

    To provide a seamless integration of manufacturing processing simulation and fiber microstructure modeling, two new stochastic 3D microstructure reconstruction methods are proposed for two types of random fiber composites: random short fiber composites, and Sheet Molding Compound (SMC) chopped fiber composites. A Random Sequential Adsorption (RSA) algorithm is first developed to embed statistical orientation information into 3D RVE reconstruction of random short fiber composites. For the SMC composites, an optimized Voronoi diagram based approach is developed for capturing the substructure features of SMC chopped fiber composites. The proposed methods are distinguished from other reconstruction works by providing a way of integrating statistical information (the fiber orientation tensor) obtained from material processing simulation, as well as capturing the multiscale substructures of the SMC composites.
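    The RSA idea, sequentially drawing candidate fibers and accepting only those that do not conflict with fibers already placed, can be sketched minimally (here in 2-D with a midpoint-distance overlap test and a made-up orientation distribution standing in for the orientation tensor supplied by processing simulation):

```python
import numpy as np

rng = np.random.default_rng(42)

def rsa_fibers(n_fibers, min_dist, box=1.0, max_tries=20000):
    """Random Sequential Adsorption: draw candidate fiber midpoints and
    orientations, accepting a candidate only if its midpoint keeps a
    minimum distance to every fiber already placed (a simplified
    overlap test standing in for a true fiber-fiber intersection check)."""
    placed = []                                   # (x, y, angle) triples
    tries = 0
    while len(placed) < n_fibers and tries < max_tries:
        tries += 1
        x, y = rng.uniform(0.0, box, size=2)
        angle = rng.normal(0.0, 0.3)              # stand-in orientation law
        if all((x - px) ** 2 + (y - py) ** 2 >= min_dist ** 2
               for px, py, _ in placed):
            placed.append((x, y, angle))
    return np.array(placed)

fibers = rsa_fibers(n_fibers=100, min_dist=0.04)

# Realized second-order orientation tensor component a11 = <cos^2 theta>,
# which a full implementation would match to the target tensor
a11 = np.mean(np.cos(fibers[:, 2]) ** 2)
print(fibers.shape[0], round(a11, 3))
```

    In the full 3-D method the acceptance test covers the whole fiber geometry and the orientation draws reproduce the processing-simulation tensor, but the sequential accept/reject structure is as above.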

  2. Tomographic reconstruction of ionospheric electron density during the storm of 5-6 August 2011 using multi-source data

    PubMed Central

    Tang, Jun; Yao, Yibin; Zhang, Liang; Kong, Jian

    2015-01-01

    The insufficiency of data is the essential cause of the ill-posed problem in the computerized ionospheric tomography (CIT) technique. Therefore, a method of integrating multi-source data is proposed. Currently, the multiple satellite navigation systems and various ionospheric observing instruments provide abundant data that can be employed to reconstruct ionospheric electron density (IED). In order to improve the vertical resolution of the IED, we study IED reconstruction by integrating ground-based GPS data, occultation data from LEO satellites, satellite altimetry data from Jason-1 and Jason-2, and ionosonde data. We compared the CIT results with incoherent scatter radar (ISR) observations and found that multi-source data fusion is effective and reliable for reconstructing electron density, showing its superiority over CIT with GPS data alone. PMID:26266764
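    The algebraic reconstruction technique (ART, the Kaczmarz iteration) commonly used in CIT can be sketched on a toy, fully determined system (the geometry here is random rather than ray-traced, and real CIT is ill-posed, which is exactly why the multi-source integration above matters):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy tomography: reconstruct x (flattened electron-density grid) from
# line integrals y = A x; row i of A holds ray i's path lengths per voxel
n_vox, n_rays = 100, 300
A = rng.random((n_rays, n_vox)) * (rng.random((n_rays, n_vox)) < 0.1)
x_true = rng.random(n_vox)
y = A @ x_true

# ART / Kaczmarz: cyclically project the estimate onto each ray equation
x = np.zeros(n_vox)
for sweep in range(100):
    for i in range(n_rays):
        a = A[i]
        nrm = a @ a
        if nrm > 0.0:
            x += ((y[i] - a @ x) / nrm) * a

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(round(rel_err, 6))
```

    When the rays leave whole regions unconstrained, as in single-system CIT, the iteration converges to one of many consistent solutions; adding occultation, altimetry, and ionosonde rows to A is what pins down the vertical profile.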

  3. [Research on Three-dimensional Medical Image Reconstruction and Interaction Based on HTML5 and Visualization Toolkit].

    PubMed

    Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang

    2015-04-01

    Integrating the visualization toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and experimented on the feasibility of remote medical image reconstruction and interaction purely in the Web. We proposed a server-centric method that does not require downloading large medical data to local machines and avoids relying on network transmission capacity and the three-dimensional (3D) rendering capability of client hardware. The method integrates remote medical image reconstruction and interaction into the Web seamlessly, making it applicable to lower-end computers and mobile devices. Finally, we tested this method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across internet terminals and performance-limited devices, may be useful for remote medical assistance.

  4. Tomographic reconstruction of ionospheric electron density during the storm of 5-6 August 2011 using multi-source data.

    PubMed

    Tang, Jun; Yao, Yibin; Zhang, Liang; Kong, Jian

    2015-08-12

    The insufficiency of data is the essential cause of the ill-posed problem in the computerized ionospheric tomography (CIT) technique. Therefore, a method of integrating multi-source data is proposed. Currently, the multiple satellite navigation systems and various ionospheric observing instruments provide abundant data that can be employed to reconstruct ionospheric electron density (IED). In order to improve the vertical resolution of the IED, we study IED reconstruction by integrating ground-based GPS data, occultation data from LEO satellites, satellite altimetry data from Jason-1 and Jason-2, and ionosonde data. We compared the CIT results with incoherent scatter radar (ISR) observations and found that multi-source data fusion is effective and reliable for reconstructing electron density, showing its superiority over CIT with GPS data alone.

  5. Real-time capture and reconstruction system with multiple GPUs for a 3D live scene by a generation from 4K IP images to 8K holograms.

    PubMed

    Ichihashi, Yasuyuki; Oi, Ryutaro; Senoh, Takanori; Yamamoto, Kenji; Kurita, Taiichiro

    2012-09-10

    We developed a real-time capture and reconstruction system for three-dimensional (3D) live scenes. In previous research, we used integral photography (IP) to capture 3D images and then generated holograms from the IP images to implement a real-time reconstruction system. In this paper, we use a 4K (3,840 × 2,160) camera to capture IP images and 8K (7,680 × 4,320) liquid crystal display (LCD) panels for the reconstruction of holograms. We investigate two methods for enlarging the 4K images that were captured by integral photography to 8K images. One of the methods increases the number of pixels of each elemental image. The other increases the number of elemental images. In addition, we developed a personal computer (PC) cluster system with graphics processing units (GPUs) for the enlargement of IP images and the generation of holograms from the IP images using fast Fourier transform (FFT). We used the Compute Unified Device Architecture (CUDA) as the development environment for the GPUs. The Fast Fourier transform is performed using the CUFFT (CUDA FFT) library. As a result, we developed an integrated system for performing all processing from the capture to the reconstruction of 3D images by using these components and successfully used this system to reconstruct a 3D live scene at 12 frames per second.
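The first of the two enlargement methods above (increasing the pixel count of each elemental image) can be sketched with a nearest-neighbour pixel repeat, followed by a 2D FFT standing in for the CUFFT hologram-generation stage; the toy array and the choice of repeat filter are illustrative assumptions:

```python
import numpy as np

# Toy 4K -> 8K enlargement: double the pixel count of an elemental image by
# nearest-neighbour repetition (the paper's actual interpolation filter is
# not specified here), then apply a 2D FFT as the hologram-generation stage.
ip_4k = np.arange(12, dtype=np.float64).reshape(3, 4)   # toy elemental image
ip_8k = np.kron(ip_4k, np.ones((2, 2)))                 # 2x pixel repeat
hologram = np.fft.fft2(ip_8k)                           # stand-in for CUFFT
```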

  6. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using either experimental or numerical techniques. This article presents SESAME (Simulation of External Source Accident with MEdical images), a laboratory-developed tool specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.
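As a very crude stand-in for the Monte Carlo transport that SESAME delegates to MCNP(X), the sketch below evaluates only the primary-photon falloff exp(-mu*r)/r² of an external point source over a voxel grid; scatter, the source spectrum, tissue composition and the value of `mu` are all simplifying assumptions:

```python
import numpy as np

# Primary-photon dose kernel over a voxel "phantom": exponential attenuation
# times inverse-square falloff from a point source outside the grid.
mu = 0.02                                            # attenuation, 1/mm (assumed)
zz, yy, xx = np.indices((20, 20, 20)).astype(float)  # 1 mm voxel centres
src = (0.0, 10.0, 10.0)                              # source position (assumed)
r = np.sqrt((zz - src[0])**2 + (yy - src[1])**2 + (xx - src[2])**2) + 0.5
dose = np.exp(-mu * r) / r**2                        # relative dose per voxel
```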

  7. Digital tomosynthesis mammography using a parallel maximum-likelihood reconstruction method

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Zhang, Juemin; Moore, Richard; Rafferty, Elizabeth; Kopans, Daniel; Meleis, Waleed; Kaeli, David

    2004-05-01

    A parallel reconstruction method, based on an iterative maximum likelihood (ML) algorithm, is developed to provide fast reconstruction for digital tomosynthesis mammography. Tomosynthesis mammography acquires 11 low-dose projections of a breast by moving an x-ray tube over a 50° angular range. In parallel reconstruction, each projection is divided into multiple segments along the chest-to-nipple direction. Using the 11 projections, segments located at the same distance from the chest wall are combined to compute a partial reconstruction of the total breast volume. Each partial reconstruction forms a thin slab, angled toward the x-ray source at a projection angle of 0°. The reconstruction of the total breast volume is obtained by merging the partial reconstructions. The overlap region between neighboring partial reconstructions and neighboring projection segments is utilized to compensate for the incomplete data at the boundary locations of the partial reconstructions. A serial execution of the reconstruction is compared to a parallel implementation, using clinical data. The serial code was run on a PC with a single Pentium IV 2.2 GHz CPU. The parallel implementation was developed using MPI and run on a 64-node Linux cluster of 800 MHz Itanium CPUs. The serial reconstruction for a medium-sized breast (5 cm thickness, 11 cm chest-to-nipple distance) takes 115 minutes, while the parallel implementation takes only 3.5 minutes. For a larger breast, the serial reconstruction takes 187 minutes and the parallel implementation 6.5 minutes. No significant differences were observed between the reconstructions produced by the serial and parallel implementations.
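The ML iteration that the parallel scheme distributes across projection segments can be sketched serially on a toy nonnegative system; the real tomosynthesis geometry, the 11 projections and the segment merging are omitted:

```python
import numpy as np

# Minimal MLEM loop: multiplicative update x <- x * A^T(b / Ax) / A^T 1,
# applied to a toy strictly-positive system matrix (not real geometry).
rng = np.random.default_rng(1)
A = rng.random((40, 20)) + 0.1          # system matrix (strictly positive)
x_true = rng.random(20) + 0.1
b = A @ x_true                          # noise-free projections

x = np.ones(20)                         # flat initial estimate
sens = A.T @ np.ones(40)                # sensitivity (back-projected ones)
for _ in range(200):
    x *= (A.T @ (b / (A @ x))) / sens   # multiplicative ML update

rel_res = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
```

The parallel version of the paper runs updates like this independently per segment, then reconciles the slab overlaps.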

  8. Reference View Selection in DIBR-Based Multiview Coding.

    PubMed

    Maugey, Thomas; Petrazzuoli, Giovanni; Frossard, Pascal; Cagnazzo, Marco; Pesquet-Popescu, Beatrice

    2016-04-01

    Augmented reality, interactive navigation in 3D scenes, multiview video, and other emerging multimedia applications require large sets of images, hence larger data volumes and increased resources compared with traditional video services. The significant increase in the number of images in multiview systems leads to new challenging problems in data representation and data transmission to provide high quality of experience in resource-constrained environments. In order to reduce the size of the data, different multiview video compression strategies have been proposed recently. Most of them use the concept of reference or key views that are used to estimate other images when there is high correlation in the data set. In such coding schemes, the two following questions become fundamental: 1) how many reference views have to be chosen for keeping a good reconstruction quality under coding cost constraints? And 2) where to place these key views in the multiview data set? As these questions are largely overlooked in the literature, we study the reference view selection problem and propose an algorithm for the optimal selection of reference views in multiview coding systems. Based on a novel metric that measures the similarity between the views, we formulate an optimization problem for the positioning of the reference views, such that both the distortion of the view reconstruction and the coding rate cost are minimized. We solve this new problem with a shortest path algorithm that determines both the optimal number of reference views and their positions in the image set. We experimentally validate our solution in a practical multiview distributed coding system and in the standardized 3D-HEVC multiview coding scheme. We show that considering the 3D scene geometry in the reference view positioning problem brings significant rate-distortion improvements and outperforms the traditional coding strategy that simply selects key frames based on the distance between cameras.
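The shortest-path formulation can be sketched as a dynamic program in which each reference view covers a contiguous run of cameras, paying a fixed rate cost plus a distortion term; the cost model and the mid-segment reference placement below are illustrative assumptions, not the paper's metric:

```python
# DP over view indices: best[k] is the minimum cost of covering views 0..k-1,
# where the last reference serves the segment start..k-1 from its midpoint.
def select_references(n_views, rate_cost, dist):
    # dist(i, j): distortion of predicting view j from reference i (assumed)
    INF = float("inf")
    best = [0.0] + [INF] * n_views
    choice = [None] * (n_views + 1)
    for k in range(1, n_views + 1):
        for start in range(k):              # last segment is views start..k-1
            ref = (start + k - 1) // 2      # place reference mid-segment
            cost = best[start] + rate_cost + sum(
                dist(ref, j) for j in range(start, k))
            if cost < best[k]:
                best[k], choice[k] = cost, (start, ref)
    refs, k = [], n_views                   # backtrack the chosen segments
    while k > 0:
        start, ref = choice[k]
        refs.append(ref)
        k = start
    return sorted(refs)
```

With a zero rate cost every view becomes its own reference; a large rate cost collapses the solution to a single central reference.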

  9. Communications and information research: Improved space link performance via concatenated forward error correction coding

    NASA Technical Reports Server (NTRS)

    Rao, T. R. N.; Seetharaman, G.; Feng, G. L.

    1996-01-01

    With the development of new advanced instruments for remote sensing applications, sensor data will be generated at a rate that not only requires increased onboard processing and storage capability, but also imposes demands on the space-to-ground communication link and the ground data management and communication system. Data compression and error-control codes provide viable means to alleviate these demands. Two types of data compression have been studied by many researchers in the area of information theory: lossless techniques, which guarantee full reconstruction of the data, and lossy techniques, which generally give a higher data compaction ratio but incur some distortion in the reconstructed data. To satisfy the many science disciplines which NASA supports, lossless data compression became the primary focus of the technology development. When transmitting losslessly compressed data, it is very important to use an error-control code. For a long time, convolutional codes have been widely used in satellite telecommunications. To transmit the data produced by the Rice algorithm more efficiently, it is necessary to obtain the a posteriori probability (APP) for each decoded bit. A relevant algorithm for this purpose has been proposed which minimizes the bit error probability in decoding linear block and convolutional codes and yields the APP for each decoded bit. However, recent results on iterative decoding of 'Turbo codes' turn conventional wisdom on its head and suggest fundamentally new techniques. 
During the past several months of this research, the following results have been developed: (1) a new lossless data compression algorithm, which performs much better than the extended Rice algorithm for various types of sensor data; (2) a new approach to determine the generalized Hamming weights of the algebraic-geometric codes defined by a large class of curves in high-dimensional spaces; (3) some efficient improved geometric Goppa codes for disk memory systems and high-speed mass memory systems; and (4) a tree-based approach for data compression using dynamic programming.
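The Rice algorithm mentioned above is, at its core, a Golomb power-of-two code; a minimal encoder/decoder pair (without the adaptive parameter selection or signed-residual mapping of the full algorithm) looks like this:

```python
# Rice (Golomb power-of-two) coding of a non-negative integer n with
# parameter k: the quotient n >> k in unary (terminated by '0'), then the
# low k bits of n in binary.
def rice_encode(n, k):
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def rice_decode(bits, k):
    q = bits.index("0")                       # unary quotient length
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r
```

Small residuals get short codewords, which is why the scheme suits decorrelated sensor data.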

  10. Conservative bin-to-bin fractional collisions

    NASA Astrophysics Data System (ADS)

    Martin, Robert

    2016-11-01

    Particle methods such as direct simulation Monte Carlo (DSMC) and particle-in-cell (PIC) are commonly used to model rarefied kinetic flows for engineering applications because of their ability to efficiently capture non-equilibrium behavior. The primary drawback of these methods is their poor convergence, stemming from their stochastic nature; they typically rely heavily on high degrees of non-equilibrium and on time averaging to compensate for poor signal-to-noise ratios. In standard implementations, each computational particle represents many physical particles, which further exacerbates the statistical noise for flows with large species density variation, such as those encountered in flow expansions and chemical reactions. The stochastic weighted particle method (SWPM) introduced by Rjasanow and Wagner overcomes this difficulty by allowing the ratio of real to computational particles to vary on a per-particle basis throughout the flow. The DSMC procedure must also be slightly modified to properly sample the Boltzmann collision integral, accounting for the variable particle weights and avoiding the creation of additional particles with negative weight. In this work, the SWPM, with the modifications necessary to incorporate the variable hard sphere (VHS) collision cross-section model commonly used in engineering applications, is first incorporated into an existing engineering code, the Thermophysics Universal Research Framework. The results and computational efficiency of the adapted SWPM/VHS collision scheme, used with an octree-based conservative phase-space reconstruction, are compared on a few simple test cases against a standard, validated implementation of the DSMC method. The SWPM is then further extended to combine the collision and phase-space reconstruction into a single step, which avoids the need to create additional computational particles only to destroy them again during the particle merge. 
This is particularly helpful when oversampling the collision integral relative to the standard DSMC method. It is found, however, that the more frequent phase-space reconstructions can cause added numerical thermalization at low particle-per-cell counts, due to the coarseness of the octree used. Nevertheless, the methods are expected to be of much greater utility for transient expansion flows and chemical reactions in the future.
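The conservative flavor of the scheme can be illustrated in one dimension: a set of weighted particles is replaced by two equal-weight particles that preserve total weight, momentum and kinetic energy exactly. This is a generic weighted-merge sketch, not the paper's octree procedure:

```python
import math

# Merge weighted 1D particles into two particles at vbar +/- sigma, which
# reproduces the total weight W, the momentum W*vbar and the energy
# W*(vbar^2 + sigma^2) of the original set exactly.
def merge_1d(weights, velocities):
    W = sum(weights)
    vbar = sum(w * v for w, v in zip(weights, velocities)) / W
    v2bar = sum(w * v * v for w, v in zip(weights, velocities)) / W
    sigma = math.sqrt(max(v2bar - vbar * vbar, 0.0))
    return [(W / 2, vbar + sigma), (W / 2, vbar - sigma)]
```

In 3D each velocity component gets its own spread, and the octree cells decide which particles are merged together.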

  11. On the use of orientation filters for 3D reconstruction in event-driven stereo vision

    PubMed Central

    Camuñas-Mesa, Luis A.; Serrano-Gotarredona, Teresa; Ieng, Sio H.; Benosman, Ryad B.; Linares-Barranco, Bernabe

    2014-01-01

    The recently developed Dynamic Vision Sensors (DVS) sense visual information asynchronously and code it into trains of events with sub-microsecond temporal resolution. This high temporal precision makes the output of these sensors especially suited to dynamic 3D visual reconstruction, by matching corresponding events generated by two different sensors in a stereo setup. This paper explores the use of Gabor filters to extract information about the orientation of the object edges that produce the events, thereby increasing the number of constraints applied to the matching algorithm. This strategy provides more reliably matched pairs of events, improving the final 3D reconstruction. PMID:24744694
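The orientation cue can be sketched with a small bank of odd (sine-phase) Gabor filters: the orientation whose filter responds most strongly at an event location is attached to the event as an extra matching constraint. The filter size and parameters below are illustrative assumptions:

```python
import numpy as np

# Odd-phase Gabor kernel at angle theta: Gaussian envelope times a sine
# carrier along the rotated x axis (odd filters respond to step edges).
def gabor_odd(theta, size=9, sigma=2.0, lam=6.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.sin(2 * np.pi * xr / lam)

patch = np.zeros((9, 9)); patch[:, 5:] = 1.0        # vertical step edge
bank = {deg: gabor_odd(np.radians(deg)) for deg in (0, 45, 90, 135)}
best = max(bank, key=lambda d: abs((bank[d] * patch).sum()))
```

For the vertical edge the 0° filter (carrier varying horizontally) wins; events from that edge would carry orientation 0° into the stereo matcher.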

  12. Revealing the Physics of Galactic Winds Through Massively-Parallel Hydrodynamics Simulations

    NASA Astrophysics Data System (ADS)

    Schneider, Evan Elizabeth

    This thesis documents the hydrodynamics code Cholla and a numerical study of multiphase galactic winds. Cholla is a massively-parallel, GPU-based code designed for astrophysical simulations that is freely available to the astrophysics community. A static-mesh Eulerian code, Cholla is ideally suited to carrying out massive simulations (>2048³ cells) that require very high resolution. The code incorporates state-of-the-art hydrodynamics algorithms including third-order spatial reconstruction, exact and linearized Riemann solvers, and unsplit integration algorithms that account for transverse fluxes on multidimensional grids. Operator-split radiative cooling and a dual-energy formalism for high Mach number flows are also included. An extensive test suite demonstrates Cholla's superior ability to model shocks and discontinuities, while the GPU-native design makes the code extremely computationally efficient - speeds of 5-10 million cell updates per GPU-second are typical on current hardware for 3D simulations with all of the aforementioned physics. The latter half of this work comprises a comprehensive study of the mixing between a hot, supernova-driven wind and cooler clouds representative of those observed in multiphase galactic winds. Both adiabatic and radiatively-cooling clouds are investigated. The analytic theory of cloud-crushing is applied to the problem, and adiabatic turbulent clouds are found to be mixed with the hot wind on similar timescales as the classic spherical case (4-5 t_cc) with an appropriate rescaling of the cloud-crushing time. Radiatively cooling clouds survive considerably longer, and the differences in evolution between turbulent and spherical clouds cannot be reconciled with a simple rescaling. The rapid incorporation of low-density material into the hot wind implies efficient mass-loading of hot phases of galactic winds. 
At the same time, the extreme compression of high-density cloud material leads to long-lived but slow-moving clumps that are unlikely to escape the galaxy.
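The cloud-crushing time referred to above is conventionally t_cc = sqrt(chi) * R_cl / v_wind, with chi the cloud-to-wind density ratio; a quick evaluation with illustrative values (not the thesis's parameters):

```python
import math

# Standard cloud-crushing timescale: t_cc = sqrt(chi) * R_cl / v_wind.
# chi = rho_cloud / rho_wind; radius in parsec, wind speed in km/s (assumed).
def t_cc(chi, r_cloud_pc, v_wind_kms):
    pc_km = 3.0857e13                 # kilometres per parsec
    return math.sqrt(chi) * r_cloud_pc * pc_km / v_wind_kms  # seconds

t = t_cc(chi=100.0, r_cloud_pc=5.0, v_wind_kms=1000.0)  # ~1.5e12 s
```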

  13. How to differentiate collective variables in free energy codes: Computer-algebra code generation and automatic differentiation

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni

    2018-07-01

    The proper choice of collective variables (CVs) is central to biased-sampling free energy reconstruction methods in molecular dynamics simulations. The PLUMED 2 library, for instance, provides several sophisticated CV choices, implemented in a C++ framework; however, developing new CVs is still time consuming due to the need to provide code for the analytical derivatives of all functions with respect to atomic coordinates. We present two solutions to this problem, namely (a) symbolic differentiation and code generation, and (b) automatic code differentiation, in both cases leveraging open-source libraries (SymPy and Stan Math, respectively). The two approaches are demonstrated and discussed in detail implementing a realistic example CV, the local radius of curvature of a polymer. Users may use the code as a template to streamline the implementation of their own CVs using high-level constructs and automatic gradient computation.
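Route (b) above can be shown in miniature with forward-mode automatic differentiation via dual numbers: seeding one coordinate's derivative with 1 propagates the exact partial of the CV through the arithmetic. The paper uses Stan Math in C++; the CV here is a simple interatomic distance, not the curvature example:

```python
import math

# Minimal forward-mode AD: each Dual carries a value and a derivative, and
# every arithmetic operation propagates both by the chain rule.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        return Dual(self.val + o.val, self.dot + o.dot)
    def __sub__(self, o):
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)

def dsqrt(d):
    r = math.sqrt(d.val)
    return Dual(r, d.dot / (2.0 * r))

def distance(p1, p2):
    diff = [a - b for a, b in zip(p1, p2)]
    return dsqrt(sum((d * d for d in diff[1:]), diff[0] * diff[0]))

# Partial derivative of the distance w.r.t. x1: seed its dot with 1.
p1 = [Dual(0.0, 1.0), Dual(0.0), Dual(0.0)]
p2 = [Dual(3.0), Dual(0.0), Dual(4.0)]
cv = distance(p1, p2)   # cv.val is the CV, cv.dot its derivative w.r.t. x1
```

The analytical answer is (x1 - x2)/d = -0.6, obtained here without writing any derivative code by hand, which is exactly the burden both routes of the paper remove.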

  14. New generation of integrated geological-geomorphological reconstruction maps in the Rhine-Meuse delta, The Netherlands

    NASA Astrophysics Data System (ADS)

    Pierik, Harm Jan; Cohen, Kim; Stouthamer, Esther

    2016-04-01

    Geological-geomorphological reconstructions are important for integrating diverse types of data and for improving the understanding of landscape-forming processes. This works especially well in densely populated Holocene landscapes, where large quantities of raw data are produced by the geotechnical, archaeological, soil science and hydrological communities as well as by academic research. The Rhine-Meuse delta, The Netherlands, has a long tradition of integrated digital reconstruction maps and databases, which has contributed to an improved understanding of delta evolution, especially regarding the evolution of the channel belt network. In this contribution, we present a new generation of digital map products for the Holocene Rhine-Meuse delta. Our reconstructions expand existing channel belt network maps with new map layers containing natural levee extent and relative elevation. The maps we present are based on hundreds of thousands of lithological borehole descriptions and >1000 radiocarbon dates, and further integrate LIDAR data, soil maps and archaeological information. For selected time slices through the Late Holocene, the map products describe the patterns of levee distribution. Additionally, we mapped the palaeo-topography of the levees throughout the delta, aiming to resolve which parts of the overbank river landscape were relatively low- or high-lying in the past landscape. The resulting palaeogeographical maps are integrative products created for a very data-rich research area. They will allow for delta-wide analysis of changes in the Late Holocene landscape and of the interaction with past habitation.

  15. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by individual departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard consisting of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model, together with address collection and update models. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resource management. The results show that address coding based on postal codes is stable and easy to memorize, that two-dimensional coding based on direction and distance is easy to locate and memorize, and that the extended code enhances the extensibility and flexibility of the address geocoding.
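The three-part code structure described above can be sketched as follows; the field widths, separators and direction vocabulary are illustrative assumptions, not the paper's standard:

```python
# Compose and parse a three-part rural geocode:
#   absolute position (postal code) - relative position (direction + metres)
#   - extended code for sub-parcels.
def make_geocode(postal, direction, distance_m, ext="00"):
    assert direction in {"N", "S", "E", "W", "NE", "NW", "SE", "SW"}
    return f"{postal}-{direction:<2}{distance_m:04d}-{ext}"

def parse_geocode(code):
    postal, rel, ext = code.split("-")
    return postal, rel[:2].strip(), int(rel[2:]), ext
```

The fixed-width relative field keeps every code the same length, which is what makes such codes easy to sort, index and memorize.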

  16. Reconstruction of hadronic decay products of tau leptons with the ATLAS experiment.

    PubMed

    Aad, G; Abbott, B; Abdallah, J; Abdinov, O; Aben, R; Abolins, M; AbouZeid, O S; Abramowicz, H; Abreu, H; Abreu, R; Abulaiti, Y; Acharya, B S; Adamczyk, L; Adams, D L; Adelman, J; Adomeit, S; Adye, T; Affolder, A A; Agatonovic-Jovin, T; Agricola, J; Aguilar-Saavedra, J A; Ahlen, S P; Ahmadov, F; Aielli, G; Akerstedt, H; Åkesson, T P A; Akimov, A V; Alberghi, G L; Albert, J; Albrand, S; Alconada Verzini, M J; Aleksa, M; Aleksandrov, I N; Alexa, C; Alexander, G; Alexopoulos, T; Alhroob, M; Alimonti, G; Alio, L; Alison, J; Alkire, S P; Allbrooke, B M M; Allport, P P; Aloisio, A; Alonso, A; Alonso, F; Alpigiani, C; Altheimer, A; Alvarez Gonzalez, B; Álvarez Piqueras, D; Alviggi, M G; Amadio, B T; Amako, K; Amaral Coutinho, Y; Amelung, C; Amidei, D; Amor Dos Santos, S P; Amorim, A; Amoroso, S; Amram, N; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anders, J K; Anderson, K J; Andreazza, A; Andrei, V; Angelidakis, S; Angelozzi, I; Anger, P; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antonelli, M; Antonov, A; Antos, J; Anulli, F; Aoki, M; Aperio Bella, L; Arabidze, G; Arai, Y; Araque, J P; Arce, A T H; Arduh, F A; Arguin, J-F; Argyropoulos, S; Arik, M; Armbruster, A J; Arnaez, O; Arnold, H; Arratia, M; Arslan, O; Artamonov, A; Artoni, G; Artz, S; Asai, S; Asbah, N; Ashkenazi, A; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Atkinson, M; Atlay, N B; Augsten, K; Aurousseau, M; Avolio, G; Axen, B; Ayoub, M K; Azuelos, G; Baak, M A; Baas, A E; Baca, M J; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Bagiacchi, P; Bagnaia, P; Bai, Y; Bain, T; Baines, J T; Baker, O K; Baldin, E M; Balek, P; Balestri, T; Balli, F; Balunas, W K; Banas, E; Banerjee, Sw; Bannoura, A A E; Barak, L; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barisonzi, M; Barklow, T; Barlow, N; Barnes, S L; Barnett, B M; Barnett, R M; Barnovska, Z; Baroncelli, A; Barone, G; Barr, A J; Barreiro, F; Barreiro Guimarães da Costa, J; 
Bartoldus, R; Barton, A E; Bartos, P; Basalaev, A; Bassalat, A; Basye, A; Bates, R L; Batista, S J; Batley, J R; Battaglia, M; Bauce, M; Bauer, F; Bawa, H S; Beacham, J B; Beattie, M D; Beau, T; Beauchemin, P H; Beccherle, R; Bechtle, P; Beck, H P; Becker, K; Becker, M; Beckingham, M; Becot, C; Beddall, A J; Beddall, A; Bednyakov, V A; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, J K; Belanger-Champagne, C; Bell, W H; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belotskiy, K; Beltramello, O; Benary, O; Benchekroun, D; Bender, M; Bendtz, K; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez Garcia, J A; Benjamin, D P; Bensinger, J R; Bentvelsen, S; Beresford, L; Beretta, M; Berge, D; Bergeaas Kuutmann, E; Berger, N; Berghaus, F; Beringer, J; Bernard, C; Bernard, N R; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertoli, G; Bertolucci, F; Bertsche, C; Bertsche, D; Besana, M I; Besjes, G J; Bessidskaia Bylund, O; Bessner, M; Besson, N; Betancourt, C; Bethke, S; Bevan, A J; Bhimji, W; Bianchi, R M; Bianchini, L; Bianco, M; Biebel, O; Biedermann, D; Biesuz, N V; Biglietti, M; Bilbao De Mendizabal, J; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Biondi, S; Bjergaard, D M; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blanchard, J-B; Blanco, J E; Blazek, T; Bloch, I; Blocker, C; Blum, W; Blumenschein, U; Blunier, S; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Bock, C; Boehler, M; Bogaerts, J A; Bogavac, D; Bogdanchikov, A G; Bohm, C; Boisvert, V; Bold, T; Boldea, V; Boldyrev, A S; Bomben, M; Bona, M; Boonekamp, M; Borisov, A; Borissov, G; Borroni, S; Bortfeldt, J; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Boudreau, J; Bouffard, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Bousson, N; Boutle, S K; Boveia, A; Boyd, J; Boyko, I R; Bozic, I; Bracinik, J; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Breaden Madden, W D; Brendlinger, K; Brennan, A J; 
Brenner, L; Brenner, R; Bressler, S; Bristow, T M; Britton, D; Britzger, D; Brochu, F M; Brock, I; Brock, R; Bronner, J; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Bruni, A; Bruni, G; Bruschi, M; Bruscino, N; Bryngemark, L; Buanes, T; Buat, Q; Buchholz, P; Buckley, A G; Budagov, I A; Buehrer, F; Bugge, L; Bugge, M K; Bulekov, O; Bullock, D; Burckhart, H; Burdin, S; Burgard, C D; Burghgrave, B; Burke, S; Burmeister, I; Busato, E; Büscher, D; Büscher, V; Bussey, P; Butler, J M; Butt, A I; Buttar, C M; Butterworth, J M; Butti, P; Buttinger, W; Buzatu, A; Buzykaev, A R; Cabrera Urbán, S; Caforio, D; Cairo, V M; Cakir, O; Calace, N; Calafiura, P; Calandri, A; Calderini, G; Calfayan, P; Caloba, L P; Calvet, D; Calvet, S; Camacho Toro, R; Camarda, S; Camarri, P; Cameron, D; Caminal Armadans, R; Campana, S; Campanelli, M; Campoverde, A; Canale, V; Canepa, A; Cano Bret, M; Cantero, J; Cantrill, R; Cao, T; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capua, M; Caputo, R; Carbone, R M; Cardarelli, R; Cardillo, F; Carli, T; Carlino, G; Carminati, L; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Casolino, M; Casper, D W; Castaneda-Miranda, E; Castelli, A; Castillo Gimenez, V; Castro, N F; Catastini, P; Catinaccio, A; Catmore, J R; Cattai, A; Caudron, J; Cavaliere, V; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerda Alberich, L; Cerio, B C; Cerny, K; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cerv, M; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chalupkova, I; Chan, Y L; Chang, P; Chapman, J D; Charlton, D G; Chau, C C; Chavez Barajas, C A; Che, S; Cheatham, S; Chegwidden, A; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; Chen, L; Chen, S; Chen, S; Chen, X; Chen, Y; Cheng, H C; Cheng, Y; Cheplakov, A; Cheremushkina, E; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Chevalier, L; 
Chiarella, V; Chiarelli, G; Chiodini, G; Chisholm, A S; Chislett, R T; Chitan, A; Chizhov, M V; Choi, K; Chouridou, S; Chow, B K B; Christodoulou, V; Chromek-Burckhart, D; Chudoba, J; Chuinard, A J; Chwastowski, J J; Chytka, L; Ciapetti, G; Ciftci, A K; Cinca, D; Cindro, V; Cioara, I A; Ciocio, A; Cirotto, F; Citron, Z H; Ciubancan, M; Clark, A; Clark, B L; Clark, P J; Clarke, R N; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coffey, L; Colasurdo, L; Cole, B; Cole, S; Colijn, A P; Collot, J; Colombo, T; Compostella, G; Conde Muiño, P; Coniavitis, E; Connell, S H; Connelly, I A; Consorti, V; Constantinescu, S; Conta, C; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cornelissen, T; Corradi, M; Corriveau, F; Corso-Radu, A; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Côté, D; Cottin, G; Cowan, G; Cox, B E; Cranmer, K; Crawley, S J; Cree, G; Crépé-Renaudin, S; Crescioli, F; Cribbs, W A; Crispin Ortuzar, M; Cristinziani, M; Croft, V; Crosetti, G; Cuhadar Donszelmann, T; Cummings, J; Curatolo, M; Cúth, J; Cuthbert, C; Czirr, H; Czodrowski, P; D'Auria, S; D'Onofrio, M; Da Cunha Sargedas De Sousa, M J; Da Via, C; Dabrowski, W; Dafinca, A; Dai, T; Dale, O; Dallaire, F; Dallapiccola, C; Dam, M; Dandoy, J R; Dang, N P; Daniells, A C; Danninger, M; Dano Hoffmann, M; Dao, V; Darbo, G; Darmora, S; Dassoulas, J; Dattagupta, A; Davey, W; David, C; Davidek, T; Davies, E; Davies, M; Davison, P; Davygora, Y; Dawe, E; Dawson, I; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Benedetti, A; De Castro, S; De Cecco, S; De Groot, N; de Jong, P; De la Torre, H; De Lorenzi, F; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Deigaard, I; Del Peso, J; Del Prete, T; Delgove, D; Deliot, F; Delitzsch, C M; Deliyergiyev, M; Dell'Acqua, A; Dell'Asta, L; Dell'Orso, M; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; Deluca, C; DeMarco, D A; 
Demers, S; Demichev, M; Demilly, A; Denisov, S P; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deterre, C; Dette, K; Deviveiros, P O; Dewhurst, A; Dhaliwal, S; Di Ciaccio, A; Di Ciaccio, L; Di Domenico, A; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Mattia, A; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Di Valentino, D; Diaconu, C; Diamond, M; Dias, F A; Diaz, M A; Diehl, E B; Dietrich, J; Diglio, S; Dimitrievska, A; Dingfelder, J; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; Djuvsland, J I; do Vale, M A B; Dobos, D; Dobre, M; Doglioni, C; Dohmae, T; Dolejsi, J; Dolezal, Z; Dolgoshein, B A; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dova, M T; Doyle, A T; Drechsler, E; Dris, M; Du, Y; Dubreuil, E; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Duflot, L; Duguid, L; Dührssen, M; Dunford, M; Duran Yildiz, H; Düren, M; Durglishvili, A; Duschinger, D; Dutta, B; Dyndal, M; Eckardt, C; Ecker, K M; Edgar, R C; Edson, W; Edwards, N C; Ehrenfeld, W; Eifert, T; Eigen, G; Einsweiler, K; Ekelof, T; El Kacimi, M; Ellert, M; Elles, S; Ellinghaus, F; Elliot, A A; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Endo, M; Erdmann, J; Ereditato, A; Ernis, G; Ernst, J; Ernst, M; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Esposito, B; Etienvre, A I; Etzion, E; Evans, H; Ezhilov, A; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Falla, R J; Faltova, J; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Faucci Giannelli, M; Favareto, A; Fayard, L; Fedin, O L; Fedorko, W; Feigl, S; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Feremenga, L; Fernandez Martinez, P; Fernandez Perez, S; Ferrando, J; Ferrari, A; Ferrari, P; Ferrari, R; Ferreira de Lima, D E; Ferrer, A; Ferrere, D; Ferretti, C; Ferretto Parodi, A; Fiascaris, M; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; 
Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, A; Fischer, C; Fischer, J; Fisher, W C; Flaschel, N; Fleck, I; Fleischmann, P; Fletcher, G T; Fletcher, G; Fletcher, R R M; Flick, T; Floderus, A; Flores Castillo, L R; Flowerdew, M J; Forcolin, G T; Formica, A; Forti, A; Fournier, D; Fox, H; Fracchia, S; Francavilla, P; Franchini, M; Francis, D; Franconi, L; Franklin, M; Frate, M; Fraternali, M; Freeborn, D; French, S T; Fressard-Batraneanu, S M; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fulsom, B G; Fusayasu, T; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gach, G P; Gadatsch, S; Gadomski, S; Gagliardi, G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Gao, J; Gao, Y; Gao, Y S; Garay Walls, F M; García, C; García Navarro, J E; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Gatti, C; Gaudiello, A; Gaudio, G; Gaur, B; Gauthier, L; Gauzzi, P; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Ge, P; Gecse, Z; Gee, C N P; Geich-Gimbel, Ch; Geisler, M P; Gemme, C; Genest, M H; Geng, C; Gentile, S; George, S; Gerbaudo, D; Gershon, A; Ghasemi, S; Ghazlane, H; Giacobbe, B; Giagu, S; Giangiobbe, V; Giannetti, P; Gibbard, B; Gibson, S M; Gignac, M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gilles, G; Gingrich, D M; Giokaris, N; Giordani, M P; Giorgi, F M; Giorgi, F M; Giraud, P F; Giromini, P; Giugni, D; Giuliani, C; Giulini, M; Gjelsten, B K; Gkaitatzis, S; Gkialas, I; Gkougkousis, E L; Gladilin, L K; Glasman, C; Glatzer, J; Glaysher, P C F; Glazov, A; Goblirsch-Kolb, M; Goddard, J R; Godlewski, J; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, L; González de la Hoz, S; Gonzalez Parra, G; Gonzalez-Sevilla, S; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Goujdami, D; Goussiou, A G; 
Govender, N; Gozani, E; Graber, L; Grabowska-Bold, I; Gradin, P O J; Grafström, P; Gramling, J; Gramstad, E; Grancagnolo, S; Gratchev, V; Gray, H M; Graziani, E; Greenwood, Z D; Grefe, C; Gregersen, K; Gregor, I M; Grenier, P; Griffiths, J; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grivaz, J-F; Groh, S; Grohs, J P; Grohsjean, A; Gross, E; Grosse-Knetter, J; Grossi, G C; Grout, Z J; Guan, L; Guenther, J; Guescini, F; Guest, D; Gueta, O; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Guo, J; Guo, Y; Gupta, S; Gustavino, G; Gutierrez, P; Gutierrez Ortiz, N G; Gutschow, C; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Haddad, N; Haefner, P; Hageböck, S; Hajduk, Z; Hakobyan, H; Haleem, M; Haley, J; Hall, D; Halladjian, G; Hallewell, G D; Hamacher, K; Hamal, P; Hamano, K; Hamilton, A; Hamity, G N; Hamnett, P G; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Haney, B; Hanke, P; Hanna, R; Hansen, J B; Hansen, J D; Hansen, M C; Hansen, P H; Hara, K; Hard, A S; Harenberg, T; Hariri, F; Harkusha, S; Harrington, R D; Harrison, P F; Hartjes, F; Hasegawa, M; Hasegawa, Y; Hasib, A; Hassani, S; Haug, S; Hauser, R; Hauswald, L; Havranek, M; Hawkes, C M; Hawkings, R J; Hawkins, A D; Hayashi, T; Hayden, D; Hays, C P; Hays, J M; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heim, T; Heinemann, B; Heinrich, L; Hejbal, J; Helary, L; Hellman, S; Helsens, C; Henderson, J; Henderson, R C W; Heng, Y; Hengler, C; Henkelmann, S; Henriques Correia, A M; Henrot-Versille, S; Herbert, G H; Hernández Jiménez, Y; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hetherly, J W; Hickling, R; Higón-Rodriguez, E; Hill, E; Hill, J C; Hiller, K H; Hillier, S J; Hinchliffe, I; Hines, E; Hinman, R R; Hirose, M; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoenig, F; Hohlfeld, M; Hohn, D; Holmes, T R; Homann, M; Hong, T M; Hooberman, B H; Hopkins, W H; Horii, Y; Horton, A J; 
Hostachy, J-Y; Hou, S; Hoummada, A; Howard, J; Howarth, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hrynevich, A; Hsu, C; Hsu, P J; Hsu, S-C; Hu, D; Hu, Q; Hu, X; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Hülsing, T A; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Ideal, E; Idrissi, Z; Iengo, P; Igonkina, O; Iizawa, T; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Ince, T; Introzzi, G; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Irles Quiles, A; Isaksson, C; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Iturbe Ponce, J M; Iuppa, R; Ivarsson, J; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jabbar, S; Jackson, B; Jackson, M; Jackson, P; Jaekel, M R; Jain, V; Jakobi, K B; Jakobs, K; Jakobsen, S; Jakoubek, T; Jakubek, J; Jamin, D O; Jana, D K; Jansen, E; Jansky, R; Janssen, J; Janus, M; Jarlskog, G; Javadov, N; Javůrek, T; Jeanty, L; Jejelava, J; Jeng, G-Y; Jennens, D; Jenni, P; Jentzsch, J; Jeske, C; Jézéquel, S; Ji, H; Jia, J; Jiang, H; Jiang, Y; Jiggins, S; Jimenez Pena, J; Jin, S; Jinaru, A; Jinnouchi, O; Joergensen, M D; Johansson, P; Johns, K A; Johnson, W J; Jon-And, K; Jones, G; Jones, R W L; Jones, T J; Jongmanns, J; Jorge, P M; Joshi, K D; Jovicevic, J; Ju, X; Juste Rozas, A; Kaci, M; Kaczmarska, A; Kado, M; Kagan, H; Kagan, M; Kahn, S J; Kajomovitz, E; Kalderon, C W; Kaluza, A; Kama, S; Kamenshchikov, A; Kanaya, N; Kaneti, S; Kantserov, V A; Kanzaki, J; Kaplan, B; Kaplan, L S; Kapliy, A; Kar, D; Karakostas, K; Karamaoun, A; Karastathis, N; Kareem, M J; Karentzos, E; Karnevskiy, M; Karpov, S N; Karpova, Z M; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kasahara, K; Kashif, L; Kass, R D; Kastanas, A; Kataoka, Y; Kato, C; Katre, A; Katzy, J; Kawade, K; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazama, S; Kazanin, V F; Keeler, R; Kehoe, R; Keller, J S; Kempster, J J; Keoshkerian, H; Kepka, O; Kerševan, B P; 
Kersten, S; Keyes, R A; Khalil-Zada, F; Khandanyan, H; Khanov, A; Kharlamov, A G; Khoo, T J; Khovanskiy, V; Khramov, E; Khubua, J; Kido, S; Kim, H Y; Kim, S H; Kim, Y K; Kimura, N; Kind, O M; King, B T; King, M; King, S B; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kiss, F; Kiuchi, K; Kivernyk, O; Kladiva, E; Klein, M H; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klinger, J A; Klioutchnikova, T; Kluge, E-E; Kluit, P; Kluth, S; Knapik, J; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, A; Kobayashi, D; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koffas, T; Koffeman, E; Kogan, L A; Kohlmann, S; Kohout, Z; Kohriki, T; Koi, T; Kolanoski, H; Kolb, M; Koletsou, I; Komar, A A; Komori, Y; Kondo, T; Kondrashova, N; Köneke, K; König, A C; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Kortner, O; Kortner, S; Kosek, T; Kostyukhin, V V; Kotov, V M; Kotwal, A; Kourkoumeli-Charalampidi, A; Kourkoumelis, C; Kouskoura, V; Koutsman, A; Kowalewski, R; Kowalski, T Z; Kozanecki, W; Kozhin, A S; Kramarenko, V A; Kramberger, G; Krasnopevtsev, D; Krasny, M W; Krasznahorkay, A; Kraus, J K; Kravchenko, A; Kreiss, S; Kretz, M; Kretzschmar, J; Kreutzfeldt, K; Krieger, P; Krizka, K; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Krumnack, N; Kruse, A; Kruse, M C; Kruskal, M; Kubota, T; Kucuk, H; Kuday, S; Kuehn, S; Kugel, A; Kuger, F; Kuhl, A; Kuhl, T; Kukhtin, V; Kukla, R; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunigo, T; Kupco, A; Kurashige, H; Kurochkin, Y A; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwan, T; Kyriazopoulos, D; La Rosa, A; La Rosa Navarro, J L; La Rotonda, L; Lacasta, C; Lacava, F; Lacey, J; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Lambourne, L; Lammers, S; Lampen, C L; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lang, V 
S; Lange, J C; Lankford, A J; Lanni, F; Lantzsch, K; Lanza, A; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Lasagni Manghi, F; Lassnig, M; Laurelli, P; Lavrijsen, W; Law, A T; Laycock, P; Lazovich, T; Le Dortz, O; Le Guirriec, E; Le Menedeu, E; LeBlanc, M; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, S C; Lee, L; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Lehmann Miotto, G; Lei, X; Leight, W A; Leisos, A; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Leney, K J C; Lenz, T; Lenzi, B; Leone, R; Leone, S; Leonidopoulos, C; Leontsinis, S; Leroy, C; Lester, C G; Levchenko, M; Levêque, J; Levin, D; Levinson, L J; Levy, M; Lewis, A; Leyko, A M; Leyton, M; Li, B; Li, H; Li, H L; Li, L; Li, L; Li, S; Li, X; Li, Y; Liang, Z; Liao, H; Liberti, B; Liblong, A; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limbach, C; Limosani, A; Lin, S C; Lin, T H; Linde, F; Lindquist, B E; Linnemann, J T; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lissauer, D; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, H; Liu, J; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, M; Liu, Y; Livan, M; Lleres, A; Llorente Merino, J; Lloyd, S L; Lo Sterzo, F; Lobodzinska, E; Loch, P; Lockman, W S; Loebinger, F K; Loevschall-Jensen, A E; Loew, K M; Loginov, A; Lohse, T; Lohwasser, K; Lokajicek, M; Long, B A; Long, J D; Long, R E; Looper, K A; Lopes, L; Lopez Mateos, D; Lopez Paredes, B; Lopez Paz, I; Lorenz, J; Lorenzo Martinez, N; Losada, M; Lösel, P J; Lou, X; Lounis, A; Love, J; Love, P A; Lu, H; Lu, N; Lubatti, H J; Luci, C; Lucotte, A; Luedtke, C; Luehring, F; Lukas, W; Luminari, L; Lundberg, O; Lund-Jensen, B; Lynn, D; Lysak, R; Lytken, E; Ma, H; Ma, L L; Maccarrone, G; Macchiolo, A; Macdonald, C M; Maček, B; Machado Miguens, J; Macina, D; Madaffari, D; Madar, R; Maddocks, H J; Mader, W F; Madsen, A; Maeda, J; Maeland, S; Maeno, T; Maevskiy, A; Magradze, E; Mahboubi, K; Mahlstedt, J; Maiani, C; Maidantchik, C; Maier, A A; Maier, T; Maio, A; Majewski, S; Makida, Y; Makovec, N; 
Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyshev, V M; Malyukov, S; Mamuzic, J; Mancini, G; Mandelli, B; Mandelli, L; Mandić, I; Mandrysch, R; Maneira, J; Manhaes de Andrade Filho, L; Manjarres Ramos, J; Mann, A; Manousakis-Katsikakis, A; Mansoulie, B; Mantifel, R; Mantoani, M; Mapelli, L; March, L; Marchiori, G; Marcisovsky, M; Marino, C P; Marjanovic, M; Marley, D E; Marroquim, F; Marsden, S P; Marshall, Z; Marti, L F; Marti-Garcia, S; Martin, B; Martin, T A; Martin, V J; Martin Dit Latour, B; Martinez, M; Martin-Haugh, S; Martoiu, V S; Martyniuk, A C; Marx, M; Marzano, F; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massa, L; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mättig, P; Mattmann, J; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Mazza, S M; Mc Goldrick, G; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McCubbin, N A; McFarlane, K W; Mcfayden, J A; Mchedlidze, G; McMahon, S J; McPherson, R A; Medinnis, M; Meehan, S; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Mellado Garcia, B R; Meloni, F; Mengarelli, A; Menke, S; Meoni, E; Mercurio, K M; Mergelmeyer, S; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer Zu Theenhausen, H; Middleton, R P; Miglioranzi, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Milesi, M; Milic, A; Miller, D W; Mills, C; Milov, A; Milstead, D A; Minaenko, A A; Minami, Y; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mistry, K P; Mitani, T; Mitrevski, J; Mitsou, V A; Miucci, A; Miyagawa, P S; Mjörnmark, J U; Moa, T; Mochizuki, K; Mohapatra, S; Mohr, W; Molander, S; Moles-Valls, R; Monden, R; Mondragon, M C; Mönig, K; Monini, C; Monk, J; Monnier, E; Montalbano, A; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Morange, N; Moreno, D; Moreno Llácer, M; Morettini, P; Mori, D; Mori, T; 
Morii, M; Morinaga, M; Morisbak, V; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Mortensen, S S; Morton, A; Morvaj, L; Mosidze, M; Moss, J; Motohashi, K; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Muanza, S; Mudd, R D; Mueller, F; Mueller, J; Mueller, R S P; Mueller, T; Muenstermann, D; Mullen, P; Mullier, G A; Munoz Sanchez, F J; Murillo Quijada, J A; Murray, W J; Musheghyan, H; Musto, E; Myagkov, A G; Myska, M; Nachman, B P; Nackenhorst, O; Nadal, J; Nagai, K; Nagai, R; Nagai, Y; Nagano, K; Nagarkar, A; Nagasaka, Y; Nagata, K; Nagel, M; Nagy, E; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Namasivayam, H; Naranjo Garcia, R F; Narayan, R; Narrias Villar, D I; Naumann, T; Navarro, G; Nayyar, R; Neal, H A; Nechaeva, P Yu; Neep, T J; Nef, P D; Negri, A; Negrini, M; Nektarijevic, S; Nellist, C; Nelson, A; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neves, R M; Nevski, P; Newman, P R; Nguyen, D H; Nickerson, R B; Nicolaidou, R; Nicquevert, B; Nielsen, J; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolopoulos, K; Nilsen, J K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nodulman, L; Nomachi, M; Nomidis, I; Nooney, T; Norberg, S; Nordberg, M; Novgorodova, O; Nowak, S; Nozaki, M; Nozka, L; Ntekas, K; Nunnemann, T; Nurse, E; Nuti, F; O'grady, F; O'Neil, D C; O'Shea, V; Oakham, F G; Oberlack, H; Obermann, T; Ocariz, J; Ochi, A; Ochoa, I; Ochoa-Ricoux, J P; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohman, H; Oide, H; Okamura, W; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olivares Pino, S A; Oliveira Damazio, D; Olszewski, A; Olszowska, J; Onofre, A; Onogi, K; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Otero Y Garzon, G; Otono, H; Ouchrif, M; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Ovcharova, A; Owen, M; Owen, R E; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; 
Padilla Aranda, C; Pagáčová, M; Pagan Griso, S; Paganis, E; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palestini, S; Palka, M; Pallin, D; Palma, A; Pan, Y B; Panagiotopoulou, E St; Pandini, C E; Panduro Vazquez, J G; Pani, P; Panitkin, S; Pantea, D; Paolozzi, L; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, M A; Parker, K A; Parodi, F; Parsons, J A; Parzefall, U; Pasqualucci, E; Passaggio, S; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N D; Pater, J R; Pauly, T; Pearce, J; Pearson, B; Pedersen, L E; Pedersen, M; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Pelikan, D; Penc, O; Peng, C; Peng, H; Penning, B; Penwell, J; Perepelitsa, D V; Perez Codina, E; Pérez García-Estañ, M T; Perini, L; Pernegger, H; Perrella, S; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petroff, P; Petrolo, E; Petrucci, F; Pettersson, N E; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Pickering, M A; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pin, A W J; Pina, J; Pinamonti, M; Pinfold, J L; Pingel, A; Pires, S; Pirumov, H; Pitt, M; Pizio, C; Plazak, L; Pleier, M-A; Pleskot, V; Plotnikova, E; Plucinski, P; Pluth, D; Poettgen, R; Poggioli, L; Pohl, D; Polesello, G; Poley, A; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Pozo Astigarraga, M E; Pralavorio, P; Pranko, A; Prasad, S; Prell, S; Price, D; Price, L E; Primavera, M; Prince, S; Proissl, M; Prokofiev, K; Prokoshin, F; Protopapadaki, E; Protopopescu, S; Proudfoot, J; Przybycien, M; Ptacek, E; Puddu, D; Pueschel, E; Puldon, D; Purohit, M; Puzo, P; Qian, J; Qin, G; Qin, Y; Quadt, A; Quarrie, D R; Quayle, W B; Queitsch-Maitland, M; Quilty, D; Raddum, S; 
Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Rados, P; Ragusa, F; Rahal, G; Rajagopalan, S; Rammensee, M; Rangel-Smith, C; Rauscher, F; Rave, S; Ravenscroft, T; Raymond, M; Read, A L; Readioff, N P; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Rehnisch, L; Reichert, J; Reisin, H; Rembser, C; Ren, H; Renaud, A; Rescigno, M; Resconi, S; Rezanova, O L; Reznicek, P; Rezvani, R; Richter, R; Richter, S; Richter-Was, E; Ricken, O; Ridel, M; Rieck, P; Riegel, C J; Rieger, J; Rifki, O; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Ristić, B; Ritsch, E; Riu, I; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Roda, C; Roe, S; Røhne, O; Romaniouk, A; Romano, M; Romano Saez, S M; Romero Adam, E; Rompotis, N; Ronzani, M; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, P; Rosenthal, O; Rossetti, V; Rossi, E; Rossi, L P; Rosten, J H N; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rubinskiy, I; Rud, V I; Rudolph, C; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rurikova, Z; Rusakovich, N A; Ruschke, A; Russell, H L; Rutherfoord, J P; Ruthmann, N; Ryabov, Y F; Rybar, M; Rybkin, G; Ryder, N C; Ryzhov, A; Saavedra, A F; Sabato, G; Sacerdoti, S; Saddique, A; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Saha, P; Sahinsoy, M; Saimpert, M; Saito, T; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Salazar Loyola, J E; Saleem, M; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sammel, D; Sampsonidis, D; Sanchez, A; Sánchez, J; Sanchez Martinez, V; Sandaker, H; Sandbach, R L; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, C; Sandstroem, R; Sankey, D P C; Sannino, M; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; Sapp, K; Sapronov, A; Saraiva, J G; Sarrazin, B; Sasaki, O; Sasaki, Y; Sato, K; Sauvage, G; Sauvan, E; Savage, G; Savard, P; Sawyer, C; 
Sawyer, L; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Scarfone, V; Schaarschmidt, J; Schacht, P; Schaefer, D; Schaefer, R; Schaeffer, J; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Schiavi, C; Schillo, C; Schioppa, M; Schlenker, S; Schmieden, K; Schmitt, C; Schmitt, S; Schmitt, S; Schmitz, S; Schneider, B; Schnellbach, Y J; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schopf, E; Schorlemmer, A L S; Schott, M; Schouten, D; Schovancova, J; Schramm, S; Schreyer, M; Schuh, N; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwanenberger, C; Schwartzman, A; Schwarz, T A; Schwegler, Ph; Schweiger, H; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Scifo, E; Sciolla, G; Scuri, F; Scutti, F; Searcy, J; Sedov, G; Sedykh, E; Seema, P; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekhon, K; Sekula, S J; Seliverstov, D M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Serre, T; Sessa, M; Seuster, R; Severini, H; Sfiligoj, T; Sforza, F; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; Shang, R; Shank, J T; Shapiro, M; Shatalov, P B; Shaw, K; Shaw, S M; Shcherbakova, A; Shehu, C Y; Sherwood, P; Shi, L; Shimizu, S; Shimmin, C O; Shimojima, M; Shiyakova, M; Shmeleva, A; Shoaleh Saadi, D; Shochet, M J; Shojaii, S; Shrestha, S; Shulga, E; Shupe, M A; Sicho, P; Sidebo, P E; Sidiropoulou, O; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silva, J; Silver, Y; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simon, D; Simon, M; Sinervo, P; Sinev, N B; Sioli, M; Siragusa, G; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinner, M B; Skottowe, H P; Skubic, P; Slater, M; Slavicek, T; Slawinska, M; Sliwa, K; Smakhtin, V; Smart, B H; Smestad, L; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, M N K; Smith, R W; Smizanska, M; 
Smolek, K; Snesarev, A A; Snidero, G; Snyder, S; Sobie, R; Socher, F; Soffer, A; Soh, D A; Sokhrannyi, G; Solans, C A; Solar, M; Solc, J; Soldatov, E Yu; Soldevila, U; Solodkov, A A; Soloshenko, A; Solovyanov, O V; Solovyev, V; Sommer, P; Song, H Y; Soni, N; Sood, A; Sopczak, A; Sopko, B; Sopko, V; Sorin, V; Sosa, D; Sosebee, M; Sotiropoulou, C L; Soualah, R; Soukharev, A M; South, D; Sowden, B C; Spagnolo, S; Spalla, M; Spangenberg, M; Spanò, F; Spearman, W R; Sperlich, D; Spettel, F; Spighi, R; Spigo, G; Spiller, L A; Spousta, M; St Denis, R D; Stabile, A; Staerz, S; Stahlman, J; Stamen, R; Stamm, S; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staszewski, R; Steinberg, P; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoicea, G; Stolte, P; Stonjek, S; Stradling, A R; Straessner, A; Stramaglia, M E; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Strubig, A; Stucci, S A; Stugu, B; Styles, N A; Su, D; Su, J; Subramaniam, R; Succurro, A; Suchek, S; Sugaya, Y; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, S; Sun, X; Sundermann, J E; Suruliz, K; Susinno, G; Sutton, M R; Suzuki, S; Svatos, M; Swiatlowski, M; Sykora, I; Sykora, T; Ta, D; Taccini, C; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tam, J Y C; Tan, K G; Tanaka, J; Tanaka, R; Tanaka, S; Tannenwald, B B; Tapia Araya, S; Tapprogge, S; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, A C; Taylor, F E; Taylor, G N; Taylor, P T E; Taylor, W; Teischinger, F A; Teixeira-Dias, P; Temming, K K; Temple, D; Ten Kate, H; Teng, P K; Teoh, J J; Tepel, F; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; 
Teuscher, R J; Theveneaux-Pelzer, T; Thomas, J P; Thomas-Wilsker, J; Thompson, E N; Thompson, P D; Thompson, R J; Thompson, A S; Thomsen, L A; Thomson, E; Thomson, M; Thun, R P; Tibbetts, M J; Ticse Torres, R E; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tiouchichine, E; Tipton, P; Tisserant, S; Todome, K; Todorov, T; Todorova-Nova, S; Tojo, J; Tokár, S; Tokushuku, K; Tollefson, K; Tolley, E; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Trischuk, W; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trovatelli, M; Truong, L; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsui, K M; Tsukerman, I I; Tsulaia, V; Tsuno, S; Tsybychev, D; Tudorache, A; Tudorache, V; Tuna, A N; Tupputi, S A; Turchikhin, S; Turecek, D; Turra, R; Turvey, A J; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Ueda, I; Ueno, R; Ughetto, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Unverdorben, C; Urban, J; Urquijo, P; Urrejola, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valderanis, C; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Vallecorsa, S; Valls Ferrer, J A; Van Den Wollenberg, W; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vanguri, R; Vaniachine, A; Vannucci, F; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veloce, L M; Veloso, F; Velz, T; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H 
A; Viel, S; Vigne, R; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Vivarelli, I; Vlachos, S; Vladoiu, D; Vlasak, M; Vogel, M; Vokac, P; Volpi, G; Volpi, M; von der Schmitt, H; von Radziewski, H; von Toerne, E; Vorobel, V; Vorobev, K; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, P; Wagner, W; Wahlberg, H; Wahrmund, S; Wakabayashi, J; Walder, J; Walker, R; Walkowiak, W; Wang, C; Wang, F; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, T; Wang, X; Wanotayaroj, C; Warburton, A; Ward, C P; Wardrope, D R; Washbrook, A; Wasicki, C; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weinert, B; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Wessels, M; Wetter, J; Whalen, K; Wharton, A M; White, A; White, M J; White, R; White, S; Whiteson, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wildauer, A; Wilkens, H G; Williams, H H; Williams, S; Willis, C; Willocq, S; Wilson, A; Wilson, J A; Wingerter-Seez, I; Winklmeier, F; Winter, B T; Wittgen, M; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wu, M; Wu, M; Wu, S L; Wu, X; Wu, Y; Wyatt, T R; Wynne, B M; Xella, S; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yakabe, R; Yamada, M; Yamaguchi, D; Yamaguchi, Y; Yamamoto, A; Yamamoto, S; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, Y; Yao, W-M; Yap, Y C; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yeletskikh, I; Yen, A L; Yildirim, E; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J M; Yu, J; Yuan, L; Yuen, S P Y; Yurkewicz, A; Yusuff, I; Zabinski, B; Zaidan, R; Zaitsev, A M; Zalieckas, J; 
Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zeitnitz, C; Zeman, M; Zemla, A; Zeng, J C; Zeng, Q; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zhang, D; Zhang, F; Zhang, G; Zhang, H; Zhang, J; Zhang, L; Zhang, R; Zhang, X; Zhang, Z; Zhao, X; Zhao, Y; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, C; Zhou, L; Zhou, L; Zhou, M; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhukov, K; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, S; Zinonos, Z; Zinser, M; Ziolkowski, M; Živković, L; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zurzolo, G; Zwalinski, L

    2016-01-01

    This paper presents a new method of reconstructing the individual charged and neutral hadrons in tau decays with the ATLAS detector. The reconstructed hadrons are used to classify the decay mode and to calculate the visible four-momentum of reconstructed tau candidates, significantly improving the resolution with respect to the calibration in the existing tau reconstruction. The performance of the reconstruction algorithm is optimised and evaluated using simulation and validated using samples of [Formula: see text] and [Formula: see text]+jets events selected from proton-proton collisions at a centre-of-mass energy [Formula: see text], corresponding to an integrated luminosity of 5 [Formula: see text].
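The bookkeeping behind this kind of decay-mode classification can be sketched briefly (this is illustrative, not the ATLAS algorithm): label the mode by the counts of reconstructed charged and neutral hadrons, and obtain the visible four-momentum by summing their four-vectors. All function names below are hypothetical.

```python
import math

def classify_decay_mode(n_charged, n_neutral):
    """Label a hadronic tau decay by its hadron counts, e.g. '1p1n' for
    one charged and one neutral hadron."""
    return f"{n_charged}p{n_neutral}n"

def visible_four_momentum(hadrons):
    """Sum the (E, px, py, pz) four-vectors of the reconstructed hadrons."""
    return tuple(sum(h[i] for h in hadrons) for i in range(4))

def invariant_mass(p):
    """Invariant mass of a four-vector (E, px, py, pz)."""
    e, px, py, pz = p
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))
```

For example, a one-prong decay with a single neutral pion is labelled "1p1n", and the invariant mass of the summed four-vector gives the visible mass of the tau candidate.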

  17. Equilibrium reconstruction with 3D eddy currents in the Lithium Tokamak eXperiment

    DOE PAGES

    Hansen, C.; Boyle, D. P.; Schmitt, J. C.; ...

    2017-04-18

Axisymmetric free-boundary equilibrium reconstructions of tokamak plasmas in the Lithium Tokamak eXperiment (LTX) are performed using the PSI-Tri equilibrium code. Reconstructions in LTX are complicated by the presence of long-lived non-axisymmetric eddy currents in the vacuum vessel and first-wall structures. To account for this effect, reconstructions are performed with additional toroidal current sources in these conducting regions. The eddy current sources are fixed in their poloidal distributions, but their magnitude is adjusted as part of the full reconstruction. Eddy distributions are computed by toroidally averaging the currents, generated by coupling to the vacuum field coils, from a simplified 3D filament model of the important conducting structures. The full 3D eddy current fields are also used to enable the inclusion of local magnetic field measurements, which have strong 3D eddy current pick-up, as reconstruction constraints. Using this method, equilibrium reconstruction yields good agreement with all available diagnostic signals. An accompanying field perturbation produced by 3D eddy currents on the plasma surface, with a primarily n = 2, m = 1 character, is also predicted for these equilibria.
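In a linearised picture, the fitting strategy described above (fixed-shape eddy current distributions whose magnitudes are adjusted alongside the plasma parameters) reduces to a least-squares problem over combined amplitudes. A minimal sketch with synthetic response matrices, not the PSI-Tri implementation:

```python
import numpy as np

# Columns: predicted diagnostic response per unit amplitude of each source
# (plasma current basis functions plus fixed-shape eddy distributions).
rng = np.random.default_rng(0)
n_signals, n_plasma, n_eddy = 12, 3, 2
G_plasma = rng.normal(size=(n_signals, n_plasma))
G_eddy = rng.normal(size=(n_signals, n_eddy))   # toroidally averaged eddy fields
G = np.hstack([G_plasma, G_eddy])

true_amps = np.array([1.0, -0.5, 0.2, 0.8, -0.3])
signals = G @ true_amps                          # synthetic measurements

# Reconstruct: eddy magnitudes are free parameters of the same fit.
amps, *_ = np.linalg.lstsq(G, signals, rcond=None)
```

With more diagnostic signals than free amplitudes, the noiseless fit recovers both the plasma and the eddy magnitudes.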

  18. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-David-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.
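Of the features listed, Parker short-scan weighting is compact enough to sketch. The convention below (source angle β spanning [0, π + 2γ_max], with the redundant ray of (β, γ) at (β + π − 2γ, −γ)) is one common choice in the literature, not necessarily RTK's internal one:

```python
import math

def parker_weight(beta, gamma, gamma_max):
    """Parker short-scan weight for a fan-beam ray.

    beta: source angle in [0, pi + 2*gamma_max]
    gamma: fan angle of the ray in [-gamma_max, gamma_max]
    The two redundant measurements of a ray receive weights summing to 1,
    with smooth roll-off at both ends of the short scan.
    """
    if beta < 2 * (gamma_max + gamma):
        # Start of the scan: ray is measured again near the end.
        return math.sin(math.pi / 4 * beta / (gamma_max + gamma)) ** 2
    if beta <= math.pi + 2 * gamma:
        # Uniquely measured region.
        return 1.0
    # End of the scan: conjugate of a ray from the start.
    return math.sin(math.pi / 4 * (math.pi + 2 * gamma_max - beta)
                    / (gamma_max - gamma)) ** 2
```

The complementarity property, w(β, γ) + w(β + π − 2γ, −γ) = 1, is what makes the weighted short-scan backprojection consistent with a full scan.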

  19. Reconstructing the vibro-acoustic quantities on a highly non-spherical surface using the Helmholtz equation least squares method.

    PubMed

    Natarajan, Logesh Kumar; Wu, Sean F

    2012-06-01

This paper presents helpful guidelines and strategies for reconstructing the vibro-acoustic quantities on a highly non-spherical surface by using the Helmholtz equation least squares (HELS) method. This study highlights that a computationally simple code based on the spherical wave functions can produce an accurate reconstruction of the acoustic pressure and normal surface velocity on planar surfaces. The keys are to select the optimal origin of the coordinate system behind the planar surface, choose a target structural wavelength to be reconstructed, set an appropriate stand-off distance and microphone spacing, and use a hybrid regularization scheme to determine the optimal number of expansion functions. Validation experiments were conducted on a baffled, square plate inside a fully anechoic chamber. The reconstructed vibro-acoustic quantities are validated rigorously by comparing the reconstructed normal surface velocity spectra and distributions with benchmark data obtained by scanning a laser vibrometer over the plate surface. Results confirm that following the proposed guidelines and strategies ensures accuracy in reconstructing the normal surface velocity up to the target structural wavelength, and produces much more satisfactory results than a straight application of the original HELS formulations.
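The core of a HELS-type reconstruction is a least-squares expansion whose truncation order acts as the regularization knob. A stripped-down sketch, with a generic matrix standing in for the spherical wave functions evaluated at the measurement points:

```python
import numpy as np

def hels_solve(basis, measured_pressure, n_terms):
    """Least-squares expansion coefficients, truncated at n_terms functions.
    In HELS the optimal truncation (chosen by a regularization scheme)
    controls the trade-off between resolution and noise amplification."""
    A = basis[:, :n_terms]
    coeffs, *_ = np.linalg.lstsq(A, measured_pressure, rcond=None)
    return coeffs

rng = np.random.default_rng(1)
basis = rng.normal(size=(40, 10))   # measurement points x expansion functions
true_c = np.zeros(10)
true_c[:4] = [1.0, -0.7, 0.3, 0.1]
p = basis @ true_c                  # synthetic "measured" pressures
c = hels_solve(basis, p, 4)
```

Once the coefficients are known, the same expansion is evaluated on the reconstruction surface to obtain pressure and normal velocity there.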

  20. Monte Carlo-based fluorescence molecular tomography reconstruction method accelerated by a cluster of graphic processing units.

    PubMed

    Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming

    2011-02-01

High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and different Green's functions representing the flux distribution in the media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes using the calculated Green's functions. Phantom experiments have shown that only 10 min are required to obtain reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the high accuracy of the MC simulation and its suitability for 3-D heterogeneous media with refractive-index-mismatched boundaries, the GPU cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.
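The Jacobian-from-Green's-functions step can be illustrated with a toy linear model. Assuming (as in reciprocity-based formulations) that the sensitivity of a source-detector measurement to fluorochrome at a voxel is the product of the source-to-voxel and voxel-to-detector Green's functions, a sketch with synthetic matrices is:

```python
import numpy as np

rng = np.random.default_rng(2)
n_src, n_det, n_vox = 6, 6, 30
G_src = rng.normal(size=(n_src, n_vox))  # MC Green's fns, source -> voxel
G_det = rng.normal(size=(n_det, n_vox))  # MC Green's fns, voxel -> detector

# One Jacobian row per source-detector pair: elementwise product of the
# two Green's functions at every voxel.
J = (G_src[:, None, :] * G_det[None, :, :]).reshape(n_src * n_det, n_vox)

x_true = np.zeros(n_vox)
x_true[10] = 1.0                         # a single fluorescent inclusion
y = J @ x_true                           # synthetic measurements

# Tikhonov-regularised solve for the fluorochrome distribution.
lam = 1e-8
x = np.linalg.solve(J.T @ J + lam * np.eye(n_vox), J.T @ y)
```

In the real method each GPU in the cluster computes a share of the Green's functions by MC simulation; only the final linear solve is shown here.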

  1. A new paradigm for reproducing and analyzing N-body simulations of planetary systems

    NASA Astrophysics Data System (ADS)

    Rein, Hanno; Tamayo, Daniel

    2017-05-01

The reproducibility of experiments is one of the main principles of the scientific method. However, numerical N-body experiments, especially those of planetary systems, are currently not reproducible. In the most optimistic scenario, they can only be replicated in an approximate or statistical sense. Even if authors share their full source code and initial conditions, differences in compilers, libraries, operating systems or hardware often lead to qualitatively different results. We provide a new set of easy-to-use, open-source tools that address the above issues, allowing for exact (bit-by-bit) reproducibility of N-body experiments. In addition to generating completely reproducible integrations, we show that our framework also offers novel ways to analyse these simulations. As an example, we present a high-accuracy integration of the Solar system spanning 10 Gyr, requiring several weeks to run on a modern CPU. In our framework, we can easily access simulation data not only at the predefined intervals for which we save snapshots, but also at any time during the integration. We achieve this by integrating an on-demand reconstructed simulation forward in time from the nearest snapshot. This allows us to extract arbitrary quantities at any point in the saved simulation exactly (bit-by-bit), and within seconds rather than weeks. We believe that the tools we present in this paper offer a new paradigm for how N-body simulations are run, analysed and shared across the community.
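The snapshot-and-replay idea rests on floating-point determinism: re-running the same sequence of operations from a saved state reproduces any later state bit-by-bit. A minimal sketch with a harmonic oscillator standing in for the N-body system (not the authors' implementation):

```python
def leapfrog_step(x, v, dt):
    # Kick-drift-kick leapfrog; a = -x stands in for the N-body force.
    v = v + 0.5 * dt * (-x)
    x = x + dt * v
    v = v + 0.5 * dt * (-x)
    return x, v

def integrate(x, v, dt, n_steps, snapshot_every, snapshots):
    """Integrate forward, saving the exact state at regular intervals."""
    for step in range(1, n_steps + 1):
        x, v = leapfrog_step(x, v, dt)
        if step % snapshot_every == 0:
            snapshots[step] = (x, v)
    return x, v

snapshots = {0: (1.0, 0.0)}
x_end, v_end = integrate(1.0, 0.0, 0.01, 1000, 100, snapshots)

def state_at(step, snapshots, dt):
    """Reconstruct the state at an arbitrary step by replaying from the
    nearest earlier snapshot; deterministic floating point makes the
    result bit-identical to the original run."""
    base = max(s for s in snapshots if s <= step)
    x, v = snapshots[base]
    for _ in range(step - base):
        x, v = leapfrog_step(x, v, dt)
    return x, v
```

Reconstructing step 950 replays only 50 steps from the snapshot at step 900, yet agrees exactly (==, not just approximately) with a fresh 950-step integration.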

  2. Integrating the nursing management minimum data set into the logical observation identifier names and codes system.

    PubMed

    Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stan; Huff, Stanley M; Huber, Diane

    2008-11-06

    This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.
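The panel structure used here can be pictured as a parent code grouping child observation codes. A sketch of that data shape only; the codes below are invented placeholders, not real LOINC or NMMDS identifiers:

```python
# Illustrative only: hypothetical codes, not real LOINC identifiers.
panel = {
    "code": "NMMDS-ENV-PANEL",
    "name": "NMMDS environment element panel",
    "children": [
        {"code": "NMMDS-001", "name": "Unit type"},
        {"code": "NMMDS-002", "name": "Patient population"},
    ],
}

def codes_in_panel(p):
    """Flatten a panel into the list of observation codes it contains."""
    return [c["code"] for c in p["children"]]
```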

  3. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    NASA Astrophysics Data System (ADS)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogeneous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogeneous thick objects. One example application is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.
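    The MBIR principle can be illustrated with a generic linear sketch (our simplification; the paper's acoustic forward model and prior are far more elaborate): reconstruction is posed as minimizing a cost that couples a physics-based forward model with a regularizing prior.

```python
import numpy as np

def mbir(y, A, n_iters=200, step=None, beta=0.1):
    """Estimate x by gradient descent on ||y - A x||^2 + beta * ||x||^2,
    where A is the (here generic, linear) forward model."""
    m, n = A.shape
    x = np.zeros(n)
    if step is None:
        # Safe step size from the Lipschitz constant of the gradient.
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + beta)
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y) + beta * x   # data mismatch + prior
        x = x - step * grad
    return x
```

    With a small `beta`, the iterate converges to (a slightly regularized version of) the least-squares solution; in a real MBIR system the quadratic prior is replaced by an image model and A by the acoustic physics.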

  4. Reconstruction method for fringe projection profilometry based on light beams.

    PubMed

    Li, Xuexing; Zhang, Zhijiang; Yang, Chen

    2016-12-01

    A novel reconstruction method for fringe projection profilometry, based on light beams, is proposed and verified by experiments. Commonly used calibration techniques require either projector calibration parameters or reference planes placed at many known positions. Introducing projector calibration can reduce the accuracy of the reconstruction result, and placing the reference planes at many known positions is time-consuming. Therefore, in this paper, a reconstruction method that does not require projector parameters is proposed, and only two reference planes are introduced. A series of light beams determined by the subpixel point-to-point map on the two reference planes, combined with their reflected light beams determined by the camera model, is used to calculate the 3D coordinates of reconstruction points. Furthermore, the bundle adjustment strategy and the complementary gray-code phase-shifting method are utilized to ensure accuracy and stability. Qualitative and quantitative comparisons as well as experimental tests demonstrate the performance of the proposed approach; the measurement accuracy reaches about 0.0454 mm.
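    The geometric core of such a method can be sketched as follows (a minimal sketch under our own assumptions, with hypothetical function names; the paper additionally uses bundle adjustment and gray-code phase shifting): a light beam is fixed by its intersection points with the two reference planes, and the reconstructed 3D point is the closest point between that beam and the camera's back-projected ray.

```python
import numpy as np

def closest_point_between_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between lines p1+t*d1 and p2+s*d2."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    # Solve p1 + t*d1 + u*n = p2 + s*d2 for (t, s, u).
    t, s, u = np.linalg.solve(np.column_stack([d1, -d2, n]), p2 - p1)
    return (p1 + t * d1 + p2 + s * d2) / 2.0

def beam_from_planes(q_near, q_far):
    """A beam defined by its two reference-plane intersection points."""
    return q_near, q_far - q_near
```

    For a camera ray through the optical centre and the observed pixel direction, the midpoint formula degrades gracefully when noise keeps the two lines from intersecting exactly.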

  5. Non-axisymmetric equilibrium reconstruction on the Compact Toroidal Hybrid Experiment using external magnetic and soft x-ray inversion radius measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Cianciosa, M.; Hanson, J. D.; Hartwell, G. J.; Knowlton, S. F.; Maurer, D. A.; Ennis, D. A.; Herfindal, J. L.

    2015-11-01

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by the driven plasma current. Studies were performed on the Compact Toroidal Hybrid device using the V3FIT reconstruction code incorporating a set of 50 magnetic diagnostics external to the plasma, combined with information from soft X-ray (SXR) arrays. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the outer boundary of these highly non-axisymmetric plasmas. The inversion radius for sawtoothing plasmas is used to identify the location of the q = 1 surface, and thus infer the current profile near the magnetic axis. With external magnetic diagnostics alone, we find the reconstruction to be insufficiently constrained. This work is supported by US Department of Energy Grant No. DE-FG02-00ER54610.

  6. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures

    PubMed Central

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability. PMID:21922017
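    A minimal CPU sketch of the CS idea (not the paper's GPU-accelerated reconstruction): iterative soft-thresholding (ISTA) recovers a sparse signal x from undersampled linear measurements y = A x.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(y, A, lam=0.01, n_iters=500):
    """Minimize 0.5*||y - A x||^2 + lam*||x||_1 by iterative shrinkage."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x
```

    In MRI the operator A would be an undersampled Fourier transform composed with coil sensitivities; here a random Gaussian A stands in to show the recovery behaviour.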

  7. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX)

    PubMed Central

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-01-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimizing performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate on a block as much as possible before proceeding to the next one. This data article is related to the research article titled Tomo3D 2.0 – Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data from a thorough study of the performance of tomographic reconstruction with varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning. PMID:26217710
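    The loop restructuring described above can be sketched in NumPy (purely structural here; the cited work applies it to vectorized C++/AVX kernels, and real backprojection interpolates along rays rather than simply accumulating arrays):

```python
import numpy as np

def backproject_blocked(slice_shape, contribs, block=64):
    """Accumulate all per-projection contributions into one cache-sized
    block of the output slice before moving on to the next block."""
    h, w = slice_shape
    out = np.zeros((h, w))
    for r0 in range(0, h, block):
        for c0 in range(0, w, block):
            blk = out[r0:r0 + block, c0:c0 + block]   # stays hot in cache
            for proj in contribs:                     # reuse the block here
                blk += proj[r0:r0 + block, c0:c0 + block]
    return out
```

    The unblocked version would sweep the whole slice once per projection, evicting it from cache each time; blocking visits each output region once and replays all projections against it.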

  8. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX).

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-06-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimizing performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate on a block as much as possible before proceeding to the next one. This data article is related to the research article titled Tomo3D 2.0 - Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data from a thorough study of the performance of tomographic reconstruction with varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning.

  9. Dictionary-based image reconstruction for superresolution in integrated circuit imaging.

    PubMed

    Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim

    2015-06-01

    Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.

  10. Pulse Vector-Excitation Speech Encoder

    NASA Technical Reports Server (NTRS)

    Davidson, Grant; Gersho, Allen

    1989-01-01

    Proposed pulse vector-excitation speech encoder (PVXC) encodes analog speech signals into digital representation for transmission or storage at rates below 5 kilobits per second. Produces high-quality reconstructed speech with less computation than comparable speech-encoding systems require. Has some characteristics of multipulse linear predictive coding (MPLPC) and of code-excited linear prediction (CELP). System uses mathematical model of vocal tract in conjunction with set of excitation vectors and perceptually based error criterion to synthesize natural-sounding speech.

  11. Integrating Bar-Code Medication Administration Competencies in the Curriculum: Implications for Nursing Education and Interprofessional Collaboration.

    PubMed

    Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L

    This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.

  12. Reprint of "Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency".

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-08-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on the manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary drawn from the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, the initial result is improved by calculating reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms state-of-the-art methods in terms of precision, recall and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
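    The first-stage idea, saliency as reconstruction error against a background dictionary, can be sketched with plain least squares (our bare-bones stand-in for the paper's Log-Euclidean kernel sparse coding):

```python
import numpy as np

def reconstruction_error_saliency(X, B):
    """X: (d, n) region descriptors; B: (d, m) background dictionary.
    Regions that the background dictionary reconstructs poorly are salient."""
    A, *_ = np.linalg.lstsq(B, X, rcond=None)      # best linear codes
    err = np.linalg.norm(X - B @ A, axis=0)        # per-region residual
    return err / (err.max() + 1e-12)               # normalize to [0, 1]
```

    Background regions lie (near) the span of the dictionary and score close to zero, while a descriptor outside that span keeps a large residual and dominates the normalized map.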

  13. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data.

    PubMed

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-07-21

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.
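    The key benefit of orthogonality can be shown in a toy version (ours, not the paper's full MRI pipeline with k-space updates): with an orthogonal dictionary D, sparse coding reduces to thresholding D.T @ X, and the dictionary update has a closed-form orthogonal Procrustes solution via one SVD.

```python
import numpy as np

def hard_threshold(Z, k):
    """Keep the k largest-magnitude coefficients in each column of Z."""
    out = np.zeros_like(Z)
    idx = np.argsort(-np.abs(Z), axis=0)[:k]
    np.put_along_axis(out, idx, np.take_along_axis(Z, idx, axis=0), axis=0)
    return out

def orthogonal_dictionary_learning(X, k=4, n_iters=20):
    """Alternate exact sparse coding and a Procrustes dictionary update."""
    n = X.shape[0]
    D = np.eye(n)                        # start from the identity dictionary
    for _ in range(n_iters):
        A = hard_threshold(D.T @ X, k)   # coding is exact for orthogonal D
        U, _, Vt = np.linalg.svd(X @ A.T)
        D = U @ Vt                       # argmin ||X - D A||_F, D orthogonal
    return D, A
```

    Both subproblems are solved exactly, so the fit error is monotonically nonincreasing; this is the source of the speedup over K-SVD-style updates, where the synthesis sparse coding step is NP-hard.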

  14. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines

    PubMed Central

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology’s Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in the social sciences (82%), mental health (71%) and sciences (61%), other organizations had either no such statements (construction trades, fraternal social organizations, real estate) or only a few (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research had statements on research integrity/ethics terminology significantly more often than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4–23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1–4.6%) (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet the contemporary needs of their communities. PMID:26192805

  15. Image reconstruction and system modeling techniques for virtual-pinhole PET insert systems

    PubMed Central

    Keesing, Daniel B; Mathews, Aswin; Komarov, Sergey; Wu, Heyu; Song, Tae Yong; O'Sullivan, Joseph A; Tai, Yuan-Chuan

    2012-01-01

    Virtual-pinhole PET (VP-PET) imaging is a new technology in which one or more high-resolution detector modules are integrated into a conventional PET scanner with lower-resolution detectors. It can locally enhance the spatial resolution and contrast recovery near the add-on detectors, and depending on the configuration, may also increase the sensitivity of the system. This novel scanner geometry makes the reconstruction problem more challenging compared to the reconstruction of data from a standalone PET scanner, as new techniques are needed to model and account for the non-standard acquisition. In this paper, we present a general framework for fully 3D modeling of an arbitrary VP-PET insert system. The model components are incorporated into a statistical reconstruction algorithm to estimate an image from the multi-resolution data. For validation, we apply the proposed model and reconstruction approach to one of our custom-built VP-PET systems – a half-ring insert device integrated into a clinical PET/CT scanner. Details regarding the most important implementation issues are provided. We show that the proposed data model is consistent with the measured data, and that our approach can lead to reconstructions with improved spatial resolution and lesion detectability. PMID:22490983

  16. Magnetic Doppler imaging considering atmospheric structure modifications due to local abundances: a luxury or a necessity?

    NASA Astrophysics Data System (ADS)

    Kochukhov, O.; Wade, G. A.; Shulyak, D.

    2012-04-01

    Magnetic Doppler imaging is currently the most powerful method of interpreting high-resolution spectropolarimetric observations of stars. This technique has provided the very first maps of stellar magnetic field topologies reconstructed from time series of full Stokes vector spectra, revealing the presence of small-scale magnetic fields on the surfaces of Ap stars. These studies were recently criticised by Stift et al., who claimed that magnetic inversions are not robust and are seriously undermined by neglecting a feedback on the Stokes line profiles from the local atmospheric structure in the regions of enhanced metal abundance. We show that Stift et al. misinterpreted published magnetic Doppler imaging results and consistently neglected some of the most fundamental principles behind magnetic mapping. Using state-of-the-art opacity sampling model atmosphere and polarized radiative transfer codes, we demonstrate that the variation of atmospheric structure across the surface of a star with chemical spots affects the local continuum intensity but is negligible for the normalized local Stokes profiles except for the rare situation of a very strong line in an extremely Fe-rich atmosphere. For the disc-integrated spectra of an Ap star with extreme abundance variations, we find that the assumption of a mean model atmosphere leads to moderate errors in Stokes I but is negligible for the circular and linear polarization spectra. Employing a new magnetic inversion code, which incorporates the horizontal variation of atmospheric structure induced by chemical spots, we reconstructed new maps of magnetic field and Fe abundance for the bright Ap star α2 CVn. The resulting distribution of chemical spots changes insignificantly compared to the previous modelling based on a single model atmosphere, while the magnetic field geometry does not change at all. This shows that the assertions by Stift et al. 
are exaggerated as a consequence of unreasonable assumptions and extrapolations, as well as methodological flaws and inconsistencies of their analysis. Our discussion proves that published magnetic inversions based on a mean stellar atmosphere are highly robust and reliable, and that the presence of small-scale magnetic field structures on the surfaces of Ap stars is indeed real. Incorporating horizontal variations of atmospheric structure in Doppler imaging can marginally improve reconstruction of abundance distributions for stars showing very large iron overabundances. But this costly technique is unnecessary for magnetic mapping with high-resolution polarization spectra.

  17. Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras

    NASA Astrophysics Data System (ADS)

    Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro

    2018-03-01

    Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.

  18. Piecemeal Buildup of the Genetic Code, Ribosomes, and Genomes from Primordial tRNA Building Blocks

    PubMed Central

    Caetano-Anollés, Derek; Caetano-Anollés, Gustavo

    2016-01-01

    The origin of biomolecular machinery likely centered around an ancient and central molecule capable of interacting with emergent macromolecular complexity. tRNA is the oldest and most central nucleic acid molecule of the cell. Its co-evolutionary interactions with aminoacyl-tRNA synthetase protein enzymes define the specificities of the genetic code and those with the ribosome their accurate biosynthetic interpretation. Phylogenetic approaches that focus on molecular structure allow reconstruction of evolutionary timelines that describe the history of RNA and protein structural domains. Here we review phylogenomic analyses that reconstruct the early history of the synthetase enzymes and the ribosome, their interactions with RNA, and the inception of amino acid charging and codon specificities in tRNA that are responsible for the genetic code. We also trace the age of domains and tRNA onto ancient tRNA homologies that were recently identified in rRNA. Our findings reveal a timeline of recruitment of tRNA building blocks for the formation of a functional ribosome, which holds both the biocatalytic functions of protein biosynthesis and the ability to store genetic memory in primordial RNA genomic templates. PMID:27918435

  19. Piecemeal Buildup of the Genetic Code, Ribosomes, and Genomes from Primordial tRNA Building Blocks.

    PubMed

    Caetano-Anollés, Derek; Caetano-Anollés, Gustavo

    2016-12-02

    The origin of biomolecular machinery likely centered around an ancient and central molecule capable of interacting with emergent macromolecular complexity. tRNA is the oldest and most central nucleic acid molecule of the cell. Its co-evolutionary interactions with aminoacyl-tRNA synthetase protein enzymes define the specificities of the genetic code and those with the ribosome their accurate biosynthetic interpretation. Phylogenetic approaches that focus on molecular structure allow reconstruction of evolutionary timelines that describe the history of RNA and protein structural domains. Here we review phylogenomic analyses that reconstruct the early history of the synthetase enzymes and the ribosome, their interactions with RNA, and the inception of amino acid charging and codon specificities in tRNA that are responsible for the genetic code. We also trace the age of domains and tRNA onto ancient tRNA homologies that were recently identified in rRNA. Our findings reveal a timeline of recruitment of tRNA building blocks for the formation of a functional ribosome, which holds both the biocatalytic functions of protein biosynthesis and the ability to store genetic memory in primordial RNA genomic templates.

  20. Deep linear autoencoder and patch clustering-based unified one-dimensional coding of image and video

    NASA Astrophysics Data System (ADS)

    Li, Honggui

    2017-09-01

    This paper proposes a unified one-dimensional (1-D) coding framework for image and video, which builds on a deep neural network and image patch clustering. First, an improved K-means clustering algorithm for image patches is employed to obtain compact inputs for the deep artificial neural network. Second, to best reconstruct the original image patches, deep linear autoencoder (DLA), a linear version of the classical deep nonlinear autoencoder, is introduced to achieve the 1-D representation of image blocks. Under the circumstances of 1-D representation, DLA is capable of attaining zero reconstruction error, which is impossible for the classical nonlinear dimensionality reduction methods. Third, a unified 1-D coding infrastructure for image, intraframe, interframe, multiview video, three-dimensional (3-D) video, and multiview 3-D video is built by incorporating the different categories of videos into the inputs of the patch clustering algorithm. Finally, simulation results show that the proposed methods simultaneously achieve a higher compression ratio and peak signal-to-noise ratio than state-of-the-art methods at low transmission bitrates.
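    The two building blocks can be sketched as follows (our simplification: for purely linear layers a 1-D-bottleneck autoencoder coincides with the top principal component, so an SVD stands in for training; zero error is attained exactly when a cluster's patches lie on a line):

```python
import numpy as np

def kmeans(X, k, n_iters=20, seed=0):
    """Plain K-means on patch vectors (rows of X)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def linear_autoencode_1d(P):
    """1-D linear autoencoding of mean-removed patches P (rows): encode
    along the top right-singular vector, decode by the same direction."""
    mean = P.mean(axis=0)
    U, s, Vt = np.linalg.svd(P - mean, full_matrices=False)
    w = Vt[0]                       # shared encoder/decoder direction
    codes = (P - mean) @ w          # the 1-D representation
    recon = mean + np.outer(codes, w)
    return codes, recon
```

    Running `linear_autoencode_1d` separately on each K-means cluster gives one scalar code per patch; the more compact the clusters, the smaller the reconstruction error of the rank-1 decoding.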

  1. 3D Integrated Methodologies for the Documentation and the Virtual Reconstruction of an Archaeological Site

    NASA Astrophysics Data System (ADS)

    Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.

    2015-02-01

    Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments concerning on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to test different acquisition systems and their effectiveness, considering each methodology individually or in integrated form. This research builds on the awareness that the integration of different survey methodologies can in fact increase the representative efficacy of the final representations, which are based on a wider and verified set of georeferenced metric data. In particular, the integration of methods allows reducing or neutralizing issues related to the survey of composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (e.g. urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of cities and monument structures, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). This acquisition was planned in order to obtain not only the basic support for interpretation analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with Structure from Motion point clouds in the same reference system, given by a topographical and GPS survey. 
    These 3D models are not only the final results of the metric survey, but also the starting point for the whole reconstruction of the city and its urban context, from the research point of view. This reconstruction process will also concern some areas that have not yet been excavated, where the application of procedural modelling can offer important support to the reconstructive hypothesis.

  2. Alignment-based and alignment-free methods converge with experimental data on amino acids coded by stop codons at split between nuclear and mitochondrial genetic codes.

    PubMed

    Seligmann, Hervé

    2018-05-01

    Genetic codes mainly evolve by reassigning punctuation codons, starts and stops. Previous analyses assuming that undefined amino acids translate stops showed greater divergence between nuclear and mitochondrial genetic codes. Here, three independent methods converge on which amino acids translated stops at the split between nuclear and mitochondrial genetic codes: (a) alignment-free genetic code comparisons inserting different amino acids at stops; (b) alignment-based blast analyses of hypothetical peptides translated from non-coding mitochondrial sequences, inserting different amino acids at stops; (c) biases in amino acid insertions at stops in proteomic data. Hence short-term protein evolution models reconstruct long-term genetic code evolution. Mitochondria reassign stops to amino acids otherwise inserted at stops by codon-anticodon mismatches (near-cognate tRNAs). Hence dual function (translation termination and translation by codon-anticodon mismatch) precedes mitochondrial reassignments of stops to amino acids. Stop ambiguity increases coded information and compensates for endocellular mitogenome reduction. Mitochondrial codon reassignments might prevent viral infections. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. ISSYS: An Integrated Synergistic Synthesis System

    NASA Technical Reports Server (NTRS)

    Dovi, A. R.

    1980-01-01

    Integrated Synergistic Synthesis System (ISSYS), an integrated system of computer codes in which the sequence of program execution and data flow is controlled by the user, is discussed. The commands available to exert such control, the ISSYS major function and rules, and the computer codes currently available in the system are described. Computational sequences frequently used in the aircraft structural analysis and synthesis are defined. External computer codes utilized by the ISSYS system are documented. A bibliography on the programs is included.

  4. Breast reconstruction with anatomical implants: A review of indications and techniques based on current literature.

    PubMed

    Gardani, Marco; Bertozzi, Nicolò; Grieco, Michele Pio; Pesce, Marianna; Simonacci, Francesco; Santi, PierLuigi; Raposio, Edoardo

    2017-09-01

    One important modality of breast cancer therapy is surgical treatment, which has become increasingly less mutilating over the last century. Breast reconstruction has become an integrated part of breast cancer treatment due to long-term psychosexual health factors and its importance for breast cancer survivors. Both autogenous tissue-based and implant-based reconstruction provide satisfactory reconstructive options, aided by better surgeon awareness of "the ideal breast size", although each has its own advantages and disadvantages. An overview of the current options in breast reconstruction is presented in this article.

  5. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  6. Genome Sequence of Candidatus Nitrososphaera evergladensis from Group I.1b Enriched from Everglades Soil Reveals Novel Genomic Features of the Ammonia-Oxidizing Archaea

    PubMed Central

    Zhalnina, Kateryna V.; Dias, Raquel; Leonard, Michael T.; Dorr de Quadros, Patricia; Camargo, Flavio A. O.; Drew, Jennifer C.; Farmerie, William G.; Daroub, Samira H.; Triplett, Eric W.

    2014-01-01

    The activity of ammonia-oxidizing archaea (AOA) leads to the loss of nitrogen from soil, pollution of water sources and elevated emissions of greenhouse gases. To date, eight AOA genomes are available in public databases: seven are from group I.1a of the Thaumarchaeota and only one is from group I.1b, isolated from hot springs. Many soils are dominated by AOA from group I.1b, but the genomes of soil representatives of this group have not been sequenced and functionally characterized. The lack of knowledge of the metabolic pathways of soil AOA presents a critical gap in understanding their role in biogeochemical cycles. Here, we describe the first complete genome of the soil archaeon Candidatus Nitrososphaera evergladensis, which has been reconstructed from metagenomic sequencing of a highly enriched culture obtained from an agricultural soil. The AOA enrichment was sequenced with the high-throughput next-generation sequencing platforms from Pacific Biosciences and Ion Torrent. The de novo assembly of sequences resulted in one 2.95 Mb contig. Annotation of the reconstructed genome revealed many similarities of the basic metabolism with the rest of the sequenced AOA. Ca. N. evergladensis belongs to group I.1b and shares only 40% whole-genome homology with the closest sequenced relative Ca. N. gargensis. Detailed analysis of the genome revealed coding sequences that were completely absent from group I.1a. These unique sequences code for proteins involved in control of DNA integrity, transporters, two-component systems, and a versatile CRISPR defense system. Notably, genomes from group I.1b have more gene duplications compared to genomes from group I.1a. We suggest that the presence of these unique genes and gene duplications may be associated with the environmental versatility of this group. PMID:24999826

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMillan, Kyle; Marleau, Peter; Brubaker, Erik

    In coded aperture imaging, one of the most important factors determining the quality of reconstructed images is the choice of mask/aperture pattern. In many applications, uniformly redundant arrays (URAs) are widely accepted as the optimal mask pattern. Under ideal conditions (thin, highly opaque masks), URA patterns are mathematically constructed to provide artifact-free reconstruction; however, the number of URAs for a chosen number of mask elements is limited, and when highly penetrating particles such as fast neutrons and high-energy gamma rays are being imaged, the ideal is seldom achieved. In such cases, more robust mask patterns that provide better reconstructed image quality may exist. Through the use of heuristic optimization methods and maximum likelihood expectation maximization (MLEM) image reconstruction, we show that for both point and extended neutron sources a random mask pattern can be optimized to provide better image quality than that of a URA.
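
    The MLEM update at the heart of such a comparison is compact enough to sketch. The following is a minimal, hypothetical 1-D illustration (random binary mask, noiseless data), not the authors' code: the multiplicative update drives the modeled detector counts toward the measured ones while keeping the image non-negative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a 1-D coded-aperture system matrix A maps
# source pixels to detector pixels through a random binary mask.
n_src, n_det = 32, 32
A = rng.integers(0, 2, size=(n_det, n_src)).astype(float)

x_true = np.zeros(n_src)
x_true[10] = 5.0          # point sources
x_true[20] = 3.0
y = A @ x_true            # noiseless detector counts

# MLEM iteration: x <- x * A^T(y / Ax) / A^T 1
x = np.ones(n_src)
sens = A.T @ np.ones(n_det)              # sensitivity image
for _ in range(500):
    proj = A @ x
    x *= (A.T @ (y / np.maximum(proj, 1e-12))) / np.maximum(sens, 1e-12)
```

    In a mask-optimization loop, a figure of merit (e.g. contrast-to-noise in the reconstruction) would be evaluated on `x` for each candidate mask `A`.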

  8. Combining endoscopic ultrasound with Time-Of-Flight PET: The EndoTOFPET-US Project

    NASA Astrophysics Data System (ADS)

    Frisch, Benjamin

    2013-12-01

    The EndoTOFPET-US collaboration develops a multimodal imaging technique for endoscopic exams of the pancreas or the prostate. It combines the benefits of high resolution metabolic imaging with Time-Of-Flight Positron Emission Tomography (TOF PET) and anatomical imaging with ultrasound (US). EndoTOFPET-US consists of a PET head extension for a commercial US endoscope and a PET plate outside the body in coincidence with the head. The high level of miniaturization and integration creates challenges in fields such as scintillating crystals, ultra-fast photo-detection, highly integrated electronics, system integration and image reconstruction. Amongst the developments, fast scintillators as well as fast and compact digital SiPMs with single SPAD readout are used to obtain the best coincidence time resolution (CTR). Highly integrated ASICs and DAQ electronics contribute to the timing performances of EndoTOFPET. In view of the targeted resolution of around 1 mm in the reconstructed image, we present a prototype detector system with a CTR better than 240 ps FWHM. We discuss the challenges in simulating such a system and introduce reconstruction algorithms based on graphics processing units (GPU).

  9. Recovery Discontinuous Galerkin Jacobian-free Newton-Krylov Method for all-speed flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HyeongKae Park; Robert Nourgaliev; Vincent Mousseau

    2008-07-01

    There is increasing interest in developing the next generation of simulation tools for advanced nuclear energy systems. These tools will utilize state-of-the-art numerical algorithms and computer science technology in order to maximize predictive capability, support advanced reactor designs, reduce uncertainty and increase safety margins. In analyzing nuclear energy systems, we are interested in compressible low-Mach number, high heat flux flows with a wide range of Re, Ra, and Pr numbers. Under these conditions, the focus is placed on turbulent heat transfer, in contrast to other industries whose main interest is in capturing turbulent mixing. Our objective is to develop single-point turbulence closure models for large-scale engineering CFD codes, using Direct Numerical Simulation (DNS) or Large Eddy Simulation (LES) tools, which requires very accurate and efficient numerical algorithms. The focus of this work is placed on fully-implicit, high-order spatiotemporal discretization based on the discontinuous Galerkin method solving the conservative form of the compressible Navier-Stokes equations. The method utilizes a local reconstruction procedure derived from a weak formulation of the problem, which is inspired by the recovery diffusion flux algorithm of van Leer and Nomura and by the piecewise parabolic reconstruction in the finite volume method. The developed methodology is integrated into the Jacobian-free Newton-Krylov framework to allow a fully-implicit solution of the problem.
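
    A Jacobian-free Newton-Krylov iteration can be sketched with SciPy's `newton_krylov`, which approximates Jacobian-vector products by finite differences of the residual so the Jacobian is never formed explicitly. The boundary-value problem below is an illustrative stand-in, not the compressible Navier-Stokes system of the paper:

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy steady-state problem: u'' - u**3 + 1 = 0 on (0, 1) with
# u(0) = u(1) = 0, discretized by second-order finite differences.
n = 31
h = 1.0 / (n + 1)

def residual(u):
    upad = np.concatenate(([0.0], u, [0.0]))          # Dirichlet BCs
    lap = (upad[2:] - 2 * upad[1:-1] + upad[:-2]) / h**2
    return lap - u**3 + 1.0

# Jacobian-vector products are formed internally as
# J v ~ (F(u + eps*v) - F(u)) / eps inside the Krylov solver.
u = newton_krylov(residual, np.zeros(n), f_tol=1e-9)
```

    The same pattern scales to large PDE systems because only residual evaluations are needed; preconditioning (omitted here) is what makes it practical at scale.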

  10. Reconstruction of hadronic decay products of tau leptons with the ATLAS experiment

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2016-05-25

    This document presents a new method of reconstructing the individual charged and neutral hadrons in tau decays with the ATLAS detector. The reconstructed hadrons are used to classify the decay mode and to calculate the visible four-momentum of reconstructed tau candidates, significantly improving the resolution with respect to the calibration in the existing tau reconstruction. The performance of the reconstruction algorithm is optimised and evaluated using simulation and validated using samples of Z → ττ and Z(→ μμ)+jets events selected from proton–proton collisions at a centre-of-mass energy √s = 8 TeV, corresponding to an integrated luminosity of 5 fb⁻¹.

  11. Method for correction of measured polarization angles from motional Stark effect spectroscopy for the effects of electric fields

    DOE PAGES

    Luce, T. C.; Petty, C. C.; Meyer, W. H.; ...

    2016-11-02

    An approximate method to correct motional Stark effect (MSE) spectroscopy for the effects of intrinsic plasma electric fields has been developed. The motivation for using an approximate method is to incorporate electric field effects into between-pulse or real-time analysis of the current density or safety factor profile. The toroidal velocity term in the momentum balance equation is normally the dominant contribution to the electric field orthogonal to the flux surface over most of the plasma. When this approximation is valid, the correction to the MSE data can be included in a form like that used when electric field effects are neglected. This allows measurements of the toroidal velocity to be integrated into the interpretation of the MSE polarization angles without changing how the data are treated in existing codes. In some cases, such as the DIII-D system, the correction is especially simple, due to the details of the neutral beam and MSE viewing geometry. The correction method is compared, using DIII-D data in a variety of plasma conditions, to analysis that assumes no radial electric field is present and to analysis that uses the standard correction method, which involves significant human intervention for profile fitting. The comparison shows that the new correction method is close to the standard one, and in all cases appears to offer a better result than use of the uncorrected data. Lastly, the method has been integrated into the standard DIII-D equilibrium reconstruction code in use for analysis between plasma pulses and is sufficiently fast that it will be implemented in real-time equilibrium analysis for control applications.

  12. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

    Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem with any particular software application specific to seismic data analysis: adapting its commands, windows, and hot-key combinations to the specific waveforms and to their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can run the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
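
    The plug-in principle described for Geotool can be illustrated in a few lines: processing routines register themselves with the host, which applies them to a waveform without knowing about them in advance. Names such as `register_plugin` and `apply_chain` are hypothetical, not Geotool's actual API:

```python
# Minimal sketch of a plug-in registry for waveform processing.
PLUGINS = {}

def register_plugin(name):
    """Decorator: make a processing routine available to the host."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register_plugin("demean")
def demean(samples):
    # remove the mean of the trace
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

@register_plugin("scale")
def scale(samples, factor=2.0):
    # multiply every sample by a constant gain
    return [s * factor for s in samples]

def apply_chain(samples, names):
    """Host-side driver: run registered plug-ins in order."""
    for name in names:
        samples = PLUGINS[name](samples)
    return samples

out = apply_chain([1.0, 2.0, 3.0], ["demean", "scale"])
```

    New analysis modules only need to call the decorator; the platform's driver code never changes, which is the point of the "open" architecture.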

  13. Advanced texture filtering: a versatile framework for reconstructing multi-dimensional image data on heterogeneous architectures

    NASA Astrophysics Data System (ADS)

    Zellmann, Stefan; Percan, Yvonne; Lang, Ulrich

    2015-01-01

    Reconstruction of 2-d image primitives or of 3-d volumetric primitives is one of the most common operations performed by the rendering components of modern visualization systems. Because this operation is often aided by GPUs, reconstruction is typically restricted to first-order interpolation. With the advent of in situ visualization, the assumption that rendering algorithms are in general executed on GPUs is however no longer adequate. We thus propose a framework that provides versatile texture filtering capabilities: up to third-order reconstruction using various types of cubic filtering and interpolation primitives; cache-optimized algorithms that integrate seamlessly with GPGPU rendering or with software rendering that was optimized for cache-friendly "Structure of Array" (SoA) access patterns; a memory management layer (MML) that gracefully hides the complexities of extra data copies necessary for memory access optimizations such as swizzling, for rendering on GPGPUs, or for reconstruction schemes that rely on pre-filtered data arrays. We prove the effectiveness of our software architecture by integrating it into and validating it using the open source direct volume rendering (DVR) software DeskVOX.
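
    A third-order reconstruction kernel of the kind the framework provides can be sketched as a 1-D Catmull-Rom filter; the framework itself offers several cubic variants plus the multi-dimensional, cache-optimized machinery described above:

```python
import numpy as np

def catmull_rom(samples, t):
    """Reconstruct a value at fractional position t from discrete
    samples using the Catmull-Rom cubic (third-order, interpolating).
    Samples are clamped at the borders."""
    samples = np.asarray(samples, dtype=float)
    i = int(np.floor(t))
    f = t - i
    # four-sample neighborhood, clamped to the valid index range
    idx = np.clip([i - 1, i, i + 1, i + 2], 0, len(samples) - 1)
    p0, p1, p2, p3 = samples[idx]
    return 0.5 * (2 * p1
                  + (-p0 + p2) * f
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * f**2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * f**3)
```

    Unlike first-order (linear) texture filtering, this kernel passes through the samples while giving a C1-continuous reconstruction between them.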

  14. FLiT: a field line trace code for magnetic confinement devices

    NASA Astrophysics Data System (ADS)

    Innocente, P.; Lorenzini, R.; Terranova, D.; Zanca, P.

    2017-04-01

    This paper presents a field line tracing code (FLiT) developed to study particle and energy transport as well as other phenomena related to magnetic topology in reversed-field pinch (RFP) and tokamak experiments. The code computes magnetic field lines in toroidal geometry using curvilinear coordinates (r, ϑ, ϕ) and calculates the intersections of these field lines with specified planes. The code also computes the magnetic and thermal diffusivity due to a stochastic magnetic field in the collisionless limit. Compared to Hamiltonian codes, there are no constraints on the magnetic field functional formulation, which allows the integration of whichever magnetic field is required. The code uses the magnetic field computed by solving the zeroth-order axisymmetric equilibrium and the Newcomb equation for the first-order helical perturbation matching the edge magnetic field measurements in toroidal geometry. Two algorithms are developed to integrate the field lines: one is a dedicated implementation of a first-order semi-implicit volume-preserving integration method, and the other is based on the Adams-Moulton predictor-corrector method. As expected, the volume-preserving algorithm is accurate in conserving divergence, but slow because the low integration order requires small step sizes. The second algorithm proves to be quite fast and accurately integrates field lines in many partially and fully stochastic configurations. The code has already been used to study the core and edge magnetic topology of the RFX-mod device in both the reversed-field pinch and tokamak magnetic configurations.
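
    The core of any field line tracer is the integration of dX/ds = B/|B|. A minimal RK4 sketch on a toy solenoidal field whose field lines are exact circles (a stand-in for a real equilibrium field; FLiT's own integrators are the volume-preserving and Adams-Moulton schemes described above):

```python
import numpy as np

def b_field(p):
    # Illustrative solenoidal field: field lines are circles about z.
    x, y, z = p
    return np.array([-y, x, 0.0])

def trace(p0, ds=0.01, n_steps=1000):
    """RK4 integration of dX/ds = B/|B| along a field line."""
    def rhs(p):
        b = b_field(p)
        return b / np.linalg.norm(b)
    p = np.array(p0, dtype=float)
    path = [p.copy()]
    for _ in range(n_steps):
        k1 = rhs(p)
        k2 = rhs(p + 0.5 * ds * k1)
        k3 = rhs(p + 0.5 * ds * k2)
        k4 = rhs(p + ds * k3)
        p = p + ds * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        path.append(p.copy())
    return np.array(path)

path = trace([1.0, 0.0, 0.0])
radii = np.linalg.norm(path[:, :2], axis=1)   # should stay ~ 1
```

    For this circular field the distance from the axis is an invariant, so the drift of `radii` from 1 directly measures the integrator's error, analogous to the divergence-conservation check discussed in the abstract.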

  15. Integration of Dakota into the NEAMS Workbench

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Lefebvre, Robert A.; Langley, Brandon R.

    2017-07-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on integrating Dakota into the NEAMS Workbench. The NEAMS Workbench, developed at Oak Ridge National Laboratory, is a new software framework that provides a graphical user interface, input file creation, parsing, validation, job execution, workflow management, and output processing for a variety of nuclear codes. Dakota is a tool developed at Sandia National Laboratories that provides a suite of uncertainty quantification and optimization algorithms. Providing Dakota within the NEAMS Workbench allows users of nuclear simulation codes to perform uncertainty and optimization studies on their nuclear codes from within a common, integrated environment. Details of the integration and parsing are provided, along with an example of Dakota running a sampling study on the fuels performance code, BISON, from within the NEAMS Workbench.
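
    A sampling study of the kind Dakota drives can be illustrated with a bare-bones Latin hypercube sampler; Dakota's own sampling methods are far richer, and the two input ranges below are hypothetical placeholders for simulation-code inputs:

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Simple Latin hypercube sample: one point per stratum in each
    dimension, with strata independently shuffled per dimension."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        # one uniform draw inside each of n_samples equal strata
        pts = [lo + (hi - lo) * (k + rng.random()) / n_samples
               for k in range(n_samples)]
        rng.shuffle(pts)
        cols.append(pts)
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

# e.g. 10 samples over two hypothetical model inputs
samples = latin_hypercube(10, [(0.0, 1.0), (500.0, 900.0)])
```

    Each sample tuple would be written into a code's input file, the code run, and an output quantity collected, which is essentially the loop the Workbench automates around Dakota and BISON.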

  16. Dynamic PET image reconstruction integrating temporal regularization associated with respiratory motion correction for applications in oncology

    NASA Astrophysics Data System (ADS)

    Merlin, Thibaut; Visvikis, Dimitris; Fernandez, Philippe; Lamare, Frédéric

    2018-02-01

    Respiratory motion reduces both the qualitative and quantitative accuracy of PET images in oncology. This impact is more significant for quantitative applications based on kinetic modeling, where dynamic acquisitions are associated with limited statistics due to the necessity of enhanced temporal resolution. The aim of this study is to address these drawbacks, by combining a respiratory motion correction approach with temporal regularization in a unique reconstruction algorithm for dynamic PET imaging. Elastic transformation parameters for the motion correction are estimated from the non-attenuation-corrected PET images. The derived displacement matrices are subsequently used in a list-mode based OSEM reconstruction algorithm integrating a temporal regularization between the 3D dynamic PET frames, based on temporal basis functions. These functions are simultaneously estimated at each iteration, along with their relative coefficients for each image voxel. Quantitative evaluation has been performed using dynamic FDG PET/CT acquisitions of lung cancer patients acquired on a GE DRX system. The performance of the proposed method is compared with that of a standard multi-frame OSEM reconstruction algorithm. The proposed method achieved substantial improvements in terms of noise reduction while accounting for loss of contrast due to respiratory motion. Results on simulated data showed that the proposed 4D algorithms led to bias reduction values up to 40% in both tumor and blood regions for similar standard deviation levels, in comparison with a standard 3D reconstruction. Patlak parameter estimations on reconstructed images with the proposed reconstruction methods resulted in 30% and 40% bias reduction in the tumor and lung region respectively for the Patlak slope, and a 30% bias reduction for the intercept in the tumor region (a similar Patlak intercept was achieved in the lung area). 
Incorporation of the respiratory motion correction using an elastic model along with a temporal regularization in the reconstruction process of the PET dynamic series led to substantial quantitative improvements and motion artifact reduction. Future work will include the integration of a linear FDG kinetic model, in order to directly reconstruct parametric images.
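
    The benefit of temporal basis functions can be illustrated with a simplified linear analogue: constraining each voxel's time-activity curve to the span of a few smooth basis functions rejects noise that does not fit the basis. Unlike the proposed algorithm, which estimates the basis inside a list-mode OSEM reconstruction, this sketch uses a fixed, hypothetical basis and a plain least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(1)

n_frames, n_voxels, n_basis = 30, 100, 3
t = np.linspace(0.0, 1.0, n_frames)
# fixed temporal basis: constant, ramp, and decaying exponential
B = np.vstack([np.ones_like(t), t, np.exp(-3.0 * t)])   # (n_basis, n_frames)

# synthetic "dynamic image": every voxel's curve lies in span(B)
coeffs_true = rng.random((n_voxels, n_basis))
clean = coeffs_true @ B
noisy = clean + 0.05 * rng.standard_normal(clean.shape)

# temporal regularization: project each noisy curve onto the basis
coeffs, *_ = np.linalg.lstsq(B.T, noisy.T, rcond=None)
denoised = coeffs.T @ B
```

    Only the noise component lying inside the 3-dimensional temporal subspace survives the projection, which is why the 4D reconstruction achieves lower variance at matched bias.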

  18. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  19. Classical and neural methods of image sequence interpolation

    NASA Astrophysics Data System (ADS)

    Skoneczny, Slawomir; Szostakowski, Jaroslaw

    2001-08-01

    An image interpolation problem is often encountered in many areas. Examples include interpolation in the coding/decoding process for transmission purposes, reconstruction of a full frame from two interlaced sub-frames in normal TV or HDTV, and reconstruction of missing frames in old, damaged cinematic sequences. In this paper an overview of interframe interpolation methods is presented. Both direct and motion-compensated interpolation techniques are illustrated by examples. The methodology used may be either classical or based on neural networks, depending on the demands of the specific interpolation problem.
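
    The difference between direct and motion-compensated interframe interpolation can be shown with a toy global-motion example (per-block motion estimation, which real codecs perform, is omitted for brevity):

```python
import numpy as np

def direct_interp(f0, f1, alpha=0.5):
    """Direct temporal interpolation: pixel-wise blend of two frames."""
    return (1 - alpha) * f0 + alpha * f1

def motion_compensated_interp(f0, f1, shift, alpha=0.5):
    """Motion-compensated blend for a known global integer shift:
    each frame is warped partway along the motion before blending."""
    fwd = np.roll(f0, int(round(alpha * shift)), axis=1)
    bwd = np.roll(f1, -int(round((1 - alpha) * shift)), axis=1)
    return (1 - alpha) * fwd + alpha * bwd

# a bright bar moving 4 pixels to the right between two frames
f0 = np.zeros((8, 16)); f0[:, 4:6] = 1.0
f1 = np.roll(f0, 4, axis=1)

mid = motion_compensated_interp(f0, f1, shift=4)
```

    Direct blending produces two half-intensity ghosts of the bar, while the motion-compensated frame places a single full-intensity bar at the halfway position.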

  20. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    NASA Astrophysics Data System (ADS)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
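
    Geometric vertex fitting in its simplest linear form reduces to finding the point that minimizes the summed squared distances to a set of straight tracks. The sketch below ignores track uncertainties, which RAVE's Kalman-filter and robust fitters take into account:

```python
import numpy as np

def fit_vertex(points, directions):
    """Least-squares vertex for straight tracks, each given by a point
    on the track and a direction.  Solves sum_i P_i (v - p_i) = 0 with
    P_i = I - d_i d_i^T projecting out the track direction."""
    M = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)
        M += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(M, b)

# three tracks through the common point (1, 2, 3)
vtx_true = np.array([1.0, 2.0, 3.0])
dirs = [np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.5]),
        np.array([1.0, 1.0, 1.0])]
pts = [vtx_true + 2.5 * d for d in dirs]   # arbitrary points on each track
vtx = fit_vertex(pts, dirs)
```

    A Kalman-filter vertex fit generalizes this by weighting each track with its covariance and updating the vertex estimate track by track.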

  1. VizieR Online Data Catalog: ynogkm: code for calculating time-like geodesics (Yang+, 2014)

    NASA Astrophysics Data System (ADS)

    Yang, X.-L.; Wang, J.-C.

    2013-11-01

    Here we present the source file for a new public code named ynogkm, aimed at fast calculation of time-like geodesics in a Kerr-Newman spacetime. In the code the four Boyer-Lindquist coordinates and the proper time are expressed as functions of a parameter p semi-analytically, i.e., r(p), μ(p), φ(p), t(p), and σ(p), by using Weierstrass' and Jacobi's elliptic functions and integrals. All of the elliptic integrals are computed by Carlson's elliptic integral method, which guarantees the fast speed of the code. The source Fortran file ynogkm.f90 contains four modules: constants, rootfind, ellfunction, and blcoordinates. (3 data files).
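
    Carlson's method evaluates the symmetric integral R_F by repeated application of the duplication theorem followed by a short Taylor series, which is what makes it fast. A minimal sketch (checked against the complete elliptic integral via K(m) = R_F(0, 1-m, 1)); this is the standard algorithm, not code from ynogkm.f90:

```python
import math

def carlson_rf(x, y, z, tol=1e-12):
    """Carlson's symmetric elliptic integral R_F(x, y, z) computed by
    the duplication theorem plus a 5-term Taylor expansion."""
    while True:
        sx, sy, sz = math.sqrt(x), math.sqrt(y), math.sqrt(z)
        lam = sx * sy + sy * sz + sz * sx
        x, y, z = (x + lam) / 4, (y + lam) / 4, (z + lam) / 4
        mu = (x + y + z) / 3
        if max(abs(x - mu), abs(y - mu), abs(z - mu)) < tol * mu:
            break
    X, Y, Z = 1 - x / mu, 1 - y / mu, 1 - z / mu
    e2, e3 = X * Y - Z * Z, X * Y * Z
    return (1 - e2 / 10 + e3 / 14 + e2**2 / 24
            - 3 * e2 * e3 / 44) / math.sqrt(mu)

# complete elliptic integral of the first kind: K(m) = R_F(0, 1-m, 1)
k0 = carlson_rf(0.0, 1.0, 1.0)   # K(0), expected pi/2
```

    Each duplication quarters the spread of the arguments, so only a handful of iterations are needed before the series is accurate to machine precision.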

  2. 75 FR 39472 - Airworthiness Directives; Eclipse Aerospace, Inc. Model EA500 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ..., altitude preselect, and/or transponder codes. We are proposing this AD to correct faulty integration of... determined to be a software communication integration issue between the EFIS display interface and associated... transponder codes. We are issuing this AD to correct faulty integration of hardware and software, which could...

  3. Pseudodynamic systems approach based on a quadratic approximation of update equations for diffuse optical tomography.

    PubMed

    Biswas, Samir Kumar; Kanhirodan, Rajan; Vasu, Ram Mohan; Roy, Debasish

    2011-08-01

    We explore a pseudodynamic form of the quadratic parameter update equation for diffuse optical tomographic reconstruction from noisy data. A few explicit and implicit strategies for obtaining the parameter updates via a semianalytical integration of the pseudodynamic equations are proposed. Despite the ill-posedness of the inverse problem associated with diffuse optical tomography, adoption of the quadratic update scheme combined with the pseudotime integration appears not only to yield faster convergence, but also a muted sensitivity to the regularization parameters, which include the pseudotime step size for integration. These observations are validated through reconstructions with both numerically generated and experimentally acquired data.
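
    The pseudodynamic idea can be illustrated on a linear analogue: rather than solving the regularized normal equations directly, the parameter vector is relaxed in pseudotime until it settles at the same solution. The paper's quadratic update and semianalytical integration are replaced here by plain explicit Euler on a hypothetical linear problem:

```python
import numpy as np

rng = np.random.default_rng(2)

# linear stand-in for the tomographic inverse problem: J x ~ y
m, n = 40, 20
J = rng.standard_normal((m, n))
y = rng.standard_normal(m)
lam = 0.1                                  # Tikhonov regularization weight

# pseudodynamic system: dx/dtau = -(J^T (J x - y) + lam * x)
x = np.zeros(n)
dtau = 1.0 / (np.linalg.norm(J, 2) ** 2 + lam)   # stable Euler step
for _ in range(50000):
    x -= dtau * (J.T @ (J @ x - y) + lam * x)

# direct regularized solution for comparison
x_ref = np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ y)
```

    The steady state of the pseudotime flow is exactly the regularized solution; the pseudotime step `dtau` plays the role of an extra, fairly forgiving regularization parameter, consistent with the muted sensitivity reported in the abstract.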

  4. Comprehensive Identification of Long Non-coding RNAs in Purified Cell Types from the Brain Reveals Functional LncRNA in OPC Fate Determination

    PubMed Central

    Dong, Xiaomin; Chen, Kenian; Cuevas-Diaz Duran, Raquel; You, Yanan; Sloan, Steven A.; Zhang, Ye; Zong, Shan; Cao, Qilin; Barres, Ben A.; Wu, Jia Qian

    2015-01-01

    Long non-coding RNAs (lncRNAs) (> 200 bp) play crucial roles in transcriptional regulation during numerous biological processes. However, it is challenging to comprehensively identify lncRNAs, because they are often expressed at low levels and with more cell-type specificity than are protein-coding genes. In the present study, we performed ab initio transcriptome reconstruction using eight purified cell populations from mouse cortex and detected more than 5000 lncRNAs. Predicting the functions of lncRNAs using cell-type specific data revealed their potential functional roles in Central Nervous System (CNS) development. We performed motif searches in ENCODE DNase I digital footprint data and Mouse ENCODE promoters to infer transcription factor (TF) occupancy. By integrating TF binding and cell-type specific transcriptomic data, we constructed a novel framework that is useful for systematically identifying lncRNAs that are potentially essential for brain cell fate determination. Based on this integrative analysis, we identified lncRNAs that are regulated during Oligodendrocyte Precursor Cell (OPC) differentiation from Neural Stem Cells (NSCs) and that are likely to be involved in oligodendrogenesis. The top candidate, lnc-OPC, shows highly specific expression in OPCs and remarkable sequence conservation among placental mammals. Interestingly, lnc-OPC is significantly up-regulated in glial progenitors from experimental autoimmune encephalomyelitis (EAE) mouse models compared to wild-type mice. OLIG2-binding sites in the upstream regulatory region of lnc-OPC were identified by ChIP (chromatin immunoprecipitation)-Sequencing and validated by luciferase assays. Loss-of-function experiments confirmed that lnc-OPC plays a functional role in OPC genesis. Overall, our results substantiated the role of lncRNA in OPC fate determination and provided an unprecedented data source for future functional investigations in CNS cell types. 
We present our datasets and analysis results via the interactive genome browser at our laboratory website that is freely accessible to the research community. This is the first lncRNA expression database of collective populations of glia, vascular cells, and neurons. We anticipate that these studies will advance the knowledge of this major class of non-coding genes and their potential roles in neurological development and diseases. PMID:26683846

  5. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  6. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almansouri, Hani; Clayton, Dwight A; Kisner, Roger A

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogeneous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogeneous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rock. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of a well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, in which the application, hardware, and reconstruction software are highly integrated and optimized. We are therefore developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first application of MBIR to ultrasonic signals, this paper documents the initial implementation of the algorithm and shows reconstruction results for synthetically generated data.

  7. Indoor Photogrammetry Aided with Uwb Navigation

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Fissore, F.; Guarnieri, A.; Vettore, A.

    2018-05-01

    Photogrammetric surveying with mobile devices, in particular smartphones, is attracting significant interest in the research community. Nowadays, the process of producing 3D point clouds with photogrammetric procedures is well known. However, external information is still typically needed in order to move from the point cloud obtained from images to a 3D metric reconstruction. This paper investigates the integration of information provided by a UWB positioning system with vision-based reconstruction to produce a metric reconstruction. Furthermore, the orientation (with respect to the North-East directions) of the obtained model is assessed thanks to the use of inertial sensors included in the considered UWB devices. Results of this integration are shown for two case studies in indoor environments.
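
    Moving from an unscaled photogrammetric point cloud to a metric reconstruction amounts to estimating a similarity transform (scale, rotation, translation) from matched positions, e.g. camera stations located by UWB. A minimal Umeyama-style sketch on synthetic data, ignoring the measurement covariances a real pipeline would weight by:

```python
import numpy as np

rng = np.random.default_rng(3)

def similarity_align(src, dst):
    """Estimate s, R, t with dst ~ s * R @ src + t (Umeyama/Procrustes)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                       # enforce a proper rotation
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src   # metric scale factor
    t = mu_d - s * R @ mu_s
    return s, R, t

# synthetic check: recover a known similarity transform
src = rng.standard_normal((30, 3))           # model-space points (no scale)
angle = 0.7
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
dst = 2.5 * src @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_align(src, dst)
```

    With noisy UWB fixes the same closed form gives the least-squares similarity, and the recovered rotation can then be compared against the inertial-sensor orientation as the paper does.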

  8. Modeling global vector fields of chaotic systems from noisy time series with the aid of structure-selection techniques.

    PubMed

    Xu, Daolin; Lu, Fangfang

    2006-12-01

    We address the problem of reconstructing a set of nonlinear differential equations from chaotic time series. A method that combines implicit Adams integration with a structure-selection technique based on an error reduction ratio is proposed for system identification and the corresponding parameter estimation of the model. The structure-selection technique identifies the significant terms from a pool of candidate functional basis terms and determines the optimal model through orthogonal characteristics of the data. Combining this technique with the Adams integration algorithm makes reconstruction possible for data sampled at large time intervals. Numerical experiments on the Lorenz and Rössler systems show that the proposed strategy is effective for global vector field reconstruction from noisy time series.
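    The selection idea can be illustrated with a toy sketch. Everything below is invented for illustration (a small candidate pool, exact model derivatives in place of the paper's implicit-Adams derivative recovery): greedily add the candidate term whose least-squares fit most reduces the residual, which is equivalent to maximizing the error reduction ratio.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz system (sigma=10, rho=28, beta=8/3), one of the paper's test cases.
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Sample a trajectory (stands in for the measured time series).
sol = solve_ivp(lorenz, (0.0, 20.0), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0.0, 20.0, 4000))
X = sol.y.T

# Small pool of candidate basis functions (the paper's pool is larger).
x, y, z = X[:, 0], X[:, 1], X[:, 2]
names = ["x", "y", "z", "xy", "xz", "yz"]
Theta = np.column_stack([x, y, z, x * y, x * z, y * z])

# Target derivatives; here taken from the known model, whereas the paper
# recovers them from the sampled data via implicit Adams integration.
dX = np.array([lorenz(0.0, s) for s in X])

def select_terms(Theta, d, max_terms=4, tol=1e-8):
    """Greedy structure selection: at each step add the candidate column
    giving the largest error reduction of the least-squares fit."""
    chosen, coef = [], None
    total = float(np.sum(d ** 2))
    for _ in range(max_terms):
        best_j, best_rss, best_coef = None, np.inf, None
        for j in range(Theta.shape[1]):
            if j in chosen:
                continue
            cols = Theta[:, chosen + [j]]
            c, *_ = np.linalg.lstsq(cols, d, rcond=None)
            rss = float(np.sum((d - cols @ c) ** 2))
            if rss < best_rss:
                best_j, best_rss, best_coef = j, rss, c
        chosen.append(best_j)
        coef = best_coef
        if best_rss / total < tol:          # model is essentially exact
            break
    return chosen, coef

terms, coef = select_terms(Theta, dX[:, 0])  # identify the dx/dt equation
print([names[j] for j in terms], np.round(coef, 3))
```

    For the dx/dt component the selected model reduces to the two true terms, with coefficients near -10 (for x) and +10 (for y).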

  9. A Code of Ethics and Integrity for HRD Research and Practice.

    ERIC Educational Resources Information Center

    Hatcher, Tim; Aragon, Steven R.

    2000-01-01

    Describes the rationale for a code of ethics and integrity in human resource development (HRD). Outlines the Academy of Human Resource Development's standards. Reviews ethical issues faced by the HRD profession. (SK)

  10. Multi-Material ALE with AMR for Modeling Hot Plasmas and Cold Fragmenting Materials

    NASA Astrophysics Data System (ADS)

    Koniges, Alice; Masters, Nathan; Fisher, Aaron; Eder, David; Liu, Wangyi; Anderson, Robert; Benson, David; Bertozzi, Andrea

    2015-02-01

    We have developed a new 3D multi-physics multi-material code, ALE-AMR, which combines Arbitrary Lagrangian Eulerian (ALE) hydrodynamics with Adaptive Mesh Refinement (AMR) to connect the continuum to the microstructural regimes. The code is unique in its ability to model hot radiating plasmas and cold fragmenting solids. New numerical techniques were developed for many of the physics packages to work efficiently on a dynamically moving and adapting mesh. We use interface reconstruction based on volume fractions of the material components within mixed zones and reconstruct interfaces as needed. This interface reconstruction model is also used for void coalescence and fragmentation. A flexible strength/failure framework allows for pluggable material models, which may require material history arrays to determine the level of accumulated damage or the evolving yield stress in J2 plasticity models. For some applications, laser rays are propagated through a virtual composite mesh consisting of the finest-resolution representation of the modeled space. A new second-order accurate diffusion solver has been implemented for the thermal conduction and radiation transport packages. One application area is the modeling of laser/target effects, including debris/shrapnel generation. Other application areas include warm dense matter, EUV lithography, and material-wall interactions for fusion devices.

  11. Progress on China nuclear data processing code system

    NASA Astrophysics Data System (ADS)

    Liu, Ping; Wu, Xiaofei; Ge, Zhigang; Li, Songyang; Wu, Haicheng; Wen, Lili; Wang, Wenming; Zhang, Huanyu

    2017-09-01

    China is developing the nuclear data processing code Ruler, which can be used for producing multi-group cross sections and related quantities from evaluated nuclear data in the ENDF format [1]. Ruler includes modules for reconstructing cross sections over the entire energy range, generating Doppler-broadened cross sections for a given temperature, producing effective self-shielded cross sections in the unresolved resonance energy range, calculating scattering cross sections in the thermal energy range, generating group cross sections and matrices, and preparing WIMS-D format data files for the reactor physics code WIMS-D [2]. Ruler is written in Fortran-90 and has been tested on 32-bit computers running the Windows XP and Linux operating systems. Verification of Ruler has been performed by comparison with results obtained with the NJOY99 [3] processing code, and validation has been performed using the WIMSD5B code.
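    The core operation behind group-constant generation can be sketched numerically: a group cross-section is the flux-weighted average of the pointwise cross-section over the group's energy interval, sigma_g = ∫ sigma(E) phi(E) dE / ∫ phi(E) dE. The 1/v cross-section, 1/E weighting spectrum, and group boundaries below are invented for illustration and are not Ruler's actual data.

```python
import numpy as np

def trapz(y, x):
    """Plain trapezoidal rule (kept local to avoid NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Pointwise data on a fine energy grid (eV); both functions are made up:
# a pure 1/v cross-section (barns) and a 1/E weighting spectrum.
E = np.logspace(-5, 7, 24001)
sigma = 10.0 / np.sqrt(E)
phi = 1.0 / E

# Three illustrative group boundaries (eV)
groups = [(1e-5, 1.0), (1.0, 1e3), (1e3, 1e7)]

# Flux-weighted collapse over each group interval
sigma_g = []
for lo, hi in groups:
    m = (E >= lo) & (E <= hi)
    sigma_g.append(trapz(sigma[m] * phi[m], E[m]) / trapz(phi[m], E[m]))
print(np.round(sigma_g, 3))
```

    For a 1/v cross-section the collapsed values fall steeply with group energy, matching the closed-form result 20(lo^-1/2 - hi^-1/2)/ln(hi/lo).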

  12. Advanced Imaging Optics Utilizing Wavefront Coding.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scrymgeour, David; Boye, Robert; Adelsberger, Kathleen

    2015-06-01

    Image processing offers a potential to simplify an optical system by shifting some of the imaging burden from lenses to the more cost-effective electronics. Wavefront coding using a cubic phase plate combined with image processing can extend the system's depth of focus, reducing many of the focus-related aberrations as well as material-related chromatic aberrations. However, the optimal design process and physical limitations of wavefront coding systems with respect to first-order optical parameters and noise are not well documented. We examined the image quality of simulated and experimental wavefront-coded images before and after reconstruction in the presence of noise. Challenges in the implementation of cubic phase in an optical system are discussed. In particular, we found that limitations must be placed on system noise, aperture, field of view, and bandwidth to develop a robust wavefront-coded system.

  13. Reconstruction of improvised explosive device blast loading to personnel in the open

    NASA Astrophysics Data System (ADS)

    Wiri, Suthee; Needham, Charles

    2016-05-01

    Significant advances in reconstructing attacks by improvised explosive devices (IEDs) and other blast events are reported. A high-fidelity three-dimensional computational fluid dynamics tool, the Second-order Hydrodynamic Automatic Mesh Refinement Code, was used for the analysis. Computer-aided design models of subjects or vehicles in the scene accurately represent the geometries of objects in the blast field. A wide range of scenario types and blast exposure levels were reconstructed, including free-field blast, the enclosed space of a vehicle cabin, an IED attack on a vehicle, buried charges, recoilless rifle operation, a rocket-propelled grenade attack, and a missile attack, with single- or multiple-subject exposure to pressure levels from ~27.6 kPa (~4 psi) to greater than 690 kPa (>100 psi). To create a full 3D, time-resolved pressure reconstruction of a blast event for injury and blast exposure analysis, a combination of intelligence data and Blast Gauge data can be used to reconstruct an actual in-theatre blast event. The methodology to reconstruct an event and the lessons learned from multiple reconstructions in open space are presented. The analysis uses records of blast pressure at discrete points, and the output is a spatial and temporal blast load distribution for all personnel involved.

  14. A cross-platform freeware tool for digital reconstruction of neuronal arborizations from image stacks.

    PubMed

    Brown, Kerry M; Donohue, Duncan E; D'Alessandro, Giampaolo; Ascoli, Giorgio A

    2005-01-01

    Digital reconstruction of neuronal arborizations is an important step in the quantitative investigation of cellular neuroanatomy. In this process, neurites imaged by microscopy are semi-manually traced through the use of specialized computer software and represented as binary trees of branching cylinders (or truncated cones). This form of reconstruction file is efficient and parsimonious, and allows extensive morphometric analysis as well as the implementation of biophysical models of electrophysiology. Here, we describe Neuron_Morpho, a plugin for the popular Java application ImageJ that mediates the digital reconstruction of neurons from image stacks. Both the executable and the code of Neuron_Morpho are freely distributed (www.maths.soton.ac.uk/staff/D'Alessandro/morpho or www.krasnow.gmu.edu/L-Neuron), and are compatible with all major computer platforms (including Windows, Mac, and Linux). We tested Neuron_Morpho by reconstructing two neurons from each of two preparations representing different brain areas (hippocampus and cerebellum), neurite types (pyramidal cell dendrites and olivar axonal projection terminals), and labeling methods (rapid Golgi impregnation and anterograde dextran amine), and quantitatively comparing the resulting morphologies to those of the same cells reconstructed with the standard commercial system, Neurolucida. None of the numerous morphometric measures that were analyzed displayed any significant or systematic difference between the two reconstruction systems.
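    The cylinder-tree representation can be sketched in a few lines. The SWC-style rows below (id, type, x, y, z, radius, parent) form a made-up toy tree, not an actual Neuron_Morpho output; total cable length is one of the morphometric measures such files support.

```python
import math

# Minimal SWC-style reconstruction: rows of
# (id, type, x, y, z, radius, parent_id); parent -1 marks the root.
swc = [
    (1, 1,  0.0,  0.0,  0.0, 1.0, -1),  # soma
    (2, 3,  0.0,  0.0, 10.0, 0.5,  1),  # dendritic trunk
    (3, 3,  3.0,  4.0, 10.0, 0.4,  2),  # first branch after bifurcation
    (4, 3, -3.0, -4.0, 10.0, 0.4,  2),  # second branch
]

pos = {row[0]: row[2:5] for row in swc}

def cable_length(swc):
    """Total cable length: sum of each node's distance to its parent."""
    total = 0.0
    for nid, _, x, y, z, _, parent in swc:
        if parent != -1:
            total += math.dist((x, y, z), pos[parent])
    return total

print(cable_length(swc))  # 10 + 5 + 5 = 20
```

    The binary-tree-of-cylinders form makes such measures (length, branch angles, taper) simple traversals over parent links.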

  15. Reconstruction of magnetic resonance imaging by three-dimensional dual-dictionary learning.

    PubMed

    Song, Ying; Zhu, Zhen; Lu, Yang; Liu, Qiegen; Zhao, Jun

    2014-03-01

    To improve the magnetic resonance imaging (MRI) data acquisition speed while maintaining the reconstruction quality, a novel method is proposed for multislice MRI reconstruction from undersampled k-space data, based on compressed-sensing theory and using dictionary learning. The reconstruction quality is improved in two ways. First, spatial correlation among slices is exploited by extending the atoms in dictionary learning from patches to blocks. Second, the dictionary-learning scheme is used at two resolution levels; i.e., a low-resolution dictionary is used for sparse coding and a high-resolution dictionary is used for image updating. Numerical experiments are carried out on in vivo 3D MR images of brains and abdomens with a variety of undersampling schemes and ratios. The proposed method (dual-DLMRI) achieves better reconstruction quality than conventional reconstruction methods, with the peak signal-to-noise ratio being 7 dB higher. The advantages of the dual dictionaries are obvious compared with the single dictionary. Parameter variations ranging from 50% to 200% bias the image quality by less than 15% in terms of the peak signal-to-noise ratio. Dual-DLMRI effectively uses the a priori information in the dual-dictionary scheme and provides dramatically improved reconstruction quality. Copyright © 2013 Wiley Periodicals, Inc.

  16. ACTS: from ATLAS software towards a common track reconstruction software

    NASA Astrophysics Data System (ADS)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  17. Interleaved diffusion-weighted EPI improved by adaptive partial-Fourier and multi-band multiplexed sensitivity-encoding reconstruction

    PubMed Central

    Chang, Hing-Chiu; Guhaniyogi, Shayan; Chen, Nan-kuei

    2014-01-01

    Purpose: We report a series of techniques to reliably eliminate artifacts in interleaved echo-planar imaging (EPI) based diffusion-weighted imaging (DWI). Methods: First, we integrate the previously reported multiplexed sensitivity encoding (MUSE) algorithm with a new adaptive Homodyne partial-Fourier reconstruction algorithm, so that images reconstructed from interleaved partial-Fourier DWI data are free from artifacts even in the presence of either (a) motion-induced k-space energy peak displacement, or (b) fast phase changes induced by susceptibility field gradients. Second, we generalize the previously reported single-band MUSE framework to multi-band MUSE, so that both through-plane and in-plane aliasing artifacts in multi-band multi-shot interleaved DWI data can be effectively eliminated. Results: The new adaptive Homodyne-MUSE reconstruction algorithm reliably produces high-quality and high-resolution DWI, eliminating residual artifacts in images reconstructed with previously reported methods. Furthermore, the generalized MUSE algorithm is compatible with multi-band and high-throughput DWI. Conclusion: The integration of the multi-band and adaptive Homodyne-MUSE algorithms significantly improves the spatial resolution, image quality, and scan throughput of interleaved DWI. We expect that the reported reconstruction framework will play an important role in enabling high-resolution DWI for both neuroscience research and clinical uses. PMID:24925000
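    The idea behind partial-Fourier reconstruction can be sketched in one dimension: for a real-valued object, k-space is conjugate symmetric, so slightly more than half of the samples determine the rest. This toy sketch assumes a zero-phase object; the adaptive Homodyne step in the paper exists precisely because real acquisitions have nontrivial phase.

```python
import numpy as np

N = 64
rng = np.random.default_rng(1)
img = rng.random(N)              # real-valued 1-D "image" (zero phase)
k = np.fft.fft(img)

# Keep only samples 0..N/2 (just over half of k-space)
partial = np.zeros(N, dtype=complex)
partial[:N // 2 + 1] = k[:N // 2 + 1]

# Fill the missing half by conjugate symmetry: k[N-m] = conj(k[m])
for m in range(1, N // 2):
    partial[N - m] = np.conj(partial[m])

rec = np.fft.ifft(partial).real
print(np.max(np.abs(rec - img)))  # reconstruction error at machine precision
```

    With a nonzero object phase the symmetry breaks, which is why Homodyne methods first estimate a low-resolution phase map from the symmetrically sampled centre of k-space.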

  18. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
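    The first listed technique, replacing linear searches with binary versions, is easy to illustrate. The sorted energy grid below is invented; the point is that both lookups return the same interval index while the binary version scales as O(log n) instead of O(n).

```python
import bisect

# Made-up sorted energy grid (MeV), standing in for a cross-section table
energies = [0.01, 0.1, 1.0, 5.0, 10.0, 20.0]

def linear_lookup(grid, e):
    """Index i with grid[i] <= e < grid[i+1], by linear scan (O(n))."""
    i = 0
    while i + 1 < len(grid) and grid[i + 1] <= e:
        i += 1
    return i

def binary_lookup(grid, e):
    """Same interval index via binary search (O(log n))."""
    return max(bisect.bisect_right(grid, e) - 1, 0)

for e in [0.05, 0.5, 7.3, 19.9]:
    print(e, linear_lookup(energies, e), binary_lookup(energies, e))
```

    For the long, frequently searched tables in a transport inner loop, this substitution alone can be a significant fraction of the reported speed-ups.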

  19. The Athena Astrophysical MHD Code in Cylindrical Geometry

    NASA Astrophysics Data System (ADS)

    Skinner, M. A.; Ostriker, E. C.

    2011-10-01

    We have developed a method for implementing cylindrical coordinates in the Athena MHD code (Skinner & Ostriker 2010). The extension has been designed to alter the existing Cartesian-coordinates code (Stone et al. 2008) as minimally and transparently as possible. The numerical equations in cylindrical coordinates are formulated to maintain consistency with constrained transport, a central feature of the Athena algorithm, while making use of previously implemented code modules such as the eigensystems and Riemann solvers. Angular-momentum transport, which is critical in astrophysical disk systems dominated by rotation, is treated carefully. We describe modifications for cylindrical coordinates of the higher-order spatial reconstruction and characteristic evolution steps as well as the finite-volume and constrained-transport updates. Finally, we have developed a test suite of standard and novel problems in one, two, and three dimensions designed to validate our algorithms and implementation and to be of use to other code developers. The code is suitable for use in a wide variety of astrophysical applications and is freely available for download on the web.

  20. Critical roles for a genetic code alteration in the evolution of the genus Candida.

    PubMed

    Silva, Raquel M; Paredes, João A; Moura, Gabriela R; Manadas, Bruno; Lima-Costa, Tatiana; Rocha, Rita; Miranda, Isabel; Gomes, Ana C; Koerkamp, Marian J G; Perrot, Michel; Holstege, Frank C P; Boucherie, Hélian; Santos, Manuel A S

    2007-10-31

    During the last 30 years, several alterations to the standard genetic code have been discovered in various bacterial and eukaryotic species. Sense and nonsense codons have been reassigned or reprogrammed to expand the genetic code to selenocysteine and pyrrolysine. These discoveries highlight unexpected flexibility in the genetic code, but do not elucidate how the organisms survived the proteome chaos generated by codon identity redefinition. In order to shed new light on this question, we have reconstructed a Candida genetic code alteration in Saccharomyces cerevisiae and used a combination of DNA microarrays, proteomics and genetics approaches to evaluate its impact on gene expression, adaptation and sexual reproduction. This genetic manipulation blocked mating, locked yeast in a diploid state, remodelled gene expression and created stress cross-protection that generated adaptive advantages under environmental challenging conditions. This study highlights unanticipated roles for codon identity redefinition during the evolution of the genus Candida, and strongly suggests that genetic code alterations create genetic barriers that speed up speciation.

  1. Evaluation of the accuracy of the Rotating Parallel Ray Omnidirectional Integration for instantaneous pressure reconstruction from the measured pressure gradient

    NASA Astrophysics Data System (ADS)

    Moreto, Jose; Liu, Xiaofeng

    2017-11-01

    The accuracy of the Rotating Parallel Ray omnidirectional integration for pressure reconstruction from the measured pressure gradient (Liu et al., AIAA paper 2016-1049) is evaluated against both the Circular Virtual Boundary omnidirectional integration (Liu and Katz, 2006 and 2013) and the conventional Poisson equation approach. A Dirichlet condition at one boundary point and Neumann conditions at all other boundary points are applied to the Poisson solver. A direct numerical simulation database of isotropic turbulence (JHTDB), with homogeneously distributed random noise added to the entire field of the DNS pressure gradient, is used to assess the performance of the methods. The random noise, generated by the MATLAB function rand, has a magnitude varying randomly within the range of +/-40% of the maximum DNS pressure gradient. To account for the effect of the noise distribution pattern on the reconstructed pressure accuracy, a total of 1000 different noise distributions, generated with different random number seeds, are included in the evaluation. Final results after averaging the 1000 realizations show that the error of the reconstructed pressure, normalized by the DNS pressure variation range, is 0.15 +/-0.07 for the Poisson equation approach, 0.028 +/-0.003 for the Circular Virtual Boundary method, and 0.027 +/-0.003 for the Rotating Parallel Ray method, indicating the robustness of the Rotating Parallel Ray method in pressure reconstruction. Sponsor: The San Diego State University UGP program.
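    The flavor of gradient-integration pressure reconstruction can be sketched on a synthetic field. The analytic pressure field and the two-ray scheme below are illustrative only; the Rotating Parallel Ray method averages over many rotated parallel-ray orientations and treats boundary values far more carefully.

```python
import numpy as np

# Synthetic 2-D pressure field standing in for the DNS data
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n)
X, Y = np.meshgrid(x, x, indexing="ij")
p_true = np.sin(X) * np.cos(Y)
dx = x[1] - x[0]

# "Measured" pressure gradient via central differences (no noise added here)
px, py = np.gradient(p_true, dx, dx)

def integrate(grad, dx, axis, anchor):
    """Cumulative trapezoidal integration of one gradient component along
    one axis, anchored at a known boundary value."""
    g = np.moveaxis(grad, axis, -1)
    out = np.zeros_like(g)
    out[..., 1:] = np.cumsum(0.5 * (g[..., 1:] + g[..., :-1]), axis=-1) * dx
    return anchor + np.moveaxis(out, -1, axis)

# Two orthogonal parallel-ray families, then averaged (a crude stand-in
# for averaging over many rotated ray directions)
p_from_x = integrate(px, dx, 0, p_true[0:1, :])
p_from_y = integrate(py, dx, 1, p_true[:, 0:1])
p_rec = 0.5 * (p_from_x + p_from_y)

err = float(np.max(np.abs(p_rec - p_true)))
print(err)  # only discretization error remains for noise-free gradients
```

    With noisy gradients, averaging over many ray orientations (rather than two) is what suppresses the error accumulated along each integration path.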

  2. CHOLLA: A New Massively Parallel Hydrodynamics Code for Astrophysical Simulation

    NASA Astrophysics Data System (ADS)

    Schneider, Evan E.; Robertson, Brant E.

    2015-04-01

    We present Computational Hydrodynamics On ParaLLel Architectures (Cholla), a new three-dimensional hydrodynamics code that harnesses the power of graphics processing units (GPUs) to accelerate astrophysical simulations. Cholla models the Euler equations on a static mesh using state-of-the-art techniques, including the unsplit Corner Transport Upwind algorithm, a variety of exact and approximate Riemann solvers, and multiple spatial reconstruction techniques including the piecewise parabolic method (PPM). Using GPUs, Cholla evolves the fluid properties of thousands of cells simultaneously and can update over 10 million cells per GPU-second while using an exact Riemann solver and PPM reconstruction. Owing to the massively parallel architecture of GPUs and the design of the Cholla code, astrophysical simulations with physically interesting grid resolutions (≳256³) can easily be computed on a single device. We use the Message Passing Interface library to extend calculations onto multiple devices and demonstrate nearly ideal scaling beyond 64 GPUs. A suite of test problems highlights the physical accuracy of our modeling and provides a useful comparison to other codes. We then use Cholla to simulate the interaction of a shock wave with a gas cloud in the interstellar medium, showing that the evolution of the cloud is highly dependent on its density structure. We reconcile the computed mixing time of a turbulent cloud with a realistic density distribution destroyed by a strong shock with the existing analytic theory for spherical cloud destruction by describing the system in terms of its median gas density.

  3. Alternatively Constrained Dictionary Learning For Image Superresolution.

    PubMed

    Lu, Xiaoqiang; Yuan, Yuan; Yan, Pingkun

    2014-03-01

    Dictionaries are crucial in sparse-coding-based algorithms for image superresolution. Sparse coding is a typical unsupervised learning method for studying the relationship between the patches of high- and low-resolution images. However, most sparse coding methods for image superresolution fail to simultaneously consider the geometrical structure of the dictionary and the corresponding coefficients, which may result in noticeable superresolution reconstruction artifacts. In other words, when a low-resolution image and its corresponding high-resolution image are represented in their feature spaces, the two sets of dictionaries and the obtained coefficients have intrinsic links, which have not yet been well studied. Motivated by developments in nonlocal self-similarity and manifold learning, a novel sparse coding method is reported that preserves the geometrical structure of the dictionary and the sparse coefficients of the data. Moreover, the proposed method can preserve the incoherence of dictionary entries and provides the sparse coefficients and learned dictionary from a new perspective, with both reconstruction and discrimination properties to enhance the learning performance. Furthermore, to utilize the proposed model more effectively for single-image superresolution, this paper also proposes a novel dictionary-pair learning method, named two-stage dictionary training. Extensive experiments are carried out on a large set of images in comparison with other popular algorithms for the same purpose, and the results clearly demonstrate the effectiveness of the proposed sparse representation model and the corresponding dictionary learning algorithm.

  4. Progressive low-bitrate digital color/monochrome image coding by neuro-fuzzy clustering

    NASA Astrophysics Data System (ADS)

    Mitra, Sunanda; Meadows, Steven

    1997-10-01

    Color image coding at low bit rates is an area of research that is just being addressed in recent literature, since the problems of storage and transmission of color images are becoming more prominent in many applications. Current trends in image coding exploit the advantage of subband/wavelet decompositions in reducing the complexity of optimal scalar/vector quantizer (SQ/VQ) design. Compression ratios (CRs) of the order of 10:1 to 20:1 with high visual quality have been achieved by using vector quantization of subband-decomposed color images in perceptually weighted color spaces. We report the performance of a recently developed adaptive vector quantizer, namely AFLC-VQ, for effective reduction in bit rates while maintaining high visual quality of reconstructed color as well as monochrome images. For 24-bit color images, excellent visual quality is maintained up to a bit rate reduction to approximately 0.48 bpp (0.16 bpp for each color plane or monochrome image, CR 50:1) using the RGB color space. Further tuning of the AFLC-VQ and the addition of an entropy coder module after the VQ stage result in extremely low bit rates (CR 80:1) for good-quality reconstructed images. Our recent study also reveals that, for similar visual quality, the RGB color space requires fewer bits per pixel than either the YIQ or HSI color space for storing the same information when entropy coding is applied. AFLC-VQ outperforms other standard VQ and adaptive SQ techniques in retaining visual fidelity at similar bit rate reductions.
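    The basic VQ encode/decode step that adaptive quantizers build on can be sketched with a fixed toy codebook. The 2-D codebook and data below are invented; an adaptive scheme such as AFLC-VQ instead learns and refines the codebook from training vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed codebook of four 2-D codewords
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])

def vq_encode(vectors, codebook):
    """Index of the nearest codeword for each input vector."""
    d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(d, axis=1)

def vq_decode(indices, codebook):
    """Replace each index by its codeword."""
    return codebook[indices]

data = rng.random((100, 2))     # made-up 2-D feature vectors
idx = vq_encode(data, codebook) # 2 bits per vector instead of raw floats
rec = vq_decode(idx, codebook)
mse = float(np.mean(np.sum((data - rec) ** 2, axis=1)))
print(mse)
```

    The bit rate is set by the codebook size (log2 of the number of codewords per vector), which is why tuning the codebook, and following it with an entropy coder, drives the compression ratios quoted above.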

  5. Computer-Aided Design and Computer-Aided Manufacturing Hydroxyapatite/Epoxide Acrylate Maleic Compound Construction for Craniomaxillofacial Bone Defects.

    PubMed

    Zhang, Lei; Shen, Shunyao; Yu, Hongbo; Shen, Steve Guofang; Wang, Xudong

    2015-07-01

    The aim of this study was to investigate the use of computer-aided design and computer-aided manufacturing hydroxyapatite (HA)/epoxide acrylate maleic (EAM) compound construction artificial implants for craniomaxillofacial bone defects. Computed tomography, computer-aided design/computer-aided manufacturing and three-dimensional reconstruction, as well as rapid prototyping were performed in 12 patients between 2008 and 2013. The customized HA/EAM compound artificial implants were manufactured through selective laser sintering using a rapid prototyping machine into the exact geometric shapes of the defect. The HA/EAM compound artificial implants were then implanted during surgical reconstruction. Color-coded superimpositions demonstrated the discrepancy between the virtual plan and achieved results using Geomagic Studio. As a result, the HA/EAM compound artificial bone implants were perfectly matched with the facial areas that needed reconstruction. The postoperative aesthetic and functional results were satisfactory. The color-coded superimpositions demonstrated good consistency between the virtual plan and achieved results. The three-dimensional maximum deviation is 2.12 ± 0.65 mm and the three-dimensional mean deviation is 0.27 ± 0.07 mm. No facial nerve weakness or pain was observed at the follow-up examinations. Only one implant had to be removed, 2 months after the surgery, owing to severe local infection. No other complication was noted during the follow-up period. In conclusion, computer-aided, individually fabricated HA/EAM compound construction artificial implant was a good craniomaxillofacial surgical technique that yielded improved aesthetic results and functional recovery after reconstruction.

  6. Natural pixel decomposition for computational tomographic reconstruction from interferometric projection: algorithms and comparison

    NASA Astrophysics Data System (ADS)

    Cha, Don J.; Cha, Soyoung S.

    1995-09-01

    A computational tomographic technique, termed the variable grid method (VGM), has been developed for improving interferometric reconstruction of flow fields under ill-posed data conditions of restricted scanning and incomplete projection. The technique is based on natural pixel decomposition, that is, division of a field into variable grid elements. The performance of two algorithms, the original and revised versions, is compared to investigate the effects of the data redundancy criteria and seed-element-forming schemes. Tests of the VGMs are conducted through computer simulation of experiments and reconstruction of fields with a limited view angle of 90°. The temperature fields at two horizontal sections of a thermal plume of two interacting isothermal cubes, produced by a finite numerical code, are analyzed as test fields. The computer simulation demonstrates the superiority of the revised VGM over both the conventional fixed grid method and the original VGM. Both the maximum and average reconstruction errors are reduced appreciably. The reconstruction shows substantial improvement in the regions densely scanned by probing rays. These regions are usually of interest in engineering applications.

  7. Reconstructing evolutionary trees in parallel for massive sequences.

    PubMed

    Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam

    2017-12-14

    Building evolutionary trees for massive numbers of unaligned DNA sequences is challenging and crucial, and reconstructing evolutionary trees for ultra-large sequence sets is hard. Massive multiple sequence alignment is also challenging and time- and space-consuming. Hadoop and Spark, developed recently, shed new light on these classical computational biology problems. In this paper, we solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, developed in this paper, can process large DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can run HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g., Amazon Cloud). HPTree can support population evolution research and metagenomics analysis. We employ the Hadoop and Spark platforms to design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are performed in parallel, and a neighbour-joining model is employed for building the evolutionary tree. The software and source code are available at http://lab.malab.cn/soft/HPtree/ .
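    The neighbour-joining update at the heart of the tree-building stage can be sketched for a single join. The 5-taxon distance matrix is the classic textbook worked example, not HPTree data; HPTree iterates this step, in parallel, after clustering and alignment.

```python
import numpy as np

# Classic 5-taxon distance matrix (textbook worked example)
taxa = ["a", "b", "c", "d", "e"]
D = np.array([
    [0.0,  5.0,  9.0,  9.0, 8.0],
    [5.0,  0.0, 10.0, 10.0, 9.0],
    [9.0, 10.0,  0.0,  8.0, 7.0],
    [9.0, 10.0,  8.0,  0.0, 3.0],
    [8.0,  9.0,  7.0,  3.0, 0.0],
])

def nj_step(D):
    """One neighbour-joining iteration: pick the pair minimizing the
    Q-criterion and compute branch lengths to the new internal node."""
    n = D.shape[0]
    r = D.sum(axis=1)                       # row sums
    Q = (n - 2) * D - r[:, None] - r[None, :]
    np.fill_diagonal(Q, np.inf)             # never join a taxon with itself
    i, j = np.unravel_index(np.argmin(Q), Q.shape)
    li = D[i, j] / 2.0 + (r[i] - r[j]) / (2.0 * (n - 2))
    lj = D[i, j] - li
    return (int(i), int(j)), (float(li), float(lj))

pair, (li, lj) = nj_step(D)
print(taxa[pair[0]], taxa[pair[1]], li, lj)  # taxa a and b join first
```

    A full run repeats this step n-3 times on a shrinking matrix; the per-step work is what a parallel implementation distributes across workers.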

  8. Plasma stability analysis using Consistent Automatic Kinetic Equilibrium reconstruction (CAKE)

    NASA Astrophysics Data System (ADS)

    Roelofs, Matthijs; Kolemen, Egemen; Eldon, David; Glasser, Alex; Meneghini, Orso; Smith, Sterling P.

    2017-10-01

    Presented here is the Consistent Automatic Kinetic Equilibrium (CAKE) code. CAKE is being developed to perform real-time kinetic equilibrium reconstruction, aiming to complete a reconstruction in less than 100 ms. This is achieved by taking real-time Thomson Scattering (TS) and real-time Charge Exchange Recombination (CER, still in development) data into account in addition to real-time Motional Stark Effect (MSE) and magnetics data. Electron density and temperature are determined by TS, while ion density and pressure are determined using CER. Together with the temperature and density of neutrals, these form the additional pressure constraints. Extra current constraints are imposed in the core by the MSE diagnostics. The pedestal current density is estimated using Sauter's formula for the bootstrap current density. By comparing the behaviour of the ideal MHD perturbed potential energy (δW) and the linear stability index (Δ') of CAKE to magnetics-only reconstruction, it can be seen that using diagnostics to reconstruct the pedestal has a large effect on stability. Supported by U.S. DOE DE-SC0015878 and DE-FC02-04ER54698.

  9. Tomographic image reconstruction using the cell broadband engine (CBE) general purpose hardware

    NASA Astrophysics Data System (ADS)

    Knaup, Michael; Steckmann, Sven; Bockenbach, Olivier; Kachelrieß, Marc

    2007-02-01

    Tomographic image reconstruction, such as the reconstruction of CT projection values, of tomosynthesis data, or of PET or SPECT events, is computationally very demanding. In filtered backprojection as well as in iterative reconstruction schemes, the most time-consuming steps are forward- and backprojection, which are often limited by the memory bandwidth. Recently, a novel general-purpose architecture optimized for distributed computing became available: the Cell Broadband Engine (CBE). Its eight synergistic processing elements (SPEs) currently allow for a theoretical performance of 192 GFlops (3 GHz, 8 units, 4 floats per vector, 2 instructions, multiply and add, per clock). To maximize image reconstruction speed, we modified our parallel-beam and perspective backprojection algorithms, which are highly optimized for standard PCs, and optimized the code for the CBE processor. In addition, we implemented an optimized perspective forward projection on the CBE, which allows us to perform statistical image reconstructions such as the ordered subset convex (OSC) algorithm. Performance was measured using simulated data with 512 projections per rotation and 512² detector elements. The data were backprojected into an image of 512³ voxels using our PC-based approaches and the new CBE-based algorithms. Both the PC and the CBE timings were scaled to a 3 GHz clock frequency. On the CBE, we obtain total reconstruction times of 4.04 s for the parallel backprojection, 13.6 s for the perspective backprojection, and 192 s for a complete OSC reconstruction, consisting of one initial Feldkamp reconstruction followed by 4 OSC iterations.
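    The backprojection kernel being accelerated can be sketched as a minimal unfiltered parallel-beam version on a made-up point-source sinogram; the production code adds filtering, perspective geometry, and heavy SPE-specific optimization.

```python
import numpy as np

N = 65                        # image size (odd, so there is a centre pixel)
n_views = 180
angles = np.deg2rad(np.arange(n_views))

# Sinogram of a point object at the image centre: every view projects the
# point onto the central detector bin (the detector has N bins).
sino = np.zeros((n_views, N))
sino[:, N // 2] = 1.0

# Pixel coordinates relative to the image centre
c = (N - 1) / 2.0
ys, xs = np.mgrid[0:N, 0:N]
xs = xs - c
ys = ys - c

# Smear each projection back across the image along its view angle and sum
img = np.zeros((N, N))
for a, proj in zip(angles, sino):
    t = xs * np.cos(a) + ys * np.sin(a) + c   # detector coordinate per pixel
    t0 = np.clip(np.floor(t).astype(int), 0, N - 2)
    w = t - t0                                # linear interpolation weight
    img += (1.0 - w) * proj[t0] + w * proj[t0 + 1]
img /= n_views

print(img[N // 2, N // 2])    # peaks at the point's true location
```

    The inner loop is pure streaming arithmetic over all pixels for every view, which is why memory bandwidth, vectorization, and parallel distribution across processing elements dominate its performance.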

  10. Space communication system for compressed data with a concatenated Reed-Solomon-Viterbi coding channel

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Hilbert, E. E. (Inventor)

    1976-01-01

    A space communication system incorporating a concatenated Reed-Solomon-Viterbi coding channel is discussed for transmitting compressed and uncompressed data from a spacecraft to a data processing center on Earth. Imaging (and other) data are first compressed into source blocks which are then coded by a Reed-Solomon coder and interleaver, followed by a convolutional encoder. The received data are first decoded by a Viterbi decoder, followed by a Reed-Solomon decoder and deinterleaver. The output of the latter is then decompressed, based on the compression criteria used in compressing the data in the spacecraft. The decompressed data are processed to reconstruct an approximation of the original data-producing condition or images.
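
    The interleaver between the Reed-Solomon coder and the convolutional encoder is what lets the outer code cope with the bursty errors a Viterbi decoder produces. A toy block interleaver/deinterleaver sketch (the depth and symbol layout are illustrative, not the parameters of the actual system):

```python
def interleave(symbols, depth):
    """Block interleaver: conceptually write `depth` rows in sequence,
    then read out column by column. Toy parameters for illustration,
    not the interleaving depth of the actual spacecraft system.
    """
    n = len(symbols)
    assert n % depth == 0, "length must be a multiple of the depth"
    width = n // depth
    # Read column c of the row-major layout, for all columns in turn.
    return [symbols[r * width + c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth):
    """Inverse permutation: restore the original symbol order."""
    n = len(symbols)
    width = n // depth
    out = [None] * n
    for i, s in enumerate(symbols):
        c, r = divmod(i, depth)
        out[r * width + c] = s
    return out
```

    A channel burst of `depth` consecutive symbols then lands in `depth` different Reed-Solomon codewords, each of which sees only a single symbol error.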

  11. A denoising algorithm for CT image using low-rank sparse coding

    NASA Astrophysics Data System (ADS)

    Lei, Yang; Xu, Dong; Zhou, Zhengyang; Wang, Tonghe; Dong, Xue; Liu, Tian; Dhabaan, Anees; Curran, Walter J.; Yang, Xiaofeng

    2018-03-01

    We propose a denoising method for CT images based on low-rank sparse coding. The proposed method constructs an adaptive dictionary of image patches and estimates the sparse coding regularization parameters using a Bayesian interpretation. A low-rank approximation approach is used to simultaneously construct the dictionary and achieve sparse representation through clustering of similar image patches. A variable-splitting scheme and a quadratic optimization are used to reconstruct the CT image from the obtained sparse coefficients. We tested this denoising method using phantom, brain, and abdominal CT images. The experimental results show that the proposed method delivers state-of-the-art denoising performance, in terms of both objective criteria and visual quality.
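
    The low-rank step at the heart of such methods can be sketched as a truncated SVD applied to a group of similar, vectorized patches; the adaptive dictionary construction and the Bayesian estimation of the regularization parameters described in the abstract are omitted here:

```python
import numpy as np

def lowrank_denoise_group(patch_group, rank):
    """Denoise a group of similar patches by truncated SVD.

    patch_group: (n_patches, patch_size) matrix of vectorized,
    mutually similar patches. Keeping only the top `rank` singular
    values suppresses incoherent noise while preserving the shared
    structure. A sketch of the low-rank step only.
    """
    U, s, Vt = np.linalg.svd(patch_group, full_matrices=False)
    s[rank:] = 0.0  # hard-threshold the spectrum to the target rank
    return (U * s) @ Vt
```

    In the full method the clustering supplies the patch groups and the retained rank is chosen adaptively rather than fixed.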

  12. Squeal Those Tires! Automobile-Accident Reconstruction.

    ERIC Educational Resources Information Center

    Caples, Linda Griffin

    1992-01-01

    Methods used to reconstruct traffic accidents provide settings for real-life applications for students in precalculus, mathematical analysis, or trigonometry. Described is the investigation of an accident in conjunction with the local Highway Patrol Academy, integrating physics, vectors, and trigonometry. Class findings were compared with those of…

  13. An integrated approach: managing resources for post-disaster reconstruction.

    PubMed

    Chang, Yan; Wilkinson, Suzanne; Brunsdon, David; Seville, Erica; Potangaroa, Regan

    2011-10-01

    A lack of resources for post-disaster housing reconstruction significantly limits the prospects for successful recovery. Following the earthquake in Wenchuan, China, in May 2008, housing reconstruction was not immune to resource shortages and price inflation. Difficulties in sourcing materials and labour considerably impeded recovery. This paper provides evidence of the resourcing bottlenecks inherent in the post-Wenchuan earthquake reconstruction process. Its aim is to present an integrated planning framework for managing resources for post-disaster housing rebuilding. The results are drawn from in-field surveys that highlight the areas where stakeholders need to concentrate effort, including revising legislation and policy, enhancing capacity for rebuilding in the construction industry, strengthening the transportation network, restructuring market mechanisms, and incorporating environmental considerations into overall planning. Although the case study presented here is country-specific, it is hoped that the findings provide a basis for future research to identify resourcing constraints and solutions in other disaster contexts. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  14. Fast High Resolution Volume Carving for 3D Plant Shoot Reconstruction

    PubMed Central

    Scharr, Hanno; Briese, Christoph; Embgenbroich, Patrick; Fischbach, Andreas; Fiorani, Fabio; Müller-Linow, Mark

    2017-01-01

    Volume carving is a well-established method for visual hull reconstruction and has been successfully applied in plant phenotyping, especially for 3D reconstruction of small plants and seeds. When imaging larger plants at still relatively high spatial resolution (≤1 mm), well-known implementations become slow or have prohibitively large memory needs. Here we present and evaluate a computationally efficient algorithm for volume carving, allowing e.g., 3D reconstruction of plant shoots. It combines a well-known multi-grid representation called “Octree” with an efficient image region integration scheme called “Integral image.” Speedup with respect to less efficient octree implementations is about two orders of magnitude, due to the introduced refinement strategy “Mark and refine.” Speedup is about a factor of 1.6 compared to a highly optimized GPU implementation using equidistant voxel grids, even without using any parallelization. We demonstrate the application of this method for trait derivation of banana and maize plants. PMID:29033961
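
    The "integral image" ingredient is what makes the per-node silhouette test cheap: after one precomputation pass, any rectangular region of the silhouette mask can be summed in four lookups, so a projected octree cell can be classified as fully inside, fully outside, or mixed (triggering refinement) in O(1). A minimal sketch:

```python
import numpy as np

def integral_image(mask):
    """Summed-area table with a zero top row and left column,
    so region sums need no special casing at the borders."""
    ii = np.zeros((mask.shape[0] + 1, mask.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(mask, axis=0), axis=1)
    return ii

def region_sum(ii, r0, c0, r1, c1):
    """Sum of mask[r0:r1, c0:c1] in O(1): four lookups regardless of
    region size. Comparing this sum to 0 and to the region area gives
    the outside / inside / mixed classification volume carving needs."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
```

    This is a generic summed-area-table sketch of the idea, not the paper's implementation.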

  15. Integrated large view angle hologram system with multi-slm

    NASA Astrophysics Data System (ADS)

    Yang, ChengWei; Liu, Juan

    2017-10-01

    Recently, holographic display has attracted much attention for its ability to generate real-time 3D reconstructed images. Computer-generated holography (CGH) provides an effective way to produce holograms, and a spatial light modulator (SLM) is used to reconstruct the image. However, the reconstructing system is usually heavy and complex, and the view angle is limited by the pixel size and spatial bandwidth product (SBP) of the SLM. In this paper a lightweight, portable holographic display system is proposed by integrating the optical elements and host computer units, which significantly reduces the space taken in the horizontal direction. The CGH is produced based on Fresnel diffraction and the point-source method. To reduce memory usage and image distortion, we use an optimized accurate compressed look-up table method (AC-LUT) to compute the hologram. In the system, six SLMs are concatenated into a curved plane, each one loading the phase-only hologram of a different angle of the object, so that the horizontal view angle of the reconstructed image can be expanded to about 21.8°.

  16. Performance of missing transverse momentum reconstruction in proton-proton collisions at $$\\sqrt{s} = 7~\\mbox{TeV}$$ with ATLAS

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2012-01-03

    The measurement of missing transverse momentum in the ATLAS detector, described in this paper, makes use of the full event reconstruction and a calibration based on reconstructed physics objects. The performance of the missing transverse momentum reconstruction is evaluated using data collected in pp collisions at a centre-of-mass energy of 7 TeV in 2010. Minimum bias events and events with jets of hadrons are used from data samples corresponding to an integrated luminosity of about 0.3 nb⁻¹ and 600 nb⁻¹ respectively, together with events containing a Z boson decaying to two leptons (electrons or muons) or a W boson decaying to a lepton (electron or muon) and a neutrino, from a data sample corresponding to an integrated luminosity of about 36 pb⁻¹. In conclusion, an estimate of the systematic uncertainty on the missing transverse momentum scale is presented.
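
    Conceptually, missing transverse momentum is the negative vector sum of the transverse momenta of the reconstructed (and calibrated) objects in the event. A textbook sketch of that definition, assuming a plain list of (pT, φ) pairs rather than the full ATLAS object-based calibration chain:

```python
import math

def missing_et(objects):
    """Missing transverse momentum from a list of (pt, phi) pairs.

    E_T^miss is minus the vector sum of the transverse momenta;
    returns (magnitude, azimuthal angle). This is the textbook
    definition only, not the ATLAS object-based calibration.
    """
    mpx = -sum(pt * math.cos(phi) for pt, phi in objects)
    mpy = -sum(pt * math.sin(phi) for pt, phi in objects)
    return math.hypot(mpx, mpy), math.atan2(mpy, mpx)
```

    A perfectly balanced event gives zero; a single unbalanced object yields missing momentum of equal magnitude pointing opposite to it, as in W → lν events.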

  17. Phylogenetic relationships of Hemiptera inferred from mitochondrial and nuclear genes.

    PubMed

    Song, Nan; Li, Hu; Cai, Wanzhi; Yan, Fengming; Wang, Jianyun; Song, Fan

    2016-11-01

    Here, we reconstructed the phylogeny of Hemiptera based on the expanded mitochondrial protein-coding genes and the nuclear 18S rRNA gene, separately. Differential rates of change across lineages may be associated with long-branch attraction (LBA) effects and result in conflicting estimates of phylogeny from different types of data. To reduce the potential effects of systematic biases on inferences of topology, various data coding schemes, a site removal method, and different algorithms were utilized in phylogenetic reconstruction. We show that the outgroups Phthiraptera and Thysanoptera and the ingroup Sternorrhyncha share similar base composition, and exhibit "long branches" relative to other hemipterans. Thus, long-branch attraction between these groups is suspected to cause the failure to recover Hemiptera under the homogeneous model. In contrast, a monophyletic Hemiptera is supported when a heterogeneous model is utilized in the analysis. Although higher-level phylogenetic relationships within Hemiptera remain to be answered, consensus between analyses is beginning to converge on a stable phylogeny.

  18. Calculated organ doses for Mayak production association central hall using ICRP and MCNP.

    PubMed

    Choe, Dong-Ok; Shelkey, Brenda N; Wilde, Justin L; Walk, Heidi A; Slaughter, David M

    2003-03-01

    As part of an ongoing dose reconstruction project, equivalent organ dose rates from photons and neutrons were estimated using the energy spectra measured in the central hall above the graphite reactor core located in the Russian Mayak Production Association facility. Reconstruction of the work environment was necessary due to the lack of personal dosimeter data for neutrons in the time period prior to 1987. A typical worker scenario for the central hall was developed for the Monte Carlo Neutron Photon-4B (MCNP) code. The resultant equivalent dose rates for neutrons and photons were compared with the equivalent dose rates derived from calculations using the conversion coefficients in the International Commission on Radiological Protection Publications 51 and 74 in order to validate the model scenario for this Russian facility. The MCNP results were in good agreement with the results of the ICRP publications indicating the modeling scenario was consistent with actual work conditions given the spectra provided. The MCNP code will allow for additional orientations to accurately reflect source locations.

  19. [Artistic dream reconstructions of the Wolf Man in the light of experimental findings].

    PubMed

    Leuschner, W; Hau, S

    1995-07-01

    The authors take the famous dream picture by the "wolf-man" as the starting point for experimental research of their own and relate the one to the other. They see the significance of pictorial representations of dreams less in the supplementary information they convey about the "text" of dreams than in the fact that they give separate expression to motility-based "gesture-associated" memories and ideas. These need to be distinguished from "language-associated" memories, as there are dissociated modes of encoding experience underlying verbal and pictorial representation. The splitting of these modes into different part-codes is an important component of repression. In addition, the pictorial representation of dreams enables the patient to re-associate part-codes and thus achieve a "deferred" reconstruction and complementation of significant experiences. In the authors' view, pictorial representation is favorable to the discovery of repressed material; it is, however, as yet impossible to say exactly what relationship there is between verbalized and pictorial representation in the context of analysis.

  20. Fostering integrity in postgraduate research: an evidence-based policy and support framework.

    PubMed

    Mahmud, Saadia; Bretag, Tracey

    2014-01-01

    Postgraduate research students have a unique position in the debate on integrity in research as students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine particular Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity maintained across the university at all levels of scholarship and for all stakeholders.

  1. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    DOE PAGES

    Richard, Joshua; Galloway, Jack; Fensin, Michael; ...

    2015-04-04

    A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.

  2. The New CCSDS Image Compression Recommendation

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu; Armbruster, Philippe; Kiely, Aaron; Masschelein, Bart; Moury, Gilles; Schaefer, Christoph

    2005-01-01

    The Consultative Committee for Space Data Systems (CCSDS) data compression working group has recently adopted a recommendation for image data compression, with a final release expected in 2005. The algorithm adopted in the recommendation consists of a two-dimensional discrete wavelet transform of the image, followed by progressive bit-plane coding of the transformed data. The algorithm can provide both lossless and lossy compression, and allows a user to directly control the compressed data volume or the fidelity with which the wavelet-transformed data can be reconstructed. The algorithm is suitable for both frame-based image data and scan-based sensor data, and has applications for near-Earth and deep-space missions. The standard will be accompanied by free software sources on a future web site. An Application-Specific Integrated Circuit (ASIC) implementation of the compressor is currently under development. This paper describes the compression algorithm along with the requirements that drove the selection of the algorithm. Performance results and comparisons with other compressors are given for a test set of space images.
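
    The recommendation's structure, a 2D discrete wavelet transform followed by progressive bit-plane coding of the coefficients, can be illustrated with a single-level 2D Haar transform standing in for the 9/7 wavelet the standard actually specifies:

```python
import numpy as np

def haar2d(img):
    """One level of an orthonormal 2D Haar wavelet transform.

    Returns the (LL, LH, HL, HH) subbands. This illustrates the
    transform-then-code structure only; the CCSDS recommendation
    specifies a 9/7 wavelet and bit-plane coding of the subbands.
    """
    a = img.astype(float)
    # Horizontal pass: sums and differences of pixel pairs.
    lo = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    hi = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    # Vertical pass on each half.
    LL = (lo[0::2] + lo[1::2]) / np.sqrt(2)
    HL = (lo[0::2] - lo[1::2]) / np.sqrt(2)
    LH = (hi[0::2] + hi[1::2]) / np.sqrt(2)
    HH = (hi[0::2] - hi[1::2]) / np.sqrt(2)
    return LL, LH, HL, HH
```

    Because the transform is orthonormal it preserves energy while concentrating it in the LL band, which is what makes truncating the coded bit-planes a graceful way to trade fidelity for data volume.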

  3. From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations

    NASA Astrophysics Data System (ADS)

    Chun, K.; Kemeny, J.

    2017-12-01

    Light detection and ranging (LIDAR) has become a prevalent remote-sensing technology in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for generating three-dimensional numerical models for FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach that integrates LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into finite element models. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed LIDAR-based workflow is effective and provides a reference for similar engineering projects in practice.

  4. Integrating structure-from-motion photogrammetry with geospatial software as a novel technique for quantifying 3D ecological characteristics of coral reefs

    PubMed Central

    Delparte, D; Gates, RD; Takabayashi, M

    2015-01-01

    The structural complexity of coral reefs plays a major role in the biodiversity, productivity, and overall functionality of reef ecosystems. Conventional metrics with 2-dimensional properties are inadequate for characterization of reef structural complexity. A 3-dimensional (3D) approach can better quantify topography, rugosity and other structural characteristics that play an important role in the ecology of coral reef communities. Structure-from-Motion (SfM) is an emerging low-cost photogrammetric method for high-resolution 3D topographic reconstruction. This study utilized SfM 3D reconstruction software tools to create textured mesh models of a reef at French Frigate Shoals, an atoll in the Northwestern Hawaiian Islands. The reconstructed orthophoto and digital elevation model were then integrated with geospatial software in order to quantify metrics pertaining to 3D complexity. The resulting data provided high-resolution physical properties of coral colonies that were then combined with live cover to accurately characterize the reef as a living structure. The 3D reconstruction of reef structure and complexity can be integrated with other physiological and ecological parameters in future research to develop reliable ecosystem models and improve capacity to monitor changes in the health and function of coral reef ecosystems. PMID:26207190

  5. Efficient space-time sampling with pixel-wise coded exposure for high-speed imaging.

    PubMed

    Liu, Dengyu; Gu, Jinwei; Hitomi, Yasunobu; Gupta, Mohit; Mitsunaga, Tomoo; Nayar, Shree K

    2014-02-01

    Cameras face a fundamental trade-off between spatial and temporal resolution. Digital still cameras can capture images with high spatial resolution, but most high-speed video cameras have relatively low spatial resolution. It is hard to overcome this trade-off without incurring a significant increase in hardware costs. In this paper, we propose techniques for sampling, representing, and reconstructing the space-time volume to overcome this trade-off. Our approach has two important distinctions compared to previous works: 1) We achieve sparse representation of videos by learning an overcomplete dictionary on video patches, and 2) we adhere to practical hardware constraints on sampling schemes imposed by architectures of current image sensors, which means that our sampling function can be implemented on CMOS image sensors with modified control units in the future. We evaluate components of our approach, sampling function and sparse representation, by comparing them to several existing approaches. We also implement a prototype imaging system with pixel-wise coded exposure control using a liquid crystal on silicon device. System characteristics such as field of view and modulation transfer function are evaluated for our imaging system. Both simulations and experiments on a wide range of scenes show that our method can effectively reconstruct a video from a single coded image while maintaining high spatial resolution.
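
    The sampling side of such an approach can be sketched as a per-pixel exposure window: each pixel integrates only its own contiguous run of frames, collapsing the space-time volume into a single coded image. The mask layout below is an illustrative assumption, and the dictionary-based sparse reconstruction is a separate step omitted here:

```python
import numpy as np

def coded_exposure(volume, on_start, on_len):
    """Collapse a video volume (T, H, W) into one coded image.

    Each pixel (y, x) integrates only frames
    [on_start[y, x], on_start[y, x] + on_len): a per-pixel
    single-bump exposure code of the kind a spatial light modulator
    or modified sensor control unit could implement.
    """
    T, H, W = volume.shape
    t = np.arange(T)[:, None, None]          # frame index per slice
    mask = (t >= on_start[None]) & (t < (on_start + on_len)[None])
    return (volume * mask).sum(axis=0)       # one coded frame
```

    Recovering the full (T, H, W) volume from this single frame is then posed as sparse coding over a learned overcomplete dictionary of video patches.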

  6. Transform coding for space applications

    NASA Technical Reports Server (NTRS)

    Glover, Daniel

    1993-01-01

    Data compression coding requirements for aerospace applications differ somewhat from the compression requirements for entertainment systems. Entertainment applications are bit-rate driven, with the goal of getting the best quality possible with a given bandwidth. Science applications, by contrast, are quality driven, with the goal of getting the lowest bit rate for a given level of reconstruction quality. In the past, the required quality level has been nothing less than perfect, allowing only the use of lossless compression methods (if that). With the advent of better, faster, cheaper missions, an opportunity has arisen for lossy data compression methods to find a use in science applications as requirements for perfect-quality reconstruction run into cost constraints. This paper presents a review of the data compression problem from the space application perspective. Transform coding techniques are described and some simple, integer transforms are presented. The application of these transforms to space-based data compression problems is discussed. Integer transforms have an advantage over conventional transforms in computational complexity. Space applications differ from broadcast or entertainment in that it is desirable to have a simple encoder (in space) and tolerate a more complicated decoder (on the ground) rather than vice versa. Energy compaction of the new transforms is compared with the Walsh-Hadamard (WHT), Discrete Cosine (DCT), and Integer Cosine (ICT) transforms.
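
    The Walsh-Hadamard transform used in the comparison is a good example of why integer transforms suit a simple on-board encoder: it needs only additions and subtractions. A minimal sketch of the fast (butterfly) WHT, unnormalized so that all arithmetic stays in integers:

```python
def fwht(x):
    """Fast Walsh-Hadamard transform (unnormalized, length a power
    of two). Uses only additions and subtractions, which is the
    computational appeal of the WHT for a simple spaceborne encoder."""
    x = list(x)
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                # Butterfly: sum and difference of paired entries.
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x
```

    Applying the transform twice returns the input scaled by n, so the same add/subtract kernel serves as its own (scaled) inverse; for a smooth block the energy compacts into the first coefficient.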

  7. Modeling MHD Equilibrium and Dynamics with Non-Axisymmetric Resistive Walls in LTX and HBT-EP

    NASA Astrophysics Data System (ADS)

    Hansen, C.; Levesque, J.; Boyle, D. P.; Hughes, P.

    2017-10-01

    In experimental magnetized plasmas, currents in the first wall, vacuum vessel, and other conducting structures can have a strong influence on plasma shape and dynamics. These effects are complicated by the 3D nature of these structures, which dictate available current paths. Results from simulations to study the effect of external currents on plasmas in two different experiments will be presented: 1) The arbitrary geometry, 3D extended MHD code PSI-Tet is applied to study linear and non-linear plasma dynamics in the High Beta Tokamak (HBT-EP) focusing on toroidal asymmetries in the adjustable conducting wall. 2) Equilibrium reconstructions of the Lithium Tokamak eXperiment (LTX) in the presence of non-axisymmetric eddy currents. An axisymmetric model is used to reconstruct the plasma equilibrium, using the PSI-Tri code, along with a set of fixed 3D eddy current distributions in the first wall and vacuum vessel [C. Hansen et al., PoP Apr. 2017]. Simulations of detailed experimental geometries are enabled by use of the PSI-Tet code, which employs a high order finite element method on unstructured tetrahedral grids that are generated directly from CAD models. Further development of PSI-Tet and PSI-Tri will also be presented. This work supported by US DOE contract DE-SC0016256.

  8. Multi-ray-based system matrix generation for 3D PET reconstruction

    NASA Astrophysics Data System (ADS)

    Moehrs, Sascha; Defrise, Michel; Belcari, Nicola; DelGuerra, Alberto; Bartoli, Antonietta; Fabbri, Serena; Zanetti, Gianluigi

    2008-12-01

    Iterative image reconstruction algorithms for positron emission tomography (PET) require a sophisticated system matrix (model) of the scanner. Our aim is to set up such a model offline for the YAP-(S)PET II small animal imaging tomograph in order to use it subsequently with standard ML-EM (maximum-likelihood expectation maximization) and OSEM (ordered subset expectation maximization) for fully three-dimensional image reconstruction. In general, the system model can be obtained analytically, via measurements or via Monte Carlo simulations. In this paper, we present the multi-ray method, which can be considered as a hybrid method to set up the system model offline. It incorporates accurate analytical (geometric) considerations as well as crystal depth and crystal scatter effects. At the same time, it has the potential to model seamlessly other physical aspects such as the positron range. The proposed method is based on multiple rays which are traced from/to the detector crystals through the image volume. Such a ray-tracing approach itself is not new; however, we derive a novel mathematical formulation of the approach and investigate the positioning of the integration (ray-end) points. First, we study single system matrix entries and show that the positioning and weighting of the ray-end points according to Gaussian integration give better results compared to equally spaced integration points (trapezoidal integration), especially if only a small number of integration points (rays) are used. Additionally, we show that, for a given variance of the single matrix entries, the number of rays (events) required to calculate the whole matrix is a factor of 20 larger when using a pure Monte-Carlo-based method. Finally, we analyse the quality of the model by reconstructing phantom data from the YAP-(S)PET II scanner.
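
    The finding that Gauss-placed, Gauss-weighted ray-end points outperform equally spaced ones for a small number of rays mirrors standard quadrature behavior. A minimal comparison on a smooth 1D integrand (illustrative only, not the actual system-matrix line integrals):

```python
import numpy as np

def gauss_quad(f, a, b, n):
    """n-point Gauss-Legendre quadrature of f on [a, b]."""
    x, w = np.polynomial.legendre.leggauss(n)
    xm, xr = (a + b) / 2.0, (b - a) / 2.0
    return xr * np.sum(w * f(xm + xr * x))

def trapezoid_quad(f, a, b, n):
    """Trapezoidal rule with n equally spaced points on [a, b]."""
    x = np.linspace(a, b, n)
    y = f(x)
    h = (b - a) / (n - 1)
    return h * (y.sum() - 0.5 * (y[0] + y[-1]))
```

    For the same handful of evaluation points, the Gauss rule is exact for polynomials of degree 2n-1 while the trapezoidal rule is only second order; this is the effect the abstract reports for single matrix entries with few rays.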

  9. Reverse engineering physical models employing a sensor integration between 3D stereo detection and contact digitization

    NASA Astrophysics Data System (ADS)

    Chen, Liang-Chia; Lin, Grier C. I.

    1997-12-01

    A vision-driven automatic digitization process for free-form surface reconstruction has been developed for reverse engineering physical models, using a coordinate measurement machine (CMM) equipped with a touch-triggered probe and a CCD camera. The process integrates 3D stereo detection, data filtering, Delaunay triangulation, and adaptive surface digitization into a single process of surface reconstruction. By using this innovative approach, surface reconstruction can be implemented automatically and accurately. Least-squares B-spline surface models with controlled digitization accuracy can be generated for further application in product design and manufacturing processes. One industrial application indicates that this approach is feasible and that the processing time required in the reverse engineering process can be reduced by more than 85%.

  10. Three Dimensional High-Resolution Reconstruction of the Ionosphere Over the Very Large Array

    DTIC Science & Technology

    2010-12-15

    Watts Progress Report, Dec 10. Final Report: Three Dimensional High-Resolution Reconstruction of the Ionosphere over the Very Large Array. The goal of the proposed research is to reconstruct the three-dimensional regional electron density profile of Earth's ionosphere with a spatial resolution of better than 10 km. The Very Large Array offers 10x better sensitivity to total electron content (TEC, or chord-integrated density) in the ionosphere than does GPS.

  11. Biomaterials in craniofacial reconstruction.

    PubMed

    Cho, Younghoon R; Gosain, Arun K

    2004-07-01

    Biomaterials have become an integral component of craniofacial reconstruction. Their increasing ease of use, long "shelf-life," and safety enables them to be used effectively and play an important role in reducing operating times. There are various biomaterials currently available and specific usages have been characterized well in the literature. This article reviews different biomaterials that can be used in craniofacial reconstruction, including autogenous bone, methyl methacrylate and hard tissue replacement, hydroxyapatite, porous polyethylene, bioactive glass, and demineralized bone.

  12. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette

    2014-06-01

    The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.

  13. Anterior Cruciate Ligament-Derived Stem Cells Transduced With BMP2 Accelerate Graft-Bone Integration After ACL Reconstruction.

    PubMed

    Kawakami, Yohei; Takayama, Koji; Matsumoto, Tomoyuki; Tang, Ying; Wang, Bing; Mifune, Yutaka; Cummins, James H; Warth, Ryan J; Kuroda, Ryosuke; Kurosaka, Masahiro; Fu, Freddie H; Huard, Johnny

    2017-03-01

    Strong graft-bone integration is a prerequisite for successful graft remodeling after reconstruction of the anterior cruciate ligament (ACL) using soft tissue grafts. Novel strategies to accelerate soft tissue graft-bone integration are needed to reduce the need for bone-tendon-bone graft harvest, reduce patient convalescence, facilitate rehabilitation, and reduce total recovery time after ACL reconstruction. The application of ACL-derived stem cells with enhanced expression of bone morphogenetic protein 2 (BMP2) onto soft tissue grafts in the form of cell sheets will both accelerate and improve the quality of graft-bone integration after ACL reconstruction in a rat model. Controlled laboratory study. ACL-derived CD34+ cells were isolated from remnant human ACL tissues, virally transduced to express BMP2, and embedded within cell sheets. In a rat model of ACL injury, bilateral single-bundle ACL reconstructions were performed, in which cell sheets were wrapped around tendon autografts before reconstruction. Four groups containing a total of 48 rats (96 knees) were established (n = 12 rats; 24 knees per group): CD34+BMP2 (100%), CD34+BMP2 (25%), CD34+ (untransduced), and a control group containing no cells. Six rats from each group were euthanized 2 and 4 weeks after surgery, and each graft was harvested for immunohistochemical and histological analyses. The remaining 6 rats in each group were euthanized at 4 and 8 weeks to evaluate in situ tensile load to failure in each femur-graft-tibia complex. In vitro, BMP2 transduction promoted the osteogenic differentiation of ACL-derived CD34+ cells while retaining their intrinsic multipotent capabilities. Osteoblast densities were greatest in the BMP2 (100%) and BMP2 (25%) groups. Bone tunnels in the CD34+BMP2 (100%) and CD34+BMP2 (25%) groups had the smallest cross-sectional areas according to micro-computed tomography analyses. Graft-bone integration occurred most rapidly in the CD34+BMP2 (25%) group. 
Tensile load to failure was significantly greater in the groups containing stem cells at 4 and 8 weeks after surgery. Tensile strength was greatest in the CD34+BMP2 (100%) group at 4 weeks, and in the CD34+BMP2 (25%) group at 8 weeks. ACL-derived CD34+ cells transduced with BMP2 accelerated graft-bone integration after ACL reconstruction using soft tissue autografts in a rat model, as evidenced by improved histological appearance and graft-bone interface biology along with tensile load to failure at each time point up to 8 weeks after surgery. A primary disadvantage of using soft tissue grafts for ACL reconstruction is the prolonged time required for bony ingrowth, which delays the initiation of midsubstance graft remodeling. The lack of consistent correlation between the appearance of a "healed" ACL on postoperative magnetic resonance imaging and readiness to return to sport results in athletes being released to sport before the graft is ready to handle high-intensity loading. Therefore, it is desirable to identify strategies that accelerate graft-bone integration, which would reduce the time to biologic fixation, improve the reliability of biologic fixation, allow for accelerated rehabilitation, and potentially reduce the incidence of early graft pullout and late midsubstance failure.

  14. Image Processing, Coding, and Compression with Multiple-Point Impulse Response Functions.

    NASA Astrophysics Data System (ADS)

    Stossel, Bryan Joseph

    1995-01-01

    Aspects of image processing, coding, and compression with multiple-point impulse response functions are investigated. Topics considered include characterization of the corresponding random-walk transfer function, image recovery for images degraded by the multiple-point impulse response, and the application of the blur function to image coding and compression. It is found that although the zeros of the real and imaginary parts of the random-walk transfer function occur in continuous, closed contours, the zeros of the transfer function occur at isolated spatial frequencies. Theoretical calculations of the average number of zeros per area are in excellent agreement with experimental results obtained from computer counts of the zeros. The average number of zeros per area is proportional to the standard deviations of the real part of the transfer function as well as the first partial derivatives. Statistical parameters of the transfer function are calculated including the mean, variance, and correlation functions for the real and imaginary parts of the transfer function and their corresponding first partial derivatives. These calculations verify the assumptions required in the derivation of the expression for the average number of zeros. Interesting results are found for the correlations of the real and imaginary parts of the transfer function and their first partial derivatives. The isolated nature of the zeros in the transfer function and its characteristics at high spatial frequencies result in largely reduced reconstruction artifacts and excellent reconstructions are obtained for distributions of impulses consisting of 25 to 150 impulses. The multiple-point impulse response obscures original scenes beyond recognition. This property is important for secure transmission of data on many communication systems. The multiple-point impulse response enables the decoding and restoration of the original scene with very little distortion. 
Images prefiltered by the random-walk transfer function yield greater compression ratios than are obtained for the original scene. The multiple-point impulse response decreases the bit rate approximately 40-70% and affords near distortion-free reconstructions. Due to the lossy nature of transform-based compression algorithms, noise reduction measures must be incorporated to yield acceptable reconstructions after decompression.
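    The coding and decoding behavior described above can be illustrated with a short numerical sketch (our own toy, not the dissertation's code; the scene, impulse count, and regularization constant are arbitrary choices): convolving a scene with a multiple-point impulse response obscures it beyond recognition, yet because the zeros of the random-walk transfer function are isolated points, a lightly regularized inverse filter restores the scene with very little distortion.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64

# Simple test scene: a bright rectangle on a dark background
scene = np.zeros((N, N))
scene[20:40, 24:30] = 1.0

# Multiple-point impulse response: K unit impulses at random positions
# (K = 50 sits inside the 25-150 range for which the text reports
# excellent reconstructions)
K = 50
h = np.zeros((N, N))
np.add.at(h, (rng.integers(0, N, K), rng.integers(0, N, K)), 1.0)
H = np.fft.fft2(h)  # the random-walk transfer function

# Coding: circular convolution with h obscures the scene
coded = np.fft.ifft2(np.fft.fft2(scene) * H).real

# Decoding: the zeros of H are isolated, so a lightly regularized
# inverse filter recovers the scene with little distortion
eps = 1e-6
decoded = np.fft.ifft2(np.fft.fft2(coded) * np.conj(H) /
                       (np.abs(H) ** 2 + eps)).real

rms_err = np.sqrt(np.mean((decoded - scene) ** 2))
```

    The coded image bears no visual resemblance to the scene, which is the property exploited for secure transmission, while the decoded image matches the original to within the tiny regularization bias.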

  15. Whose Code of Conduct Matters Most? Examining the Link between Academic Integrity and Student Development

    ERIC Educational Resources Information Center

    Biswas, Ann E.

    2013-01-01

    Although most colleges strive to nurture a culture of integrity, incidents of dishonest behavior are on the rise. This article examines the role student development plays in students' perceptions of academic dishonesty and in their willingness to adhere to a code of conduct that may be in sharp contrast to traditional integrity policies.

  16. Computer codes developed and under development at Lewis

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1992-01-01

    The objective of this summary is to provide a brief description of (1) codes developed or under development at LeRC and (2) the development status of IPACS, with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes (1) the code acronym, (2) select physics descriptors, (3) current enhancements, and (4) present (9/91) code status with respect to availability and documentation. The computer codes list is grouped by related functions: (1) composite mechanics, (2) composite structures, (3) integrated and 3-D analysis, (4) structural tailoring, and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base readiness) for assessing the structural integrity, durability, and reliability of propulsion systems. They also serve two other very important functions: they provide an effective means of technology transfer, and they constitute a repository of corporate memory.

  17. Study of SOL in DIII-D tokamak with SOLPS suite of codes.

    NASA Astrophysics Data System (ADS)

    Pankin, Alexei; Bateman, Glenn; Brennan, Dylan; Coster, David; Hogan, John; Kritz, Arnold; Kukushkin, Andrey; Schnack, Dalton; Snyder, Phil

    2005-10-01

    The scrape-off-layer (SOL) region in the DIII-D tokamak is studied with the SOLPS integrated suite of codes. The SOLPS package includes the 3D multi-species Monte Carlo neutral code EIRENE and the 2D multi-fluid code B2, which are cross-coupled through the B2-EIRENE interface. The results of SOLPS simulations are used in the integrated modeling of the plasma edge in the DIII-D tokamak with the ASTRA transport code. Parameterized dependences for neutral particle fluxes computed with the SOLPS code are implemented in a model for the H-mode pedestal and ELMs [1] in the ASTRA code. The effects of neutrals on the H-mode pedestal and ELMs are studied in this report. [1] A. Y. Pankin, I. Voitsekhovitch, G. Bateman, et al., Plasma Phys. Control. Fusion 47, 483 (2005).

  18. Starch biosynthesis in cassava: a genome-based pathway reconstruction and its exploitation in data integration

    PubMed Central

    2013-01-01

    Background Cassava is a well-known starchy root crop utilized for food, feed and biofuel production. However, a comprehensive understanding of the process of starch production in cassava is not yet available. Results In this work, we exploited the recently released genome information and post-genomic approaches to reconstruct the metabolic pathway of starch biosynthesis in cassava using multiple plant templates. The quality of the pathway reconstruction was assured by the parsimonious reconstruction framework employed and by collective validation steps. Our reconstructed pathway is presented in the form of an informative map, which describes all important information of the pathway, and an interactive map, which facilitates the integration of omics data into the metabolic pathway. Additionally, to demonstrate the advantage of the reconstructed pathway beyond schematic presentation, it was used to incorporate gene expression data obtained from various developmental stages of cassava roots. Our results show distinct activities of the starch biosynthesis pathway at the transcriptional level across stages of root development, with pathway activity increasing toward the development of mature storage roots. Conclusions To expand its applications, the interactive map of the reconstructed starch biosynthesis pathway is available for download at the SBI group’s website (http://sbi.pdti.kmutt.ac.th/?page_id=33). This work is a significant step in a quantitative modeling pipeline aiming to investigate the dynamic regulation of starch biosynthesis in cassava roots. PMID:23938102

  19. Axial Cone-Beam Reconstruction by Weighted BPF/DBPF and Orthogonal Butterfly Filtering.

    PubMed

    Tang, Shaojie; Tang, Xiangyang

    2016-09-01

    The backprojection-filtration (BPF) and derivative backprojection filtered (DBPF) algorithms, which share Hilbert filtering as a common algorithmic feature, were originally derived for exact helical reconstruction from cone-beam (CB) scan data and for axial reconstruction from fan-beam data, respectively. These two algorithms can be heuristically extended to image reconstruction from axial CB scan data, but they induce severe artifacts in images located away from the central plane determined by the circular source trajectory. We propose an algorithmic solution herein to eliminate the artifacts: an integration of the three-dimensional (3D) weighted axial CB-BPF/DBPF algorithm with orthogonal butterfly filtering. Using the computer-simulated Forbild head and thoracic phantoms, which are rigorous tests of reconstruction accuracy, and an anthropomorphic thoracic phantom with projection data acquired by a CT scanner, we evaluate the performance of the proposed algorithm. Preliminary results show that the orthogonal butterfly filtering eliminates the severe streak artifacts at off-central planes in images reconstructed by the 3-D weighted axial CB-BPF/DBPF algorithm. Integrated with orthogonal butterfly filtering, the 3-D weighted CB-BPF/DBPF algorithm performs at least as well as the 3-D weighted CB-FBP algorithm in image reconstruction from axial CB scan data and can serve as an algorithmic solution for CT imaging in extensive clinical and preclinical applications.

  20. Starch biosynthesis in cassava: a genome-based pathway reconstruction and its exploitation in data integration.

    PubMed

    Saithong, Treenut; Rongsirikul, Oratai; Kalapanulak, Saowalak; Chiewchankaset, Porntip; Siriwat, Wanatsanan; Netrphan, Supatcharee; Suksangpanomrung, Malinee; Meechai, Asawin; Cheevadhanarak, Supapon

    2013-08-10

    Cassava is a well-known starchy root crop utilized for food, feed and biofuel production. However, a comprehensive understanding of the process of starch production in cassava is not yet available. In this work, we exploited the recently released genome information and post-genomic approaches to reconstruct the metabolic pathway of starch biosynthesis in cassava using multiple plant templates. The quality of the pathway reconstruction was assured by the parsimonious reconstruction framework employed and by collective validation steps. Our reconstructed pathway is presented in the form of an informative map, which describes all important information of the pathway, and an interactive map, which facilitates the integration of omics data into the metabolic pathway. Additionally, to demonstrate the advantage of the reconstructed pathway beyond schematic presentation, it was used to incorporate gene expression data obtained from various developmental stages of cassava roots. Our results show distinct activities of the starch biosynthesis pathway at the transcriptional level across stages of root development, with pathway activity increasing toward the development of mature storage roots. To expand its applications, the interactive map of the reconstructed starch biosynthesis pathway is available for download at the SBI group's website (http://sbi.pdti.kmutt.ac.th/?page_id=33). This work is a significant step in a quantitative modeling pipeline aiming to investigate the dynamic regulation of starch biosynthesis in cassava roots.

  1. Muon reconstruction efficiency and momentum resolution of the ATLAS experiment in proton–proton collisions at √s = 7 TeV in 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aad, G.; Abajyan, T.; Abbott, B.

    2014-09-16

    This paper presents a study of the performance of the muon reconstruction in the analysis of proton–proton collisions at √s = 7 TeV at the LHC, recorded by the ATLAS detector in 2010. This performance is described in terms of reconstruction and isolation efficiencies and momentum resolutions for different classes of reconstructed muons. The results are obtained from an analysis of J/ψ meson and Z boson decays to dimuons, reconstructed from a data sample corresponding to an integrated luminosity of 40 pb⁻¹. The measured performance is compared to Monte Carlo predictions and deviations from the predicted performance are discussed.

  2. Comparison of numerical techniques for integration of stiff ordinary differential equations arising in combustion chemistry

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.

    1984-01-01

    The efficiency and accuracy of several algorithms recently developed for the efficient numerical integration of stiff ordinary differential equations are compared. The methods examined include two general-purpose codes, EPISODE and LSODE, and three codes (CHEMEQ, CREK1D, and GCKP84) developed specifically to integrate chemical kinetic rate equations. The codes are applied to two test problems drawn from combustion kinetics. The comparisons show that LSODE is the fastest code currently available for the integration of combustion kinetic rate equations. An important finding is that an iterative solution of the algebraic energy conservation equation to compute the temperature does not result in significant errors. In addition, this method is more efficient than evaluating the temperature by integrating its time derivative. Significant reductions in computational work are realized by updating the rate constants (k = A T^N exp(-E/RT)) only when the temperature change exceeds an amount delta T that is problem dependent. An approximate expression for the automatic evaluation of delta T is derived and is shown to result in increased efficiency.
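    The stiffness issue that motivates implicit codes such as LSODE can be seen in a few lines of plain Python (an illustrative sketch we wrote, not any of the codes compared above; the test equation, lambda, and step size are arbitrary choices): at a step size where explicit Euler is wildly unstable, the implicit backward Euler step remains accurate.

```python
import math

# Stiff model problem: y' = -lam*(y - cos t) - sin t, exact solution y = cos t
lam = 1000.0

def f(t, y):
    return -lam * (y - math.cos(t)) - math.sin(t)

def forward_euler(h, t_end):
    t, y = 0.0, 1.0
    while t < t_end - 1e-12:
        y += h * f(t, y)   # explicit step: amplifies errors by |1 - h*lam|
        t += h
    return y

def backward_euler(h, t_end):
    t, y = 0.0, 1.0
    while t < t_end - 1e-12:
        t += h
        # the model problem is linear, so the implicit step solves in closed form
        y = (y + h * (lam * math.cos(t) - math.sin(t))) / (1.0 + h * lam)
    return y

h = 0.01               # h*lam = 10: far outside explicit Euler's stability region
exact = math.cos(1.0)
```

    At this step size the explicit result diverges by dozens of orders of magnitude while the implicit one stays within about a percent of the exact solution; the same stability gap is why stiff kinetic systems reward implicit codes. The report's other saving, lazily updating k = A T^N exp(-E/RT) only when |delta T| exceeds a threshold, is an orthogonal cost optimization.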

  3. Experimental demonstration of distributed feedback semiconductor lasers based on reconstruction-equivalent-chirp technology.

    PubMed

    Li, Jingsi; Wang, Huan; Chen, Xiangfei; Yin, Zuowei; Shi, Yuechun; Lu, Yanqing; Dai, Yitang; Zhu, Hongliang

    2009-03-30

    In this paper we report, to the best of our knowledge, the first experimental realization of distributed feedback (DFB) semiconductor lasers based on reconstruction-equivalent-chirp (REC) technology. Lasers with different lasing wavelengths are achieved simultaneously on one chip, which shows the potential of REC technology, in combination with photonic integrated circuit (PIC) technology, as a method for monolithic integration: its fabrication is as powerful as electron-beam technology, while its cost and fabrication time are almost the same as those of standard holographic technology.

  4. The Muon Ionization Cooling Experiment User Software

    NASA Astrophysics Data System (ADS)

    Dobbs, A.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The Muon Ionization Cooling Experiment (MICE) is a proof-of-principle experiment designed to demonstrate muon ionization cooling for the first time. MICE is currently on Step IV of its data taking programme, where transverse emittance reduction will be demonstrated. The MICE Analysis User Software (MAUS) is the reconstruction, simulation and analysis framework for the MICE experiment. MAUS is used for both offline data analysis and fast online data reconstruction and visualization to serve MICE data taking. This paper provides an introduction to MAUS, describing the central Python and C++ based framework, the data structure, and the code management and testing procedures.

  5. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  6. Coupled optics reconstruction from TBT data using MAD-X

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexahin, Y.; Gianfelice-Wendt, E.; /Fermilab

    2007-06-01

    Turn-by-turn BPM data provide immediate information on the coupled optics functions at BPM locations. In the case of small deviations from the known (design) uncoupled optics some cognizance of the sources of perturbation, BPM calibration errors and tilts can also be inferred without detailed lattice modeling. In practical situations, however, fitting the lattice model with the help of some optics code would lead to more reliable results. We present an algorithm for coupled optics reconstruction from TBT data on the basis of MAD-X and give examples of its application for the Fermilab Tevatron accelerator.

  7. Reconstruction of recycling flux from synthetic camera images, evaluated for the Wendelstein 7-X startup limiter

    NASA Astrophysics Data System (ADS)

    Frerichs, H.; Effenberg, F.; Feng, Y.; Schmitz, O.; Stephey, L.; Reiter, D.; Börner, P.; The W7-X Team

    2017-12-01

    The interpretation of spectroscopic measurements in the edge region of high-temperature plasmas can be guided by modeling with the EMC3-EIRENE code. A versatile synthetic diagnostic module, initially developed for the generation of synthetic camera images, has been extended for the evaluation of the inverse problem, in which the observable photon flux is related back to the originating particle flux (recycling). An application of this synthetic diagnostic to the startup-phase (inboard) limiter in Wendelstein 7-X (W7-X) is presented, and the reconstruction of recycling from synthetic camera observations is demonstrated.

  8. Renovating and Reconstructing in Phases--Specifying Phased Construction.

    ERIC Educational Resources Information Center

    Bunzick, John

    2002-01-01

    Discusses planning for phased school construction projects, including effects on occupancy (for example, construction adjacent to occupied space, construction procedure safety zones near occupied areas, and code-complying means of egress), effects on building systems (such as heating and cooling equipment and power distribution), and contract…

  9. Dual-sided coded-aperture imager

    DOEpatents

    Ziock, Klaus-Peter [Clinton, TN

    2009-09-22

    In a vehicle, a single detector plane simultaneously measures radiation coming through two coded-aperture masks, one on either side of the detector. To determine which side of the vehicle a source is, the two shadow masks are inverses of each other, i.e., one is a mask and the other is the anti-mask. All of the data that is collected is processed through two versions of an image reconstruction algorithm. One treats the data as if it were obtained through the mask, the other as though the data is obtained through the anti-mask.
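    The mask/anti-mask idea can be sketched in one dimension (a toy model we wrote for illustration, not the patented system; the mask length, random pattern, and source position are arbitrary): the detector data are decoded twice, once under the mask hypothesis and once under the anti-mask hypothesis, and only the decoder matching the side the source is on produces a sharp positive peak.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 127
mask = (rng.random(n) < 0.5).astype(float)   # open (1) / closed (0) elements
anti = 1.0 - mask                            # the inverse pattern, other side

def shadow(pattern, position):
    """Detector counts cast by a point source: a shifted copy of the pattern."""
    return np.roll(pattern, position)

def decode(data, pattern):
    """Correlate the data with the balanced (+1/-1) version of a pattern."""
    dec = 2.0 * pattern - 1.0
    return np.array([np.dot(data, np.roll(dec, s)) for s in range(n)])

# A point source on the mask side, at position 40
data = shadow(mask, 40)
rec_mask = decode(data, mask)   # correct hypothesis: sharp peak at 40
rec_anti = decode(data, anti)   # wrong hypothesis: a dip at 40, no peak
```

    Running both reconstructions on the same data therefore both localizes the source and identifies which side of the vehicle it lies on, which is the point of the dual-sided design.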

  10. Analog system for computing sparse codes

    DOEpatents

    Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell

    2010-08-24

    A parallel dynamical system for computing sparse representations of data, i.e., where the data can be fully represented in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition, which solve a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
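    A minimal discrete-time simulation of the LCA dynamics (our sketch of the published algorithm with an arbitrary random dictionary and hand-picked constants, not the patented analog circuit) shows the thresholding-plus-lateral-inhibition loop recovering a sparse code:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                        # 50-atom overcomplete dictionary in R^20
Phi = rng.standard_normal((m, n))
Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm dictionary atoms

a_true = np.zeros(n)
a_true[[3, 27, 41]] = [1.2, -0.8, 0.5]   # sparse ground truth
s = Phi @ a_true                         # observed signal

lam, dt = 0.1, 0.1                   # threshold (sparsity level), Euler step
b = Phi.T @ s                        # feed-forward drive to each node
G = Phi.T @ Phi - np.eye(n)          # lateral-inhibition weights
u = np.zeros(n)                      # internal (membrane) states

def threshold(u):
    # soft threshold: only super-threshold nodes inhibit their neighbors
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

for _ in range(500):
    a = threshold(u)
    u += dt * (b - u - G @ a)        # leaky integration with competition

a = threshold(u)                     # final sparse coefficients
```

    The fixed point of these dynamics solves the l1-penalized (LASSO-style) member of the sparse approximation family; substituting other threshold functions yields the other sparsity metrics the record mentions.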

  11. Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.

    PubMed

    Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene

    2012-04-21

    Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion could result in unintuitive dose changes. We have developed a treatment reconstruction simulation computer code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans (a concave target and an integrated boost) were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%, 2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.

  12. Meaning Reconstruction Process After Suicide: Life-Story of a Japanese Woman Who Lost Her Son to Suicide.

    PubMed

    Kawashima, Daisuke; Kawano, Kenji

    2017-09-01

    Although Japan has a high suicide rate, there is insufficient research on the experiences of suicide-bereaved individuals. We investigated the qualitative aspects of the meaning reconstruction process after a loss to suicide. We conducted a life-story interview using open-ended questions with one middle-aged Japanese woman who lost her son to suicide. We used a narrative approach to transcribe and code the participant's narratives for analysis. The analysis revealed three meaning groups that structured the participant's reactions to the suicide: making sense of her son's death and life, relationships with other people, and reconstruction of a bond with the deceased. The belief that death is not an eternal split and that there is a connection between the living and the deceased reduced the pain felt by our participant. Furthermore, the narratives worked as scaffolds in the meaning reconstruction process. We discuss our results in the light of cross-cultural differences in the grieving process.

  13. Fourier ptychographic reconstruction using Poisson maximum likelihood and truncated Wirtinger gradient.

    PubMed

    Bian, Liheng; Suo, Jinli; Chung, Jaebum; Ou, Xiaoze; Yang, Changhuei; Chen, Feng; Dai, Qionghai

    2016-06-10

    Fourier ptychographic microscopy (FPM) is a novel computational coherent imaging technique for high space-bandwidth product imaging. Mathematically, Fourier ptychographic (FP) reconstruction can be implemented as a phase retrieval optimization process, in which we only obtain low resolution intensity images corresponding to the sub-bands of the sample's high resolution (HR) spatial spectrum, and aim to retrieve the complex HR spectrum. In real setups, the measurements always suffer from various degenerations such as Gaussian noise, Poisson noise, speckle noise and pupil location error, which would largely degrade the reconstruction. To efficiently address these degenerations, we propose a novel FP reconstruction method under a gradient descent optimization framework in this paper. The technique utilizes Poisson maximum likelihood for better signal modeling, and truncated Wirtinger gradient for effective error removal. Results on both simulated data and real data captured using our laser-illuminated FPM setup show that the proposed method outperforms other state-of-the-art algorithms. Also, we have released our source code for non-commercial use.
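    The two ingredients named in the abstract can be sketched together in a generic phase-retrieval toy (our illustration, not the authors' released code: the forward operator here is a random complex matrix rather than the FPM pupil-shift model, and all constants are arbitrary): a Poisson negative log-likelihood scored on intensity measurements, minimized by a Wirtinger gradient whose per-measurement weights are truncated to suppress outlier-driven steps.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 16, 128                       # unknowns, intensity measurements
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
y = np.abs(A @ x_true) ** 2          # noise-free intensities (Poisson means)

def poisson_nll(x):
    I = np.abs(A @ x) ** 2 + 1e-12
    return np.sum(I - y * np.log(I))

# Spectral initialization: top eigenvector of (1/m) A^H diag(y) A
Y = (A.conj().T * y) @ A / m
_, V = np.linalg.eigh(Y)
x = V[:, -1] * np.sqrt(np.mean(y))   # scale so ||x||^2 matches mean(y)

nll0 = poisson_nll(x)
mu = 0.1 / np.mean(y)                # conservative step size
for _ in range(200):
    Ax = A @ x
    I = np.abs(Ax) ** 2 + 1e-12
    w = np.clip(1.0 - y / I, -5.0, 5.0)        # truncation tames huge weights
    x = x - mu * (A.conj().T @ (w * Ax)) / m   # Wirtinger gradient step
nll1 = poisson_nll(x)
```

    Without the clip, measurements where the current estimate predicts near-zero intensity would dominate the gradient; truncating those weights is the essence of the truncated-Wirtinger idea for error removal.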

  14. Computer Reconstruction of Plant Growth and Chlorophyll Fluorescence Emission in Three Spatial Dimensions

    PubMed Central

    Bellasio, Chandra; Olejníčková, Julie; Tesař, Radek; Šebela, David; Nedbal, Ladislav

    2012-01-01

    Plant leaves grow and change their orientation, as well as their emission of chlorophyll fluorescence, over time. All these dynamic plant properties can be semi-automatically monitored by a 3D imaging system that generates plant models by the method of coded light illumination, fluorescence imaging and computer 3D reconstruction. Here, we describe the essentials of the method, as well as the system hardware. We show that the technique can reconstruct, with a high fidelity, the leaf size, the leaf angle and the plant height. The method fails with wilted plants when leaves overlap, obscuring their true area. This effect, naturally, also interferes when the method is applied to measure plant growth under water stress. The method is, however, very potent in capturing the plant dynamics under mild stress and without stress. The 3D reconstruction is also highly effective in correcting geometrical factors that distort measurements of chlorophyll fluorescence emission of naturally positioned plant leaves. PMID:22368511

  15. GPU-Based Real-Time Volumetric Ultrasound Image Reconstruction for a Ring Array

    PubMed Central

    Choe, Jung Woo; Nikoozadeh, Amin; Oralkan, Ömer; Khuri-Yakub, Butrus T.

    2014-01-01

    Synthetic phased array (SPA) beamforming with Hadamard coding and aperture weighting is an optimal option for real-time volumetric imaging with a ring array, a particularly attractive geometry in intracardiac and intravascular applications. However, the imaging frame rate of this method is limited by the immense computational load required in synthetic beamforming. For fast imaging with a ring array, we developed graphics processing unit (GPU)-based, real-time image reconstruction software that exploits massive data-level parallelism in beamforming operations. The GPU-based software reconstructs and displays three cross-sectional images at 45 frames per second (fps). This frame rate is 4.5 times higher than that for our previously-developed multi-core CPU-based software. In an alternative imaging mode, it shows one B-mode image rotating about the axis and its maximum intensity projection (MIP), processed at a rate of 104 fps. This paper describes the image reconstruction procedure on the GPU platform and presents the experimental images obtained using this software. PMID:23529080
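    The Hadamard-coded synthetic aperture acquisition that the GPU software must decode can be sketched as follows (a toy numpy model we wrote for illustration; the element count, trace length, and noise level are arbitrary): firing all elements with +/-1 Hadamard weights and decoding with H^T/N recovers each element's response with a sqrt(N) SNR gain over firing one element at a time.

```python
import numpy as np

def sylvester_hadamard(k):
    """2^k x 2^k Hadamard matrix built by Sylvester's recursion."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
N, T = 8, 200                        # transmit elements, samples per trace
X = rng.standard_normal((N, T))      # unknown per-element echo responses
H = sylvester_hadamard(3)            # N = 8 coded transmit events

noise = 0.1 * rng.standard_normal((N, T))
Y = H @ X + noise                    # each event: all elements fire, +/-1 weights

X_hat = H.T @ Y / N                  # decode, using H^T H = N I
err_coded = (X_hat - X).std()        # residual noise after decoding
```

    Each decoded trace averages all N acquisitions, so the per-sample noise drops from 0.1 to roughly 0.1/sqrt(8), about 0.035; the decode itself is a dense matrix product, exactly the kind of data-parallel operation the paper offloads to the GPU.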

  16. Computer reconstruction of plant growth and chlorophyll fluorescence emission in three spatial dimensions.

    PubMed

    Bellasio, Chandra; Olejníčková, Julie; Tesař, Radek; Sebela, David; Nedbal, Ladislav

    2012-01-01

    Plant leaves grow and change their orientation, as well as their emission of chlorophyll fluorescence, over time. All these dynamic plant properties can be semi-automatically monitored by a 3D imaging system that generates plant models by the method of coded light illumination, fluorescence imaging and computer 3D reconstruction. Here, we describe the essentials of the method, as well as the system hardware. We show that the technique can reconstruct, with a high fidelity, the leaf size, the leaf angle and the plant height. The method fails with wilted plants when leaves overlap, obscuring their true area. This effect, naturally, also interferes when the method is applied to measure plant growth under water stress. The method is, however, very potent in capturing the plant dynamics under mild stress and without stress. The 3D reconstruction is also highly effective in correcting geometrical factors that distort measurements of chlorophyll fluorescence emission of naturally positioned plant leaves.

  17. Highly undersampled MR image reconstruction using an improved dual-dictionary learning method with self-adaptive dictionaries.

    PubMed

    Li, Jiansen; Song, Ying; Zhu, Zhen; Zhao, Jun

    2017-05-01

    The dual-dictionary learning (Dual-DL) method utilizes both a low-resolution dictionary and a high-resolution dictionary, which are co-trained for sparse coding and image updating, respectively. It can effectively exploit a priori knowledge regarding the typical structures, specific features, and local details of the training-set images. This prior knowledge helps to greatly improve reconstruction quality, and the method has been successfully applied in magnetic resonance (MR) image reconstruction. However, it relies heavily on the training sets, and the dictionaries are fixed and nonadaptive. In this research, we improve Dual-DL by using self-adaptive dictionaries. The low- and high-resolution dictionaries are updated along with the image updating stage to ensure their self-adaptivity. The updated dictionaries incorporate both the prior information of the training sets and the test image directly, and both feature improved adaptability. Experimental results demonstrate that the proposed method can efficiently and significantly improve the quality and robustness of MR image reconstruction.

  18. Test of 3D CT reconstructions by EM + TV algorithm from undersampled data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evseev, Ivan; Ahmann, Francielle; Silva, Hamilton P. da

    2013-05-06

    Computerized tomography (CT) plays an important role in medical imaging for diagnosis and therapy. However, CT imaging is connected with ionizing radiation exposure of patients. Therefore, dose reduction is an essential issue in CT. In 2011, the Expectation Maximization and Total Variation Based Model for CT Reconstruction (EM+TV) was proposed. This method can reconstruct a better image using fewer CT projections in comparison with the usual filtered back projection (FBP) technique, and thus could significantly reduce the overall radiation dose in CT. This work reports the results of an independent numerical simulation for cone-beam CT geometry with alternative virtual phantoms. As in the original report, 3D CT images of 128 × 128 × 128 virtual phantoms were reconstructed. It was not possible to use phantoms with larger dimensions because of the slowness of code execution, even on a Core i7 CPU.
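    The EM side of EM+TV can be sketched in a few lines (a one-dimensional toy we wrote for illustration, not the authors' code: the "system matrix" is random rather than a cone-beam projector, and the TV step is replaced by a simple quadratic-smoothing stand-in): a multiplicative MLEM update fits the projection data, with a regularization nudge interleaved between iterations.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 60, 16                        # ray sums, image pixels (tiny toy sizes)
A = rng.random((m, n))               # nonnegative toy system matrix
x_true = rng.random(n) + 0.5
y = A @ x_true                       # noise-free projection data

x = np.full(n, 3.0)                  # MLEM needs a strictly positive start
sens = A.sum(axis=0)                 # sensitivity image, A^T 1
for _ in range(300):
    x *= (A.T @ (y / (A @ x))) / sens           # multiplicative MLEM update
    # stand-in for the TV denoising step: weak smoothing of interior pixels
    x[1:-1] += 0.01 * (x[2:] - 2.0 * x[1:-1] + x[:-2])

rel_resid = np.linalg.norm(y - A @ x) / np.linalg.norm(y)
```

    A real EM+TV implementation replaces the smoothing line with a total-variation denoising step, which preserves edges instead of blurring them; that distinction is what permits good reconstructions from undersampled projection data.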

  19. Real Time Computation of Kinetic Constraints to Support Equilibrium Reconstruction

    NASA Astrophysics Data System (ADS)

    Eggert, W. J.; Kolemen, E.; Eldon, D.

    2016-10-01

    A new method for quickly and automatically applying kinetic constraints to EFIT equilibrium reconstructions using readily available data is presented. The ultimate goal is to produce kinetic equilibrium reconstructions in real time and use them to constrain the DCON stability code as part of a disruption avoidance scheme. A first effort presented here replaces CPU-time expensive modules, such as the fast ion pressure profile calculation, with a simplified model. We show with a DIII-D database analysis that we can achieve reasonable predictions for selected applications by modeling the fast ion pressure profile and determining the fit parameters as functions of easily measured quantities including neutron rate and electron temperature on axis. Secondly, we present a strategy for treating Thomson scattering and Charge Exchange Recombination data to automatically form constraints for a kinetic equilibrium reconstruction, a process that historically was performed by hand. Work supported by US DOE DE-AC02-09CH11466 and DE-FC02-04ER54698.

  20. An integrity measure to benchmark quantum error correcting memories

    NASA Astrophysics Data System (ADS)

    Xu, Xiaosi; de Beaudrap, Niel; O'Gorman, Joe; Benjamin, Simon C.

    2018-02-01

    Rapidly developing experiments across multiple platforms now aim to realise small quantum codes, and so demonstrate a memory within which a logical qubit can be protected from noise. There is a need to benchmark the achievements in these diverse systems, and to compare the inherent power of the codes they rely upon. We describe a recently introduced performance measure called integrity, which relates to the probability that an ideal agent will successfully ‘guess’ the state of a logical qubit after a period of storage in the memory. Integrity is straightforward to evaluate experimentally without state tomography and it can be related to various established metrics such as the logical fidelity and the pseudo-threshold. We offer a set of experimental milestones that are steps towards demonstrating unconditionally superior encoded memories. Using intensive numerical simulations we compare memories based on the five-qubit code, the seven-qubit Steane code, and a nine-qubit code which is the smallest instance of a surface code; we assess both the simple and fault-tolerant implementations of each. While the ‘best’ code upon which to base a memory does vary according to the nature and severity of the noise, nevertheless certain trends emerge.

  1. Is Remnant Preservation Truly Beneficial to Anterior Cruciate Ligament Reconstruction Healing? Clinical and Magnetic Resonance Imaging Evaluations of Remnant-Preserved Reconstruction.

    PubMed

    Naraoka, Takuya; Kimura, Yuka; Tsuda, Eiichi; Yamamoto, Yuji; Ishibashi, Yasuyuki

    2017-04-01

    Remnant-preserved anterior cruciate ligament (ACL) reconstruction was introduced to improve clinical outcomes and biological healing. However, the effects of remnant preservation and the influence of the delay from injury until reconstruction on the outcomes of this technique are still uncertain. Purpose/Hypothesis: The purposes of this study were to evaluate whether remnant preservation improved the clinical outcomes and graft incorporation of ACL reconstruction and to examine the influence of the delay between ACL injury and reconstruction on the usefulness of remnant preservation. We hypothesized that remnant preservation improves clinical results and accelerates graft incorporation and that its effect is dependent on the delay between ACL injury and reconstruction. Cohort study; Level of evidence, 2. A total of 151 consecutive patients who underwent double-bundle ACL reconstruction using a semitendinosus graft were enrolled in this study: 74 knees underwent ACL reconstruction without a remnant (or the remnant was <25% of the intra-articular portion of the graft; NR group), while 77 knees underwent ACL reconstruction with remnant preservation (RP group). These were divided into 4 subgroups based on the time from injury to surgery: phase 1 was <3 weeks (n = 24), phase 2 was 3 to less than 8 weeks (n = 70), phase 3 was 8 to 20 weeks (n = 32), and phase 4 was >20 weeks (n = 25). Clinical measurements, including KT-1000 arthrometer side-to-side anterior tibial translation measurements, were assessed at 3, 6, 12, and 24 months after reconstruction. Magnetic resonance imaging evaluations of graft maturation and graft-tunnel integration of the anteromedial and posterolateral bundles were assessed at 3, 6, and 12 months after reconstruction. There was no difference in side-to-side anterior tibial translation between the NR and RP groups. There was also no difference in graft maturation between the 2 groups. 
Furthermore, the time from ACL injury until reconstruction did not affect graft maturation, except in the case of very long delays before reconstruction (phase 4). Graft-tunnel integration was significantly increased in both groups in a time-dependent manner. However, there was no difference between the NR and RP groups. Remnant preservation did not improve knee stability at 2 years after ACL reconstruction. Furthermore, remnant preservation did not accelerate graft incorporation, especially during the acute and subacute injury phases.

  2. An electron tomography algorithm for reconstructing 3D morphology using surface tangents of projected scattering interfaces

    NASA Astrophysics Data System (ADS)

    Petersen, T. C.; Ringer, S. P.

    2010-03-01

    Upon discerning the mere shape of an imaged object, as portrayed by projected perimeters, the full three-dimensional scattering density may not be of particular interest. In this situation considerable simplifications to the reconstruction problem are possible, allowing calculations based upon geometric principles. Here we describe and provide an algorithm which reconstructs the three-dimensional morphology of specimens from tilt series of images for application to electron tomography. Our algorithm uses a differential approach to infer the intersection of projected tangent lines with surfaces which define boundaries between regions of different scattering densities within and around the perimeters of specimens. Details of the algorithm implementation are given and explained using reconstruction calculations from simulations, which are built into the code. An experimental application of the algorithm to a nano-sized Aluminium tip is also presented to demonstrate practical analysis for a real specimen.

    Program summary
    Program title: STOMO version 1.0
    Catalogue identifier: AEFS_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFS_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2988
    No. of bytes in distributed program, including test data, etc.: 191 605
    Distribution format: tar.gz
    Programming language: C/C++
    Computer: PC
    Operating system: Windows XP
    RAM: Depends upon the size of experimental data as input, ranging from 200 MB to 1.5 GB
    Supplementary material: Sample output files, for the test run provided, are available.
    Classification: 7.4, 14
    External routines: Dev-C++ (http://www.bloodshed.net/devcpp.html)
    Nature of problem: Electron tomography of specimens for which conventional back projection may fail and/or data for which there is a limited angular range. The algorithm does not solve the tomographic back-projection problem but rather reconstructs the local 3D morphology of surfaces defined by varied scattering densities.
    Solution method: Reconstruction using differential geometry applied to image analysis computations.
    Restrictions: The code has only been tested with square images and has been developed for only single-axis tilting.
    Running time: For high-quality reconstruction, 5-15 min
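    A minimal 2D analogue of the tangent-envelope idea (a sketch assuming a convex object and the support-function parameterization, not the STOMO code itself): a convex boundary is the envelope of its tangent lines, so knowing the tangent line at every normal angle determines the shape.

```python
import numpy as np

theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
dtheta = theta[1] - theta[0]

# Test object: a circle of radius r centred at (cx, cy).  Its support function
# (signed distance from the origin to the tangent line with normal angle theta)
# is known in closed form, standing in for tangents measured from projections.
cx, cy, r = 0.3, -0.2, 1.5
h = cx * np.cos(theta) + cy * np.sin(theta) + r

# Differential (envelope) reconstruction: the point of tangency is
#   x = h*cos(theta) - h'*sin(theta),   y = h*sin(theta) + h'*cos(theta).
hp = (np.roll(h, -1) - np.roll(h, 1)) / (2.0 * dtheta)  # periodic central diff
x = h * np.cos(theta) - hp * np.sin(theta)
y = h * np.sin(theta) + hp * np.cos(theta)
radii = np.hypot(x - cx, y - cy)   # every recovered point should sit at ~r
```

    The same differential idea, applied slice by slice over a tilt series, underlies morphology recovery without full back projection.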

  3. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  4. Asymmetric Memory Circuit Would Resist Soft Errors

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G.; Perlman, Marvin

    1990-01-01

    Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
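    The abstract does not name the code used; as a classic illustration of how error asymmetry can be exploited (an example chosen here, not necessarily the paper's construction), a Berger code appends the count of 0-bits to each data word and thereby detects every unidirectional error pattern:

```python
def berger_encode(data_bits):
    # Berger code: the check symbol is the number of 0-bits in the data word.
    # Any error pattern in which all flipped bits go the same direction
    # (all 1->0 or all 0->1) changes the zero count and is detected.
    return data_bits, data_bits.count(0)

def berger_check(data_bits, check):
    return data_bits.count(0) == check

data, check = berger_encode([1, 0, 1, 1, 0, 1])
ok = berger_check(data, check)                   # no errors: passes
bad = berger_check([1, 0, 0, 1, 0, 0], check)    # two 1->0 upsets: detected
```

    In a memory deliberately biased toward one error direction, such a code detects essentially all radiation-induced upsets at low redundancy cost.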

  5. Partial correlation properties of pseudonoise /PN/ codes in noncoherent synchronization/detection schemes

    NASA Technical Reports Server (NTRS)

    Cartier, D. E.

    1976-01-01

    This concise paper considers the effect on the autocorrelation function of a pseudonoise (PN) code when the acquisition scheme only integrates coherently over part of the code and then noncoherently combines these results. The peak-to-null ratio of the effective PN autocorrelation function is shown to degrade to the square root of n, where n is the number of PN symbols over which coherent integration takes place.
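    The effect can be reproduced numerically (a sketch with assumed demo parameters: a length-63 m-sequence split into 9 coherently integrated segments whose magnitudes are then combined noncoherently):

```python
import numpy as np

def mseq63():
    # 6-bit Fibonacci LFSR, primitive feedback polynomial x^6 + x^5 + 1
    reg = [1, 0, 0, 0, 0, 0]
    bits = []
    for _ in range(63):
        bits.append(reg[-1])
        fb = reg[5] ^ reg[0]
        reg = [fb] + reg[:-1]
    return np.array([1 - 2 * b for b in bits])   # map {0,1} -> {+1,-1} chips

c = mseq63()
N, n_seg = 63, 9          # assumed split: 9 coherent segments of 7 chips each

def segment_sums(tau):
    # per-segment coherent correlation sums at code offset tau
    prod = (c * np.roll(c, -tau)).reshape(n_seg, N // n_seg)
    return prod.sum(axis=1)

# fully coherent vs. noncoherently combined correlation magnitudes
full = np.array([abs(segment_sums(t).sum()) for t in range(N)])
noncoh = np.array([np.abs(segment_sums(t)).sum() for t in range(N)])

ratio_full = full[0] / full[1:].max()        # 63 : 1 for an ideal m-sequence
ratio_noncoh = noncoh[0] / noncoh[1:].max()  # degraded peak-to-null ratio
```

    Taking magnitudes before summing destroys the sign cancellations that give the m-sequence its flat −1 sidelobes, so the effective peak-to-null ratio degrades, as the paper quantifies.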

  6. Highly undersampled contrast-enhanced MRA with iterative reconstruction: Integration in a clinical setting.

    PubMed

    Stalder, Aurelien F; Schmidt, Michaela; Quick, Harald H; Schlamann, Marc; Maderwald, Stefan; Schmitt, Peter; Wang, Qiu; Nadar, Mariappan S; Zenge, Michael O

    2015-12-01

    To integrate, optimize, and evaluate a three-dimensional (3D) contrast-enhanced sparse MRA technique with iterative reconstruction on a standard clinical MR system. Data were acquired using a highly undersampled Cartesian spiral phyllotaxis sampling pattern and reconstructed directly on the MR system with an iterative SENSE technique. Undersampling, regularization, and the number of iterations of the reconstruction were optimized and validated based on phantom experiments and patient data. Sparse MRA of the whole head (field of view: 265 × 232 × 179 mm³) was investigated in 10 patient examinations. High-quality images with 30-fold undersampling, resulting in 0.7 mm isotropic resolution within a 10 s acquisition, were obtained. After optimization of the regularization factor and the number of iterations of the reconstruction, it was possible to reconstruct images with excellent quality within six minutes per 3D volume. Initial results of sparse contrast-enhanced MRA (CEMRA) in 10 patients demonstrated high-quality whole-head first-pass MRA for both the arterial and venous contrast phases. While sparse MRI techniques have not yet reached clinical routine, this study demonstrates the technical feasibility of high-quality sparse CEMRA of the whole head in a clinical setting. Sparse CEMRA has the potential to become a viable alternative where conventional CEMRA is too slow or does not provide sufficient spatial resolution. © 2014 Wiley Periodicals, Inc.
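    A hedged 1D sketch of regularized iterative reconstruction from undersampled Fourier data (generic iterative soft thresholding with assumed toy sizes, not the vendor's iterative SENSE implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 128, 48                                    # toy sizes, ~2.7x undersampling
x_true = np.zeros(n)
x_true[[5, 37, 70, 101]] = [1.0, -0.8, 1.2, 0.9]  # sparse "image"

F = np.fft.fft(np.eye(n)) / np.sqrt(n)            # unitary DFT matrix
rows = rng.choice(n, size=m, replace=False)       # randomly sampled frequencies
A = F[rows]
y = A @ x_true                                    # undersampled measurements

def soft(z, t):
    # complex soft-thresholding: the proximal operator of the l1 penalty
    mag = np.abs(z)
    return z / np.maximum(mag, 1e-12) * np.maximum(mag - t, 0.0)

lam, x = 0.02, np.zeros(n, dtype=complex)
for _ in range(300):
    x = soft(x - A.conj().T @ (A @ x - y), lam)   # ISTA step (||A||_2 = 1)

x_zf = A.conj().T @ y                             # zero-filled (adjoint) baseline
err_ista = np.linalg.norm(x.real - x_true)
err_zf = np.linalg.norm(x_zf.real - x_true)
```

    The regularized iteration suppresses the incoherent aliasing that the zero-filled reconstruction leaves behind, which is the mechanism that makes 30-fold undersampling viable in the study above.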

  7. Integrating multisensor satellite data merging and image reconstruction in support of machine learning for better water quality management.

    PubMed

    Chang, Ni-Bin; Bai, Kaixu; Chen, Chi-Farn

    2017-10-01

    Monitoring water quality changes in lakes, reservoirs, estuaries, and coastal waters is critical in response to the needs for sustainable development. This study develops a remote sensing-based multiscale modeling system by integrating multi-sensor satellite data merging and image reconstruction algorithms in support of feature extraction with machine learning, leading to automated continuous water quality monitoring in environmentally sensitive regions. This new Earth observation platform, termed "cross-mission data merging and image reconstruction with machine learning" (CDMIM), is capable of merging multiple satellite imagery sources to provide daily water quality monitoring through a series of image processing, enhancement, reconstruction, and data mining/machine learning techniques. Two existing key algorithms, including the Spectral Information Adaptation and Synthesis Scheme (SIASS) and SMart Information Reconstruction (SMIR), are highlighted to support feature extraction and content-based mapping. Whereas SIASS can support various data merging efforts to merge images collected from cross-mission satellite sensors, SMIR can overcome data gaps by reconstructing the information of value-missing pixels due to impacts such as cloud obstruction. Practical implementation of CDMIM was assessed by predicting the water quality over seasons in terms of the concentrations of nutrients and chlorophyll-a, as well as water clarity in Lake Nicaragua, providing synergistic efforts to better monitor the aquatic environment and offer insightful lake watershed management strategies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Removal of two large-scale cosmic microwave background anomalies after subtraction of the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Rassat, A.; Starck, J.-L.; Dupé, F.-X.

    2013-09-01

    Context. Although there is currently a debate over the significance of the claimed large-scale anomalies in the cosmic microwave background (CMB), their existence is not totally dismissed. In parallel to the debate over their statistical significance, recent work has also focussed on masks and secondary anisotropies as potential sources of these anomalies. Aims: In this work we investigate simultaneously the impact of the method used to account for masked regions as well as the impact of the integrated Sachs-Wolfe (ISW) effect, which is the large-scale secondary anisotropy most likely to affect the CMB anomalies. In this sense, our work is an update of previous works. Our aim is to identify trends in CMB data from different years and with different mask treatments. Methods: We reconstruct the ISW signal due to 2 Micron All-Sky Survey (2MASS) and NRAO VLA Sky Survey (NVSS) galaxies, effectively reconstructing the low-redshift ISW signal out to z ~ 1. We account for regions of missing data using the sparse inpainting technique. We test sparse inpainting of the CMB, large scale structure and ISW and find that it constitutes a bias-free reconstruction method suitable to study large-scale statistical isotropy and the ISW effect. Results: We focus on three large-scale CMB anomalies: the low quadrupole, the quadrupole/octopole alignment, and the octopole planarity. After sparse inpainting, the low quadrupole becomes more anomalous, whilst the quadrupole/octopole alignment becomes less anomalous. The significance of the low quadrupole is unchanged after subtraction of the ISW effect, while the trend amongst the CMB maps is that both the low quadrupole and the quadrupole/octopole alignment have reduced significance, yet other hypotheses remain possible as well (e.g. exotic physics). Our results also suggest that both of these anomalies may be due to the quadrupole alone. 
While the octopole planarity significance is reduced after inpainting and after ISW subtraction, we do not find that it was very anomalous to start with. In the spirit of participating in reproducible research, we make public all codes and resulting products that constitute the main results of this paper here: http://www.cosmostat.org/anomaliesCMB.html

  9. The Integration of COTS/GOTS within NASA's HST Command and Control System

    NASA Technical Reports Server (NTRS)

    Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    NASA's mission critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while significantly reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.

  10. Cosmic microwave background reconstruction from WMAP and Planck PR2 data

    NASA Astrophysics Data System (ADS)

    Bobin, J.; Sureau, F.; Starck, J.-L.

    2016-06-01

    We describe a new estimate of the cosmic microwave background (CMB) intensity map reconstructed by a joint analysis of the full Planck 2015 data (PR2) and nine years of WMAP data. The proposed map provides more than a mere update of the CMB map introduced in a previous paper since it benefits from an improvement of the component separation method L-GMCA (Local-Generalized Morphological Component Analysis), which facilitates efficient separation of correlated components. Based on the most recent CMB data, we further confirm previous results showing that the proposed CMB map estimate exhibits appealing characteristics for astrophysical and cosmological applications: I) it is a full-sky map as it did not require any inpainting or interpolation postprocessing; II) foreground contamination is very low even on the galactic center; and III) the map does not exhibit any detectable trace of thermal Sunyaev-Zel'dovich contamination. We show that its power spectrum is in good agreement with the Planck PR2 official theoretical best-fit power spectrum. Finally, following the principle of reproducible research, we provide the codes to reproduce the L-GMCA, which makes it the only reproducible CMB map. The reconstructed CMB map and the code are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/591/A50

  11. Delivering Breast Reconstruction Information to Patients: Women Report on Preferred Information Delivery Styles and Options.

    PubMed

    Webb, Carmen; Sharma, Vishal; Temple-Oberle, Claire

    2018-02-01

    To discover missed opportunities for providing information to women undergoing breast reconstruction in an effort to decrease regret and improve patient education, teaching modalities, and satisfaction. Thirty- to 45-minute semi-structured interviews were conducted exploring patient experiences with information provision on breast reconstruction. Purposeful sampling was used to include women with a variety of reconstruction types at different time points along their recovery. Using grounded theory methodology, 2 independent reviewers analyzed the transcripts and generated thematic codes based on patient responses. BREAST-Q scores were also collected to compare satisfaction scores with qualitative responses. Patients were interested in a wide variety of topics related to breast reconstruction including the pros and cons of different options, nipple-sparing mastectomies, immediate breast reconstruction, oncological safety/monitoring and the impact of chemotherapy and radiotherapy, secondary procedures (balancing, nipple reconstruction), post-operative recovery, and long-term expectations. Patients valued accessing information from multiple sources, seeing numerous photographs, being guided to reliable information online, and having access to a frequently asked questions file or document. Information delivery via interaction with medical personnel and previously reconstructed patients was most appreciated. Compared with BREAST-Q scores for satisfaction with the plastic surgeon (mean: 95.7, range: 60-100), informational satisfaction scores were lower at 74.7 (50-100), confirming the informational gaps expressed by interviewees. Women having recently undergone breast reconstruction reported key deficiencies in information provided prior to surgery and identified preferred information delivery options. Addressing women's educational needs is important to achieve appropriate expectations and improve satisfaction.

  12. Recording from two neurons: second-order stimulus reconstruction from spike trains and population coding.

    PubMed

    Fernandes, N M; Pinto, B D L; Almeida, L O B; Slaets, J F W; Köberle, R

    2010-10-01

    We study the reconstruction of visual stimuli from spike trains, representing the reconstructed stimulus by a Volterra series up to second order. We illustrate this procedure in a prominent example of spiking neurons, recording simultaneously from the two H1 neurons located in the lobula plate of the fly Chrysomya megacephala. The fly views two types of stimuli, corresponding to rotational and translational displacements. Second-order reconstructions require the manipulation of potentially very large matrices, which obstructs the use of this approach when there are many neurons. We avoid the computation and inversion of these matrices by expanding our variables in a convenient set of basis functions. This requires approximating the spike train four-point functions by combinations of two-point functions, similar to the relations that would hold for Gaussian stochastic processes. In our test case, this approximation does not reduce the quality of the reconstruction. The overall contribution of the second-order kernels to stimulus reconstruction, measured by the mean squared error, is only about 5% of the first-order contribution. Yet at specific stimulus-dependent instants, the addition of second-order kernels represents up to 100% improvement, but only for rotational stimuli. We present a perturbative scheme to facilitate the application of our method to weakly correlated neurons.
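    The structure of the truncated series can be sketched as follows (the kernels and spike times here are illustrative toy choices, not the kernels estimated from the fly H1 recordings):

```python
import numpy as np

t = np.arange(0.0, 1.0, 0.001)                 # time axis (s)
spikes = np.array([0.2, 0.45, 0.5, 0.8])       # toy spike times (s)

def h1(u):
    # illustrative causal first-order kernel
    return np.where(u >= 0.0, np.exp(-u / 0.02) * np.sin(u / 0.01), 0.0)

def h2(u, v):
    # illustrative separable second-order kernel
    return 0.3 * h1(u) * h1(v)

# Volterra reconstruction truncated at second order:
#   s_est(t) = h0 + sum_i h1(t - t_i) + sum_{i<=j} h2(t - t_i, t - t_j)
h0 = 0.1                                       # zeroth-order (mean) term
s_est = np.full_like(t, h0)
for ti in spikes:
    s_est += h1(t - ti)                        # first-order contributions
for ti in spikes:
    for tj in spikes:
        if ti <= tj:
            s_est += h2(t - ti, t - tj)        # pairwise second-order terms
```

    The pairwise sum is what makes the naive second-order problem large: estimating h2 on a grid couples all spike pairs, which is the matrix burden the paper's basis-function expansion avoids.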

  13. Yangians in Integrable Field Theories, Spin Chains and Gauge-String Dualities

    NASA Astrophysics Data System (ADS)

    Spill, Fabian

    In the following paper, which is based on the author's PhD thesis submitted to Imperial College London, we explore the applicability of Yangian symmetry to various integrable models, in particular in relation with S-matrices. One of the main themes in this work is that, after a careful study of the mathematics of the symmetry algebras, one finds that in an integrable model one can directly reconstruct S-matrices just from the algebra. It has been known for a long time that S-matrices in integrable models are fixed by symmetry. However, Lie algebra symmetry, the Yang-Baxter equation, crossing and unitarity, which constrain the S-matrix in integrable models, are often taken to be separate, independent properties of the S-matrix. Here, we construct scattering matrices purely from the Yangian, showing that the Yangian is the right algebraic object to unify all required symmetries of many integrable models. In particular, we reconstruct the S-matrix of the principal chiral field, and, up to a CDD factor, of other integrable field theories with 𝔰𝔲(n) symmetry. Furthermore, we study the AdS/CFT correspondence, which is also believed to be integrable in the planar limit. We reconstruct the S-matrices at weak and at strong coupling from the Yangian or its classical limit. We give a pedagogical introduction into the subject, presenting a unified perspective of Yangians and their applications in physics. This paper should hence be accessible to mathematicians who would like to explore the application of algebraic objects to physics as well as to physicists interested in a deeper understanding of the mathematical origin of physical quantities.

  14. OPTIMASS: a package for the minimization of kinematic mass functions with constraints

    NASA Astrophysics Data System (ADS)

    Cho, Won Sang; Gainer, James S.; Kim, Doojin; Lim, Sung Hak; Matchev, Konstantin T.; Moortgat, Filip; Pape, Luc; Park, Myeonghun

    2016-01-01

    Reconstructed mass variables, such as M2, M2C, MT*, and MT2W, play an essential role in searches for new physics at hadron colliders. The calculation of these variables generally involves constrained minimization in a large parameter space, which is numerically challenging. We provide a C++ code, Optimass, which interfaces with the Minuit library to perform this constrained minimization using the augmented Lagrangian method. The code can be applied to arbitrarily general event topologies, thus allowing the user to significantly extend the existing set of kinematic variables. We describe this code, explain its physics motivation, and demonstrate its use in the analysis of the fully leptonic decay of pair-produced top quarks using M2 variables.
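    A minimal sketch of the augmented Lagrangian method on a toy constrained minimization (this illustrates the algorithm only; it does not use the Optimass/Minuit interface, and the problem and step sizes are assumptions):

```python
import numpy as np

# Toy problem: minimize f(x, y) = x^2 + y^2  subject to  c(x, y) = x + y - 1 = 0.
# Analytic solution: x = y = 0.5 with Lagrange multiplier lambda = -1.
# Augmented Lagrangian: L_mu = f + lmb*c + (mu/2)*c^2; minimize L_mu in an
# inner loop, then update the multiplier in an outer loop.
def solve(mu=10.0, outer=20, inner=200, lr=0.02):
    x = np.zeros(2)
    lmb = 0.0
    for _ in range(outer):
        for _ in range(inner):                 # inner: gradient descent on L_mu
            c = x[0] + x[1] - 1.0
            grad = 2.0 * x + (lmb + mu * c) * np.ones(2)
            x -= lr * grad
        lmb += mu * (x[0] + x[1] - 1.0)        # outer: multiplier update
    return x, lmb

x_opt, lmb_opt = solve()
```

    The same two-loop structure, with the mass function as the objective and the on-shell conditions as constraints, is what the paper implements over general event topologies.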

  15. Wavelet-based audio embedding and audio/video compression

    NASA Astrophysics Data System (ADS)

    Mendenhall, Michael J.; Claypoole, Roger L., Jr.

    2001-12-01

    Watermarking, traditionally used for copyright protection, is used in a new and exciting way. An efficient wavelet-based watermarking technique embeds audio information into a video signal. Several effective compression techniques are applied to compress the resulting audio/video signal in an embedded fashion. This wavelet-based compression algorithm incorporates bit-plane coding, index coding, and Huffman coding. To demonstrate the potential of this audio embedding and audio/video compression algorithm, we embed an audio signal into a video signal and then compress. Results show that overall compression rates of 15:1 can be achieved. The video signal is reconstructed with a median PSNR of nearly 33 dB. Finally, the audio signal is extracted from the compressed audio/video signal without error.
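    Of the coding stages mentioned, Huffman coding is easy to sketch in isolation (a minimal illustrative coder assuming at least two distinct symbols; the bit-plane and index-coding stages are omitted):

```python
import heapq
from collections import Counter

def build_codes(data):
    # standard Huffman construction: repeatedly merge the two lightest subtrees
    freq = Counter(data)
    heap = [[w, [sym, ""]] for sym, w in sorted(freq.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]            # extend codes in the light subtree
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]            # extend codes in the heavy subtree
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

def encode(data, codes):
    return "".join(codes[s] for s in data)

def decode(bits, codes):
    inv = {v: k for k, v in codes.items()}     # prefix-free, so greedy decode works
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inv:
            out.append(inv[cur])
            cur = ""
    return "".join(out)

data = "abracadabra"
codes = build_codes(data)
bits = encode(data, codes)
```

    Frequent symbols receive the shortest codewords, which is why Huffman coding is a natural final stage after bit-plane and index coding have concentrated the symbol statistics.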

  16. Speech coding, reconstruction and recognition using acoustics and electromagnetic waves

    DOEpatents

    Holzrichter, J.F.; Ng, L.C.

    1998-03-17

    The use of EM radiation in conjunction with simultaneously recorded acoustic speech information enables a complete mathematical coding of acoustic speech. The methods include the forming of a feature vector for each pitch period of voiced speech and the forming of feature vectors for each time frame of unvoiced, as well as for combined voiced and unvoiced speech. The methods include how to deconvolve the speech excitation function from the acoustic speech output to describe the transfer function for each time frame. The formation of feature vectors defining all acoustic speech units over well defined time frames can be used for purposes of speech coding, speech compression, speaker identification, language-of-speech identification, speech recognition, speech synthesis, speech translation, speech telephony, and speech teaching. 35 figs.

  17. Speech coding, reconstruction and recognition using acoustics and electromagnetic waves

    DOEpatents

    Holzrichter, John F.; Ng, Lawrence C.

    1998-01-01

    The use of EM radiation in conjunction with simultaneously recorded acoustic speech information enables a complete mathematical coding of acoustic speech. The methods include the forming of a feature vector for each pitch period of voiced speech and the forming of feature vectors for each time frame of unvoiced, as well as for combined voiced and unvoiced speech. The methods include how to deconvolve the speech excitation function from the acoustic speech output to describe the transfer function for each time frame. The formation of feature vectors defining all acoustic speech units over well defined time frames can be used for purposes of speech coding, speech compression, speaker identification, language-of-speech identification, speech recognition, speech synthesis, speech translation, speech telephony, and speech teaching.

  18. Newly-Developed 3D GRMHD Code and its Application to Jet Formation

    NASA Technical Reports Server (NTRS)

    Mizuno, Y.; Nishikawa, K.-I.; Koide, S.; Hardee, P.; Fishman, G. J.

    2006-01-01

    We have developed a new three-dimensional general relativistic magnetohydrodynamic code using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the HLL approximate Riemann solver scheme. The flux-interpolated constrained transport scheme is used to maintain a divergence-free magnetic field. We have performed various one-dimensional test problems in both special and general relativity using several reconstruction methods and found that the new 3D GRMHD code shows substantial improvements over our previous model. Preliminary results show jet formation from a geometrically thin accretion disk near both a non-rotating and a rotating black hole. We will discuss how the jet properties depend on the rotation of the black hole and on the magnetic field strength.
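    Two of the named ingredients, slope-limited reconstruction and an HLL flux, can be sketched on a far simpler problem (the 1D scalar Burgers equation on a periodic grid; an illustration of the building blocks, not the GRMHD code):

```python
import numpy as np

def minmod(a, b):
    # slope limiter: smallest-magnitude one-sided slope, zero at extrema
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def hll_flux(uL, uR):
    # HLL approximate Riemann flux for f(u) = u^2/2 with simple wave-speed bounds
    fL, fR = 0.5 * uL**2, 0.5 * uR**2
    sL, sR = np.minimum(uL, uR), np.maximum(uL, uR)
    middle = (sR * fL - sL * fR + sL * sR * (uR - uL)) / (sR - sL + 1e-300)
    return np.where(sL >= 0.0, fL, np.where(sR <= 0.0, fR, middle))

n, cfl, steps = 200, 0.4, 100
dx = 1.0 / n
xgrid = np.linspace(0.0, 1.0, n, endpoint=False)
u = 0.5 * np.sin(2.0 * np.pi * xgrid)     # smooth data that steepens into a shock
mass0 = u.sum() * dx

for _ in range(steps):
    dt = cfl * dx / np.abs(u).max()
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    uL = u + 0.5 * slope                  # reconstructed left state at face i+1/2
    uR = np.roll(u - 0.5 * slope, -1)     # reconstructed right state at face i+1/2
    F = hll_flux(uL, uR)
    u = u - (dt / dx) * (F - np.roll(F, 1))

mass = u.sum() * dx                       # conserved exactly by the flux form
```

    The conservative update guarantees that the total "mass" is preserved to round-off even across the shock, which is the property that makes such schemes attractive for relativistic flows.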

  19. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    PubMed

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

    We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. A 20-nm half-pitch URA fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory is presented that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
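    The convolution-based decoding can be sketched in 1D (a sketch using a length-15 m-sequence mask, which shares the flat-sidelobe correlation property of URAs; the actual sensor uses 2D arrays):

```python
import numpy as np

def mseq15():
    # 4-bit LFSR, primitive feedback polynomial x^4 + x + 1 (period 15)
    reg = [1, 0, 0, 0]
    bits = []
    for _ in range(15):
        bits.append(reg[-1])
        fb = reg[3] ^ reg[2]
        reg = [fb] + reg[:-1]
    return np.array(bits)

N = 15
a = mseq15()            # open (1) / closed (0) aperture pattern, 8 open slits
g = 2 * a - 1           # matched decoding array: correlating a with g gives a delta

obj = np.zeros(N)
obj[3], obj[9] = 1.0, 0.5    # two point sources

# The detector records the object circularly convolved with the mask ...
meas = np.array([sum(obj[k] * a[(i - k) % N] for k in range(N)) for i in range(N)])
# ... and a single circular correlation with g recovers (N+1)/2 * obj exactly.
recon = np.array([sum(meas[(i + j) % N] * g[j] for j in range(N)) for i in range(N)])
```

    The delta-like cross-correlation between the mask and its decoding array is what lets the scanned-aperture sensor trade a pixelated detector for a single photodiode plus a digital convolution.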

  20. An Implementation Of Elias Delta Code And ElGamal Algorithm In Image Compression And Security

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Andri Budiman, Mohammad; Saffiera, Cut Amalia

    2018-01-01

    In data transmission, such as transferring an image, confidentiality, integrity, and efficiency of data storage are highly needed. To maintain the confidentiality and integrity of data, one of the techniques used is ElGamal. The strength of this algorithm rests on the difficulty of computing discrete logarithms modulo a large prime. ElGamal belongs to the class of asymmetric-key algorithms and enlarges the file size; therefore, data compression is required. Elias delta coding is one of the compression algorithms that use a delta code table. The image was first compressed using the Elias delta code algorithm, and the result of the compression was then encrypted using the ElGamal algorithm. Primality testing was implemented using the Agrawal-Biswas algorithm. The results showed that the ElGamal method could maintain the confidentiality and integrity of data, with MSE and PSNR values of 0 and infinity, respectively. The Elias delta code method achieved average compression ratio and space savings of 62.49% and 37.51%, respectively.
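    The Elias delta stage is compact enough to sketch directly (a minimal encoder/decoder for positive integers; the ElGamal encryption stage is omitted):

```python
def delta_encode(n):
    # Elias delta: gamma-code the bit-length L of n, then append the low
    # L-1 bits of n (its leading 1-bit is implied).
    assert n >= 1
    L = n.bit_length()
    out = "0" * (L.bit_length() - 1)   # unary prefix for the length field
    out += format(L, "b")              # binary of L (leading bit is 1)
    out += format(n, "b")[1:]          # n with its leading 1-bit dropped
    return out

def delta_decode(bits):
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    L = int(bits[zeros:2 * zeros + 1], 2)      # length field: zeros+1 bits
    body = bits[2 * zeros + 1:2 * zeros + L]   # remaining L-1 bits of n
    return int("1" + body, 2)

code5 = delta_encode(5)                # "01101"
```

    Small, frequent symbol values map to short codewords, which is the source of the compression gain reported above.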

  1. CT Evaluation of Small-Diameter Coronary Artery Stents: Effect of an Integrated Circuit Detector with Iterative Reconstruction.

    PubMed

    Geyer, Lucas L; Glenn, G Russell; De Cecco, Carlo Nicola; Van Horn, Mark; Canstein, Christian; Silverman, Justin R; Krazinski, Aleksander W; Kemper, Jenny M; Bucher, Andreas; Ebersberger, Ullrich; Costello, Philip; Bamberg, Fabian; Schoepf, U Joseph

    2015-09-01

    To use suitable objective methods of analysis to assess the influence of the combination of an integrated-circuit computed tomographic (CT) detector and iterative reconstruction (IR) algorithms on the visualization of small (≤3-mm) coronary artery stents. By using a moving heart phantom, 18 data sets obtained from three coronary artery stents with small diameters were investigated. A second-generation dual-source CT system equipped with an integrated-circuit detector was used. Images were reconstructed with filtered back-projection (FBP) and IR at a section thickness of 0.75 mm (FBP75 and IR75, respectively) and IR at a section thickness of 0.50 mm (IR50). Multirow intensity profiles in Hounsfield units were modeled by using a sum-of-Gaussians fit to analyze in-plane image characteristics. Out-of-plane image characteristics were analyzed with z upslope of multicolumn intensity profiles in Hounsfield units. Statistical analysis was conducted with one-way analysis of variance and the Student t test. Independent of stent diameter and heart rate, IR75 resulted in significantly increased xy sharpness, signal-to-noise ratio, and contrast-to-noise ratio, as well as decreased blurring and noise compared with FBP75 (eg, 2.25-mm stent, 0 beats per minute; xy sharpness, 278.2 vs 252.3; signal-to-noise ratio, 46.6 vs 33.5; contrast-to-noise ratio, 26.0 vs 16.8; blurring, 1.4 vs 1.5; noise, 15.4 vs 21.2; all P < .001). In the z direction, the upslopes were substantially higher in the IR50 reconstructions (2.25-mm stent: IR50, 94.0; IR75, 53.1; and FBP75, 48.1; P < .001). The implementation of an integrated-circuit CT detector provides substantially sharper out-of-plane resolution of coronary artery stents at 0.5-mm section thickness, while the use of iterative image reconstruction mostly improves in-plane stent visualization.

  2. Tephra dispersal and fallout reconstructed integrating field, ground-based and satellite-based data: Application to the 23rd November 2013 Etna paroxysm

    NASA Astrophysics Data System (ADS)

    Poret, M.; Corradini, S.; Merucci, L.; Costa, A.; Andronico, D.; Montopoli, M.; Vulpiani, G.; Scollo, S.; Freret-Lorgeril, V.

    2017-12-01

    On 23rd November 2013, Etna erupted, producing one of the most intense lava fountains on record. The eruption generated a buoyant plume that rose higher than 10 km a.s.l., from which two volcanic clouds were observed from satellite at two different atmospheric levels. A previous study using remote sensing instruments described one of the two clouds as mainly composed of ash, whereas the second cloud consisted of ice/SO2 droplets and was not measurable in terms of ash mass. Both clouds spread out under north-easterly winds, transporting the tephra from Etna towards the Puglia region. The unusual meteorological conditions made it possible to collect tephra samples both in areas proximal to the Etna source and as far away as the Calabria region. The eruption was observed by satellite (MSG-SEVIRI, MODIS) and ground-based (X-band weather radar, VIS/IR cameras and L-band Doppler radar) remote sensing systems. This study uses the FALL3D code to model the evolution of the plume and the tephra deposition, constraining the simulation results with remote sensing products for the volcanic cloud (cloud height, fine ash mass - Ma, aerosol optical depth at 0.55 μm - AOD). Among the input parameters, the total grain-size distribution (TGSD) is reconstructed by integrating field deposits with estimates from the X-band radar data. The optimal TGSD was selected through an inverse method that best fits both the field deposits and the airborne measurements. The simulations capture the main behaviour of the two volcanic clouds at their respective altitudes. The best agreement between the simulated Ma and AOD and the SEVIRI retrievals indicates a PM20 fraction of 3.4%. The total erupted mass is estimated at 1.6 × 10⁹ kg, consistent with the estimates from remote sensing data (3.0 × 10⁹ kg) and the ground deposit (1.3 × 10⁹ kg).

  3. [Alternatives to animal testing].

    PubMed

    Fabre, Isabelle

    2009-11-01

    The use of alternative methods to animal testing is an integral part of the 3Rs concept (refine, reduce, replace) defined by Russell & Burch in 1959. These approaches include in silico methods (databases and computer models), in vitro physicochemical analysis, biological methods using bacteria or isolated cells, reconstructed enzyme systems, and reconstructed tissues. Emerging "omic" methods used in integrated approaches further help to reduce animal use, while stem cells offer promising approaches to toxicological and pathophysiological studies, along with organotypic cultures and bio-artificial organs. Only a few alternative methods can so far be used as stand-alone substitutes for animal testing. The best way to use these methods is to combine them in integrated testing strategies (ITS), in which animals are used only as a last resort.

  4. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    PubMed

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
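    The core idea, approximately factoring a dense space-varying operator so that matrix-vector products become cheap, can be illustrated with a truncated SVD. Note that this low-rank sketch is only an analogy: the paper's matrix source coding applies lossy source coding to the transformed operator rather than a plain SVD, and the blur model below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
# A dense "space-varying blur": Gaussian impulse response whose width grows
# slowly across the row index, loosely mimicking a stray-light point spread.
i = np.arange(n)
widths = 10.0 + 20.0 * i / n
A = np.exp(-0.5 * ((i[None, :] - i[:, None]) / widths[:, None]) ** 2)
A /= A.sum(axis=1, keepdims=True)            # normalize each impulse response

# Factor A ~= U_r @ V_r with a truncated SVD: two skinny factors replace one
# dense matrix, so a matvec costs O(n*r) instead of O(n^2).
U, s, Vt = np.linalg.svd(A)
r = 32                                        # rank kept (storage/accuracy trade-off)
U_r = U[:, :r] * s[:r]
V_r = Vt[:r, :]

x = rng.standard_normal(n)
y_exact = A @ x                               # O(n^2) dense product
y_fast = U_r @ (V_r @ x)                      # O(n*r) factored product

rel_err = np.linalg.norm(y_fast - y_exact) / np.linalg.norm(y_exact)
print(f"rank-{r} matvec relative error: {rel_err:.2e}")
```

    Because the impulse response varies slowly, the operator's spectrum decays quickly and a modest rank reproduces the product accurately, which is the same property matrix source coding exploits with sparse transforms.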

  5. EXTRAPOLATION OF THE SOLAR CORONAL MAGNETIC FIELD FROM SDO/HMI MAGNETOGRAM BY A CESE-MHD-NLFFF CODE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang Chaowei; Feng Xueshang, E-mail: cwjiang@spaceweather.ac.cn, E-mail: fengx@spaceweather.ac.cn

    Due to the absence of direct measurement, the magnetic field in the solar corona is usually extrapolated from the photosphere numerically. At present, the nonlinear force-free field (NLFFF) model dominates the physical models used for field extrapolation in the low corona. Recently, we developed a new NLFFF model with MHD relaxation to reconstruct the coronal magnetic field. This method is based on the CESE-MHD model with the conservation-element/solution-element (CESE) spacetime scheme. In this paper, we report the application of the CESE-MHD-NLFFF code to Solar Dynamics Observatory/Helioseismic and Magnetic Imager (SDO/HMI) data, with magnetograms sampled for two active regions (ARs), NOAA AR 11158 and AR 11283, both of which were very non-potential, producing X-class flares and eruptions. The raw magnetograms are preprocessed to remove the force and then input into the extrapolation code. Qualitative comparison of the results with SDO/AIA images shows that our code can reconstruct magnetic field lines resembling the EUV-observed coronal loops. The most important structures of the ARs are reproduced very well, such as the highly sheared field lines that suspend filaments in AR 11158 and the twisted flux rope that corresponds to a sigmoid in AR 11283. Quantitative assessment of the results shows that the force-free constraint is fulfilled very well in the strong-field regions, but less well in the weak-field regions because of data noise and numerical errors in the small currents.

  6. Green reconstruction of the tsunami-affected areas in India using the integrated coastal zone management concept.

    PubMed

    Sonak, Sangeeta; Pangam, Prajwala; Giriyan, Asha

    2008-10-01

    A tsunami, triggered by a massive undersea earthquake off Sumatra in Indonesia, devastated the lives, property and infrastructure of coastal communities in the coastal states of India, the Andaman and Nicobar Islands, Indonesia, Sri Lanka, Malaysia and Thailand. This event attracted the attention of environmental managers at all levels: local, national, regional and global. It also shifted the focus from the impact of human activities on the environment to the impacts of natural hazards. Recovery and reconstruction of these areas is highly challenging and requires a clear understanding of the complex dynamics of the coast and of the challenges faced by its many stakeholders. Issues such as sustainability, equity and community participation assume importance. The concept of integrated coastal zone management (ICZM) has been used effectively in most parts of the world. It emphasizes holistic assessment of the coast and multidisciplinary analysis using participatory processes, integrating anthropocentric and eco-centric approaches. This paper documents several issues involved in the recovery of the tsunami-affected areas and recommends applying the ICZM concept to the reconstruction efforts.

  7. Magnetic resonance image compression using scalar-vector quantization

    NASA Astrophysics Data System (ADS)

    Mohsenian, Nader; Shahri, Homayoun

    1995-12-01

    A new coding scheme based on the scalar-vector quantizer (SVQ) is developed for the compression of medical images. SVQ is a fixed-rate encoder whose rate-distortion performance is close to that of optimal entropy-constrained scalar quantizers (ECSQs) for memoryless sources. The use of a fixed-rate quantizer is expected to eliminate some of the complexity issues of using variable-length scalar quantizers. When transmission of images over noisy channels is considered, our coding scheme does not suffer from the error propagation that is typical of coding schemes using variable-length codes. For a set of magnetic resonance (MR) images, coding results obtained from SVQ and ECSQ at low bit rates are indistinguishable. Furthermore, our encoded images are perceptually indistinguishable from the originals when displayed on a monitor. This makes our SVQ-based coder an attractive compression scheme for picture archiving and communication systems (PACS), currently under consideration for an all-digital radiology environment in hospitals, where reliable transmission, storage, and high-fidelity reconstruction of images are desired.
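    The fixed-rate property is easiest to see in the plain scalar case: every sample is encoded with the same number of bits, so channel errors cannot desynchronize the bitstream as they can with variable-length codes. This sketch shows a uniform fixed-rate scalar quantizer as a baseline; SVQ itself is more sophisticated (it jointly allocates a fixed total rate across a vector of samples), and the range and rate below are assumed values:

```python
import numpy as np

def quantize(x, n_bits, lo, hi):
    """Fixed-rate uniform scalar quantizer: every sample costs exactly n_bits."""
    levels = 2 ** n_bits
    step = (hi - lo) / levels
    # Code indices are clipped so overload samples map to the extreme cells.
    idx = np.clip(np.floor((x - lo) / step), 0, levels - 1).astype(int)
    return idx, lo + (idx + 0.5) * step       # indices and midpoint reconstruction

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 10_000)              # memoryless Gaussian source
idx, x_hat = quantize(x, n_bits=6, lo=-4.0, hi=4.0)

# Rate is fixed at 6 bits/sample; distortion is the mean squared error.
mse = np.mean((x - x_hat) ** 2)
print(f"rate: 6 bits/sample, MSE: {mse:.5f}")
```

    For a fine step size the granular distortion approaches step²/12, the classic high-rate approximation, regardless of where in the bitstream a sample sits.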

  8. Label consistent K-SVD: learning a discriminative dictionary for recognition.

    PubMed

    Jiang, Zhuolin; Lin, Zhe; Davis, Larry S

    2013-11-01

    A label consistent K-SVD (LC-KSVD) algorithm for learning a discriminative dictionary for sparse coding is presented. In addition to using the class labels of the training data, we also associate label information with each dictionary item (column of the dictionary matrix) to enforce discriminability in the sparse codes produced during dictionary learning. More specifically, we introduce a new label-consistency constraint called the "discriminative sparse-code error" and combine it with the reconstruction error and the classification error to form a unified objective function, whose optimal solution is efficiently obtained with the K-SVD algorithm. Our algorithm jointly learns a single overcomplete dictionary and an optimal linear classifier, yielding dictionaries in which feature points with the same class labels have similar sparse codes. An incremental dictionary learning algorithm is also presented for situations with limited memory resources. Experimental results demonstrate that our algorithm outperforms many recently proposed sparse-coding techniques for face, action, scene, and object category recognition under the same learning conditions.
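    Up to notation, the unified objective combining the three error terms takes the following form, where Y holds the training signals, D the dictionary, X the sparse codes, Q the ideal "discriminative" sparse codes (the label-consistency targets), A a linear transform, H the class-label matrix, W the linear classifier, and α and β the weighting parameters:

```latex
\min_{D,W,A,X} \; \|Y - DX\|_F^2
  + \alpha \, \|Q - AX\|_F^2
  + \beta \, \|H - WX\|_F^2
  \quad \text{s.t. } \forall i, \; \|x_i\|_0 \le T
```

    Stacking the three data matrices (and correspondingly D, A, W) turns this into a single K-SVD problem, which is how the optimal solution is obtained efficiently.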

  9. Model Based Iterative Reconstruction for Bright Field Electron Tomography (Postprint)

    DTIC Science & Technology

    2013-02-01

    which is based on iterative coordinate descent (ICD), works by constructing a substitute for the original cost at every point, and minimizing this...using Beer's law. Thus the projection integral corresponding to the ith measurement is given by log(λ_D/λ_i). There can be cases in which the dosage λ_D...%Inputs: Measurements g, Initial reconstruction f′, Initial dosage d′, Fraction of entries to reject R %Outputs: Reconstruction f̂ and dosage parameter d̂

  10. Tree-ring-based drought reconstruction in the Iberian Range (east of Spain) since 1694

    NASA Astrophysics Data System (ADS)

    Tejedor, Ernesto; de Luis, Martín; Cuadrat, José María; Esper, Jan; Saz, Miguel Ángel

    2016-03-01

    Droughts are a recurrent phenomenon in the Mediterranean basin with negative consequences for society, economic activities, and natural systems. Nevertheless, the study of drought recurrence and severity in Spain has been limited so far by the relatively short instrumental period. In this work, we present a reconstruction of the standardized precipitation index (SPI) for the Iberian Range. Growth variations and climatic signals within the network are assessed by developing a correlation matrix, and the data are combined into a single chronology integrating 336 samples from 169 trees of five different pine species distributed throughout the province of Teruel. The new chronology, calibrated against regional instrumental climatic data, shows a high and stable correlation with the July SPI integrating moisture conditions over 12 months, forming the basis for a 318-year drought reconstruction. The climate signal contained in this reconstruction is highly significant (p < 0.05) and spatially robust over the interior areas of Spain located above 1000 meters above sea level (masl). According to our SPI reconstruction, seven substantially dry and five wet periods are identified since the late seventeenth century, considering deviations of ≥±1.76 standard deviations. In addition, 36 drought and 28 pluvial years were identified. Some of these years, such as 1725, 1741, 1803, and 1879, are also revealed in drought reconstructions from Romania and Turkey, suggesting that coherent larger-scale synoptic patterns drove these extreme deviations. Since regional drought deviations are also retained in historical documents, the tree-ring-based reconstruction presented here will allow us to cross-validate drought frequency and magnitude in a highly vulnerable region.
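    The SPI itself is conventionally computed by fitting a gamma distribution to accumulated precipitation and mapping each total to a standard-normal quantile. This minimal sketch shows that standard computation on synthetic data only; the paper's actual SPI series is reconstructed from the tree-ring chronology via calibration, which is not reproduced here, and the gamma parameters below are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic 12-month precipitation totals (mm) for a 318-year record.
precip = rng.gamma(shape=4.0, scale=120.0, size=318)

# Fit a gamma distribution to the totals (location fixed at zero, as is standard).
a, loc, scale = stats.gamma.fit(precip, floc=0)

# SPI: map each total through the fitted CDF, then to standard-normal quantiles.
spi = stats.norm.ppf(stats.gamma.cdf(precip, a, loc=loc, scale=scale))

# Deviations beyond +/-1.76 standard deviations mark substantially wet/dry years,
# the threshold used in the reconstruction above.
extreme_years = int(np.sum(np.abs(spi) >= 1.76))
print(f"SPI mean ~ {spi.mean():.2f}, extreme years: {extreme_years}")
```

    By construction the resulting series is approximately standard normal, which is what makes SPI values comparable across regions and periods.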

  11. Tree-ring-based drought reconstruction in the Iberian Range (east of Spain) since 1694.

    PubMed

    Tejedor, Ernesto; de Luis, Martín; Cuadrat, José María; Esper, Jan; Saz, Miguel Ángel

    2016-03-01

    Droughts are a recurrent phenomenon in the Mediterranean basin with negative consequences for society, economic activities, and natural systems. Nevertheless, the study of drought recurrence and severity in Spain has been limited so far by the relatively short instrumental period. In this work, we present a reconstruction of the standardized precipitation index (SPI) for the Iberian Range. Growth variations and climatic signals within the network are assessed by developing a correlation matrix, and the data are combined into a single chronology integrating 336 samples from 169 trees of five different pine species distributed throughout the province of Teruel. The new chronology, calibrated against regional instrumental climatic data, shows a high and stable correlation with the July SPI integrating moisture conditions over 12 months, forming the basis for a 318-year drought reconstruction. The climate signal contained in this reconstruction is highly significant (p < 0.05) and spatially robust over the interior areas of Spain located above 1000 meters above sea level (masl). According to our SPI reconstruction, seven substantially dry and five wet periods are identified since the late seventeenth century, considering deviations of ≥±1.76 standard deviations. In addition, 36 drought and 28 pluvial years were identified. Some of these years, such as 1725, 1741, 1803, and 1879, are also revealed in drought reconstructions from Romania and Turkey, suggesting that coherent larger-scale synoptic patterns drove these extreme deviations. Since regional drought deviations are also retained in historical documents, the tree-ring-based reconstruction presented here will allow us to cross-validate drought frequency and magnitude in a highly vulnerable region.

  12. Axial Cone Beam Reconstruction by Weighted BPF/DBPF and Orthogonal Butterfly Filtering

    PubMed Central

    Tang, Shaojie; Tang, Xiangyang

    2016-01-01

    Goal: The backprojection-filtration (BPF) and derivative backprojection filtered (DBPF) algorithms, which share Hilbert filtering as their common algorithmic feature, were originally derived for exact helical reconstruction from cone beam (CB) scan data and for axial reconstruction from fan beam data, respectively. These two algorithms can be heuristically extended to image reconstruction from axial CB scan data, but they induce severe artifacts in images located away from the central plane determined by the circular source trajectory. We propose an algorithmic solution to eliminate these artifacts. Methods: The solution integrates the three-dimensional (3D) weighted axial CB-BPF/DBPF algorithm with orthogonal butterfly filtering, i.e., axial CB-BPF/DBPF cascaded with orthogonal butterfly filtering. We evaluate the performance of the proposed algorithm using the computer-simulated Forbild head and thoracic phantoms, which are rigorous tests of reconstruction accuracy, and an anthropomorphic thoracic phantom with projection data acquired by a CT scanner. Results: Preliminary results show that the orthogonal butterfly filtering eliminates the severe streak artifacts at off-central planes in images reconstructed by the 3D weighted axial CB-BPF/DBPF algorithm. Conclusion: Integrated with orthogonal butterfly filtering, the 3D weighted CB-BPF/DBPF algorithm performs at least as well as the 3D weighted CB-FBP algorithm in image reconstruction from axial CB scan data. Significance: The proposed 3D weighted axial CB-BPF/DBPF cascaded with orthogonal butterfly filtering can serve as an algorithmic solution for CT imaging in extensive clinical and preclinical applications. PMID:26660512
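    Hilbert filtering, the feature the two algorithms share, can be applied along a 1D profile via the FFT. This sketch shows only the forward Hilbert transform on a test signal, not the finite inverse Hilbert step or the weighting used in the actual BPF/DBPF reconstruction chain:

```python
import numpy as np

def hilbert_filter(signal):
    """Apply the Hilbert transform along a 1D profile via the FFT.

    In the frequency domain the filter is H(w) = -i * sign(w), which is the
    building block applied along chords of the backprojected data in
    BPF/DBPF-type reconstruction.
    """
    n = signal.size
    F = np.fft.fft(signal)
    w = np.fft.fftfreq(n)
    F *= -1j * np.sign(w)                  # DC component is zeroed by sign(0)=0
    return np.real(np.fft.ifft(F))

# Sanity check: the Hilbert transform of cos is sin under this convention.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
h = hilbert_filter(np.cos(4 * t))
err = np.max(np.abs(h - np.sin(4 * t)))
print(f"max deviation from sin: {err:.2e}")
```

    The practical difficulty the paper addresses is that this filtering direction interacts badly with the missing data of an axial circular scan, which is what the orthogonal butterfly filtering corrects.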

  13. Transcriptional landscapes of Axolotl (Ambystoma mexicanum).

    PubMed

    Caballero-Pérez, Juan; Espinal-Centeno, Annie; Falcon, Francisco; García-Ortega, Luis F; Curiel-Quesada, Everardo; Cruz-Hernández, Andrés; Bako, Laszlo; Chen, Xuemei; Martínez, Octavio; Alberto Arteaga-Vázquez, Mario; Herrera-Estrella, Luis; Cruz-Ramírez, Alfredo

    2018-01-15

    The axolotl (Ambystoma mexicanum) is the vertebrate model system with the highest regeneration capacity. Experimental tools established over the past 100 years have been fundamental to start unraveling the cellular and molecular basis of tissue and limb regeneration. In the absence of a reference genome for the axolotl, transcriptomic analysis becomes fundamental to understanding the genetic basis of regeneration. Here we present one of the most diverse transcriptomic data sets for the axolotl, profiling coding and non-coding RNAs from diverse tissues. We reconstructed a population of 115,906 putative protein-coding mRNAs as full ORFs (including isoforms). We also identified 352 conserved miRNAs and 297 novel putative mature miRNAs. Systematic enrichment analysis of gene expression allowed us to identify tissue-specific protein-coding transcripts. We also found putative novel and conserved microRNAs that potentially target mRNAs reported as important disease candidates in heart and liver. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. High-resolution coded-aperture design for compressive X-ray tomography using low resolution detectors

    NASA Astrophysics Data System (ADS)

    Mojica, Edson; Pertuz, Said; Arguello, Henry

    2017-12-01

    One of the main challenges in Computed Tomography (CT) is obtaining accurate reconstructions of the imaged object while keeping a low radiation dose in the acquisition process. In order to solve this problem, several researchers have proposed the use of compressed sensing for reducing the amount of measurements required to perform CT. This paper tackles the problem of designing high-resolution coded apertures for compressed sensing computed tomography. In contrast to previous approaches, we aim at designing apertures to be used with low-resolution detectors in order to achieve super-resolution. The proposed method iteratively improves random coded apertures using a gradient descent algorithm subject to constraints in the coherence and homogeneity of the compressive sensing matrix induced by the coded aperture. Experiments with different test sets show consistent results for different transmittances, number of shots and super-resolution factors.
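    The design criterion above constrains the coherence of the compressive sensing matrix induced by the coded aperture. As an illustration of that metric only (the paper uses a gradient descent refinement, which is not reproduced here; the matrix sizes, transmittance, and best-of-many selection below are all assumptions), one can score random binary apertures by their mutual coherence:

```python
import numpy as np

def mutual_coherence(Phi):
    """Largest normalized inner product between distinct columns of a sensing matrix."""
    G = Phi / np.linalg.norm(Phi, axis=0, keepdims=True)  # unit-norm columns
    gram = np.abs(G.T @ G)
    np.fill_diagonal(gram, 0.0)                           # ignore self-products
    return gram.max()

rng = np.random.default_rng(4)
m, n = 64, 256                 # low-res measurements vs. high-res unknowns
best = None
# A crude stand-in for gradient-based design: keep the best of many random
# binary apertures (transmittance ~0.5) under the coherence criterion.
for _ in range(200):
    Phi = rng.integers(0, 2, size=(m, n)).astype(float)
    mu = mutual_coherence(Phi)
    if best is None or mu < best[0]:
        best = (mu, Phi)

print(f"best coherence over 200 random apertures: {best[0]:.3f}")
```

    Lower coherence gives better compressed-sensing recovery guarantees, which is why the iterative design pushes the coded aperture in that direction while also enforcing homogeneity.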

  15. Motion Detection in Ultrasound Image-Sequences Using Tensor Voting

    NASA Astrophysics Data System (ADS)

    Inba, Masafumi; Yanagida, Hirotaka; Tamura, Yasutaka

    2008-05-01

    Motion detection in ultrasound image sequences using tensor voting is described. We have been developing an ultrasound imaging system that combines coded excitation and synthetic aperture focusing techniques. In our method, the frame rate of the system at a distance of 150 mm reaches 5000 frames/s. A sparse array and short-duration coded ultrasound signals are used for high-speed data acquisition. However, many artifacts appear in the reconstructed image sequences because of the incompleteness of the transmitted code. To reduce these artifacts, we have examined the application of tensor voting to the imaging method that adopts both coded excitation and synthetic aperture techniques. In this study, the basis for applying tensor voting and the motion detection method to ultrasound images is derived. It was confirmed that velocity detection and feature enhancement are possible using tensor voting in the time and space of simulated three-dimensional ultrasound image sequences.

  16. 3D scene reconstruction based on multi-view distributed video coding in the Zernike domain for mobile applications

    NASA Astrophysics Data System (ADS)

    Palma, V.; Carli, M.; Neri, A.

    2011-02-01

    In this paper, a multi-view distributed video coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moment domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low-complexity video encoders compared with their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve this quality, spatial view compensation/prediction in the Zernike moment domain is applied. Spatial and temporal motion activity are fused together to obtain the overall side information. The proposed method has been evaluated through rate-distortion performance for different inter-view and temporal estimation quality conditions.

  17. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.

    In this research, collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated into the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies approximating the SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. Lastly, this improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform, and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  18. Compressed sensing for energy-efficient wireless telemonitoring of noninvasive fetal ECG via block sparse Bayesian learning.

    PubMed

    Zhang, Zhilin; Jung, Tzyy-Ping; Makeig, Scott; Rao, Bhaskar D

    2013-02-01

    Fetal ECG (FECG) telemonitoring is an important branch of telemedicine. The design of a telemonitoring system via a wireless body area network with low energy consumption for ambulatory use is highly desirable. As an emerging technique, compressed sensing (CS) shows great promise for compressing/reconstructing data with low energy consumption. However, due to some specific characteristics of raw FECG recordings, such as non-sparsity and strong noise contamination, current CS algorithms generally fail in this application. This paper proposes using the block sparse Bayesian learning framework to compress/reconstruct non-sparse raw FECG recordings. Experimental results show that the framework can reconstruct the raw recordings with high quality. In particular, the reconstruction does not destroy the interdependence relations among the multichannel recordings, which ensures that an independent component analysis decomposition of the reconstructed recordings has high fidelity. Furthermore, the framework allows the use of a sparse binary sensing matrix with far fewer nonzero entries to compress the recordings; in particular, each column of the matrix can contain only two nonzero entries. Compared with other algorithms, such as current CS algorithms and wavelet algorithms, the framework can thus greatly reduce CPU execution time in the data compression stage.
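    The energy saving in the compression stage comes from the structure of the sensing matrix: with exactly two 1s per column, computing y = Φx requires only additions, no multiplications. This sketch constructs such a matrix and compresses a synthetic non-sparse signal; the dimensions are assumptions, and the block sparse Bayesian learning reconstruction is not shown:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m = 512, 256                     # samples per epoch -> compressed measurements

# Sparse binary sensing matrix: exactly two nonzero (unit) entries per column,
# the extreme case the framework permits.
Phi = np.zeros((m, n))
for j in range(n):
    rows = rng.choice(m, size=2, replace=False)
    Phi[rows, j] = 1.0

# A non-sparse, meandering signal standing in for a raw FECG channel.
x = np.cumsum(rng.standard_normal(n))
y = Phi @ x                         # compression: each sample joins two sums

nnz = int(Phi.sum())
print(f"measurements: {m}, adds in compression: {nnz} "
      f"(a dense Gaussian matrix would need {m * n} multiply-adds)")
```

    In an on-body sensor this means the encoder only accumulates pairs of samples, which is why the approach suits low-energy wireless telemonitoring.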

  19. Non-axisymmetric equilibrium reconstruction and suppression of density limit disruptions in a current-carrying stellarator

    NASA Astrophysics Data System (ADS)

    Ma, Xinxing; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Knowlton, S. F.; Maurer, D. A.

    2017-10-01

    Non-axisymmetric equilibrium reconstructions have been routinely performed with the V3FIT code on the Compact Toroidal Hybrid (CTH), a stellarator/tokamak hybrid. In addition to 50 external magnetic measurements, 160 SXR emissivity measurements are incorporated into V3FIT to reconstruct the magnetic flux surface geometry and infer the current distribution within the plasma. Improved reconstructions of the current and q profiles provide insight into the physics of the density limit disruptions observed in current-carrying discharges in CTH. It is confirmed that the final scenario of the CTH density limit is consistent with classic observations in tokamaks: current-profile shrinkage leads to growing MHD instabilities (tearing modes), followed by a loss of MHD equilibrium. It is also observed that the density limit at a given current increases linearly with increasing amounts of 3D shaping fields. Consequently, plasmas with densities up to two times the Greenwald limit are attained. Equilibrium reconstructions show that the addition of 3D fields effectively moves resonance surfaces towards the edge of the plasma, where the current-profile gradient is smaller, providing a stabilizing effect. This work is supported by US Department of Energy Grant No. DE-FG02-00ER54610.

  20. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Herfindal, J. L.; Howell, E. C.; Knowlton, S. F.; Maurer, D. A.; Traverso, P. J.

    2018-01-01

    Collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated in the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies the use of approximating SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. This improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  1. Determination of current and rotational transform profiles in a current-carrying stellarator using soft x-ray emissivity measurements

    DOE PAGES

    Ma, X.; Cianciosa, M. R.; Ennis, D. A.; ...

    2018-01-31

    In this research, collimated soft X-ray (SXR) emissivity measurements from multi-channel cameras on the Compact Toroidal Hybrid (CTH) tokamak/torsatron device are incorporated into the 3D equilibrium reconstruction code V3FIT to reconstruct the shape of flux surfaces and infer the current distribution within the plasma. Equilibrium reconstructions of sawtoothing plasmas that use data from both SXR and external magnetic diagnostics show the central safety factor to be near unity under the assumption that SXR iso-emissivity contours lie on magnetic flux surfaces. The reconstruction results are consistent with those using the external magnetic data and a constraint on the location of q = 1 surfaces determined from the sawtooth inversion surface extracted from SXR brightness profiles. The agreement justifies approximating the SXR emission as a flux function in CTH, at least within the core of the plasma, subject to the spatial resolution of the SXR diagnostics. Lastly, this improved reconstruction of the central current density indicates that the current profile peakedness decreases with increasing external transform, and that the internal inductance is not a relevant measure of how peaked the current profile is in hybrid discharges.

  2. Integrated modelling framework for short pulse high energy density physics experiments

    NASA Astrophysics Data System (ADS)

    Sircombe, N. J.; Hughes, S. J.; Ramsay, M. G.

    2016-03-01

    Modelling experimental campaigns on the Orion laser at AWE, and developing a viable point design for fast ignition (FI), calls for a multi-scale approach; a complete description of the problem would require an extensive range of physics which cannot realistically be included in a single code. Modelling the laser-plasma interaction (LPI) needs a fine mesh that can capture the dispersion of electromagnetic waves, and a kinetic model for each plasma species. In the dense material of the bulk target, away from the LPI region, collisional physics dominates. The transport of hot particles generated by the action of the laser depends on their slowing and stopping in the dense material and on their need to draw a return current. These effects heat the target, which in turn influences transport. On longer timescales, the hydrodynamic response of the target begins to play a role as the pressure generated by isochoric heating takes effect. Recent effort at AWE [1] has focussed on the development of an integrated code suite based on the particle-in-cell code EPOCH, to model the LPI; the Monte Carlo electron transport code THOR, to model the onward transport of hot electrons; and the radiation hydrodynamics code CORVUS, to model the hydrodynamic response of the target. We outline the methodology adopted, describe the advantages of a robustly integrated code suite over a single-code approach, demonstrate the suite's application to modelling the heating of buried layers on Orion, and assess the potential of such experiments for validating modelling capability in advance of more ambitious HEDP experiments, as a step towards a predictive modelling capability for FI.

  3. Beyond Honour Codes: Bringing Students into the Academic Integrity Equation

    ERIC Educational Resources Information Center

    Richards, Deborah; Saddiqui, Sonia; McGuigan, Nicholas; Homewood, Judi

    2016-01-01

    Honour codes represent a successful and unique, student-led, "bottom-up" approach to the promotion of academic integrity (AI). With increased flexibility, globalisation and distance or blended education options, most institutions operate in very different climates and cultures from the US institutions that have a long-established culture…

  4. The Integration of Linguistic Theory: Internal Reconstruction and the Comparative Method in Descriptive Linguistics.

    ERIC Educational Resources Information Center

    Bailey, Charles-James N.

    The author aims: (1) to show that generative phonology uses essentially the method of internal reconstruction which has previously been employed only in diachronic studies in setting up synchronic underlying phonological representations; (2) to show why synchronic analysis should add the comparative method to its arsenal, together with whatever…

  5. Generative Role of Experiments in Physics and in Teaching Physics: A Suggestion for Epistemological Reconstruction

    ERIC Educational Resources Information Center

    Koponen, Ismo T.; Mantyla, Terhi

    2006-01-01

    In physics teaching experimentality is an integral component in giving the starting point of knowledge formation and conceptualization. However, epistemology of experiments is not often addressed directly in the educational and pedagogical literature. This warrants an attempt to produce an acceptable reconstruction of the epistemological role of…

  6. A Better Management Information System Is Needed to Promote Information Sharing, Effective Planning, and Coordination of Afghanistan Reconstruction Activities

    DTIC Science & Technology

    2009-07-30

    management information systems for collecting data on their reconstruction activities, but there is no single management information system that provides...spreadsheets, presentations, and other ad hoc reports. An integrated management information system that provides a common operating picture of all U.S

  7. Creating Communities of Learning. Schools and Smart Growth in New Jersey.

    ERIC Educational Resources Information Center

    Bird, Kathleen, Ed.

    This paper discusses New Jersey's unprecedented $12.3 billion school construction and reconstruction project, launched in 2000, as an opportunity to reconstruct the state's communities, enhancing quality of life and reducing sprawl. It aims to stimulate a statewide conversation about the opportunity to integrate the design of the next generation…

  8. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

Code can be generated manually or with code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-autonomous code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  9. Performance tuning of N-body codes on modern microprocessors: I. Direct integration with a hermite scheme on x86_64 architecture

    NASA Astrophysics Data System (ADS)

    Nitadori, Keigo; Makino, Junichiro; Hut, Piet

    2006-12-01

The main performance bottleneck of gravitational N-body codes is the force calculation between two particles. We have succeeded in speeding up this pair-wise force calculation by factors between 2 and 10, depending on the code and the processor on which the code is run. These speed-ups were obtained by writing highly fine-tuned code for x86_64 microprocessors. Any existing N-body code, running on these chips, can easily incorporate our assembly code programs. In the current paper, we present an outline of our overall approach, which we illustrate with one specific example: the use of a Hermite scheme for a direct N² type integration on a single 2.0 GHz Athlon 64 processor, for which we obtain an effective performance of 4.05 Gflops, for double-precision accuracy. In subsequent papers, we will discuss other variations, including the combinations of N log N codes, single-precision implementations, and performance on other microprocessors.
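As a concrete (and hedged) illustration of the pair-wise force kernel being tuned here, a plain NumPy sketch of the direct N² acceleration and jerk evaluation needed by a fourth-order Hermite scheme might look as follows; the function name, softening parameter, and G = 1 units are our own assumptions, not the paper's tuned assembly code:

```python
import numpy as np

def acc_jerk(pos, vel, mass, eps2=1e-12):
    """Direct N^2 pairwise gravitational acceleration and jerk,
    as required by a 4th-order Hermite integrator (G = 1 units).
    Illustrative NumPy sketch, not the paper's assembly kernels."""
    n = len(mass)
    acc = np.zeros((n, 3))
    jerk = np.zeros((n, 3))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dr = pos[j] - pos[i]           # relative position
            dv = vel[j] - vel[i]           # relative velocity
            r2 = dr @ dr + eps2            # softened squared distance
            r3 = r2 * np.sqrt(r2)
            rv = dr @ dv                   # r . v term entering the jerk
            acc[i] += mass[j] * dr / r3
            jerk[i] += mass[j] * (dv / r3 - 3.0 * rv * dr / (r3 * r2))
    return acc, jerk
```

The O(N²) double loop over pairs is exactly the bottleneck the paper's hand-tuned x86_64 kernels accelerate.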

  10. Refocusing-range and image-quality enhanced optical reconstruction of 3-D objects from integral images using a principal periodic δ-function array

    NASA Astrophysics Data System (ADS)

    Ai, Lingyu; Kim, Eun-Soo

    2018-03-01

We propose a method for refocusing-range and image-quality enhanced optical reconstruction of three-dimensional (3-D) objects from integral images using only a 3 × 3 periodic δ-function array (PDFA), which is called a principal PDFA (P-PDFA). By directly convolving the elemental image array (EIA) captured from 3-D objects with the P-PDFAs whose spatial periods correspond to each object's depth, a set of spatially-filtered EIAs (SF-EIAs) is extracted, from which 3-D objects can be reconstructed so as to be refocused at their real depths. Because convolution operations are performed directly on each of the minimum 3 × 3 EIs of the picked-up EIA, the capturing and refocused-depth ranges of 3-D objects can be greatly enhanced, and 3-D objects with much improved image quality can be reconstructed without any preprocessing operations. Through ray-optical analysis and optical experiments with actual 3-D objects, the feasibility of the proposed method has been confirmed.
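Because a P-PDFA kernel consists only of δ-peaks, the convolution described above reduces to summing periodically shifted copies of the elemental image array. A minimal sketch of this spatial-filtering step, assuming periodic boundaries and with our own function name and normalization:

```python
import numpy as np

def pdfa_filter(eia, period):
    """Spatial filtering of an elemental image array (EIA) by a
    3x3 periodic delta-function array: convolution with delta peaks
    spaced `period` pixels apart reduces to summing shifted copies
    of the image. Wrap-around (periodic) boundaries are assumed."""
    out = np.zeros_like(eia, dtype=float)
    for di in (-period, 0, period):        # 3x3 grid of delta peaks
        for dj in (-period, 0, period):
            out += np.roll(np.roll(eia, di, axis=0), dj, axis=1)
    return out / 9.0                       # normalize by the number of peaks
```

Choosing `period` to match an object's depth-dependent disparity is what selects which depth plane is reinforced.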

  11. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging.
The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Access to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments.
The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  12. Quantitative reconstructions in multi-modal photoacoustic and optical coherence tomography imaging

    NASA Astrophysics Data System (ADS)

    Elbau, P.; Mindrinos, L.; Scherzer, O.

    2018-01-01

In this paper we perform quantitative reconstruction of the electric susceptibility and the Grüneisen parameter of a non-magnetic linear dielectric medium using measurements from a multi-modal photoacoustic and optical coherence tomography system. We consider the mathematical model presented in Elbau et al (2015 Handbook of Mathematical Methods in Imaging ed O Scherzer (New York: Springer) pp 1169-204), where a Fredholm integral equation of the first kind for the Grüneisen parameter was derived. For the numerical solution of the integral equation we consider a Galerkin-type method.
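For context, a Fredholm integral equation of the first kind, ∫ₐᵇ K(x, y) f(y) dy = g(x), is typically discretized into a linear system and regularized, since first-kind problems are ill-posed. The sketch below is a generic textbook scheme (piecewise-constant basis with midpoint quadrature, which here coincides with collocation, plus Tikhonov regularization), not the authors' specific Galerkin method:

```python
import numpy as np

def solve_fredholm_first_kind(kernel, g, a, b, n, lam=1e-10):
    """Discretize int_a^b K(x, y) f(y) dy = g(x) on n midpoint cells
    and solve the regularized normal equations. Generic sketch."""
    h = (b - a) / n
    x = a + (np.arange(n) + 0.5) * h        # cell midpoints
    A = kernel(x[:, None], x[None, :]) * h  # quadrature-weighted kernel matrix
    rhs = g(x)
    # Tikhonov regularization stabilizes the ill-posed inversion
    f = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ rhs)
    return x, f
```

The regularization parameter `lam` trades data fit against stability; first-kind kernel matrices are typically severely ill-conditioned.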

  13. Multifacet structure of observed reconstructed integral images.

    PubMed

    Martínez-Corral, Manuel; Javidi, Bahram; Martínez-Cuenca, Raúl; Saavedra, Genaro

    2005-04-01

Three-dimensional images generated by an integral imaging system suffer from degradations in the form of a grid of multiple facets. This multifacet structure breaks the continuity of the observed image and therefore reduces its visual quality. We perform an analysis of this effect and present guidelines for the design of lenslet imaging parameters to optimize viewing conditions with respect to the multifacet degradation. We consider the optimization of the system in terms of field of view, observer position and pupil function, lenslet parameters, and type of reconstruction. Numerical tests are presented to verify the theoretical analysis.

  14. Building a model for disease classification integration in oncology, an approach based on the national cancer institute thesaurus.

    PubMed

    Jouhet, Vianney; Mougin, Fleur; Bréchat, Bérénice; Thiessard, Frantz

    2017-02-07

Identifying incident cancer cases within a population remains essential for scientific research in oncology. Data produced within electronic health records can be useful for this purpose. Due to the multiplicity of providers, heterogeneous terminologies such as ICD-10 and ICD-O-3 are used for recording oncology diagnoses. To enable disease identification based on these diagnoses, there is a need for integrating disease classifications in oncology. Our aim was to build a model integrating concepts involved in two disease classifications, namely ICD-10 (diagnosis) and ICD-O-3 (topography and morphology), despite their structural heterogeneity. Based on the NCIt, a "derivative" model for linking diagnosis and topography-morphology combinations was defined and built. ICD-O-3 and ICD-10 codes were then used to instantiate classes of the "derivative" model. Links between terminologies obtained through the model were then compared to mappings provided by the Surveillance, Epidemiology, and End Results (SEER) program. The model integrated 42% of neoplasm ICD-10 codes (excluding metastasis), 98% of ICD-O-3 morphology codes (excluding metastasis) and 68% of ICD-O-3 topography codes. For every code instantiating at least one class in the "derivative" model, comparison with SEER mappings reveals that all mappings were available in the model as links between the corresponding codes. We have proposed a method to automatically build a model for integrating ICD-10 and ICD-O-3 based on the NCIt. The resulting "derivative" model is a machine-understandable resource that enables an integrated view of these heterogeneous terminologies. The NCIt structure and the available relationships can help to bridge disease classifications while taking into account their structural and granular heterogeneities.
However, (i) inconsistencies exist within the NCIt, leading to misclassifications in the "derivative" model, and (ii) the "derivative" model integrates only part of ICD-10 and ICD-O-3. The NCIt alone is not sufficient for integration purposes, and further work based on other termino-ontological resources is needed to enrich the model and avoid the identified inconsistencies.

  15. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    PubMed

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
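The shift-and-stack reconstruction plus variance test described above can be sketched serially in NumPy (the paper parallelizes both steps on a GPU; the function name, wrap-around shifting, and variance tolerance are our own illustrative choices):

```python
import numpy as np

def reconstruct_depth(eis, shifts, var_tol=1e-3):
    """Computational integral-imaging reconstruction at one depth:
    each elemental image is shifted by its lookup-table (dy, dx)
    offset and the stack is averaged; the per-pixel variance across
    the stack then separates focus points (low variance, on a real
    surface) from off-focus points (high variance)."""
    stack = np.stack([np.roll(np.roll(ei, dy, axis=0), dx, axis=1)
                      for ei, (dy, dx) in zip(eis, shifts)])
    depth_image = stack.mean(axis=0)           # back-propagated depth image
    focus_mask = stack.var(axis=0) <= var_tol  # True where the pixel is in focus
    return depth_image, focus_mask
```

In the paper's setting the (dy, dx) offsets come from a precomputed lookup table per depth plane, and one such reconstruction runs per depth in parallel.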

  16. A return mapping algorithm for isotropic and anisotropic plasticity models using a line search method

    DOE PAGES

    Scherzinger, William M.

    2016-05-01

The numerical integration of constitutive models in computational solid mechanics codes allows for the solution of boundary value problems involving complex material behavior. Metal plasticity models, in particular, have been instrumental in the development of these codes. Most plasticity models implemented in computational codes use an isotropic von Mises yield surface. The von Mises, or J2, yield surface has a simple predictor-corrector algorithm - the radial return algorithm - to integrate the model.
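The radial return algorithm referred to above is a standard predictor-corrector: an elastic trial stress is computed, and if it lies outside the yield surface it is scaled back radially onto it. A textbook sketch for von Mises (J2) plasticity with linear isotropic hardening (variable names are ours, not the paper's):

```python
import numpy as np

def radial_return(s_trial_dev, sigma_y, mu, H=0.0):
    """One step of the radial return algorithm for von Mises (J2)
    plasticity. s_trial_dev: deviatoric trial stress (3x3 array),
    sigma_y: current yield stress, mu: shear modulus, H: linear
    isotropic hardening modulus. Returns the corrected deviatoric
    stress and the plastic multiplier increment."""
    s_norm = np.linalg.norm(s_trial_dev)             # ||s_trial|| (Frobenius)
    f_trial = s_norm - np.sqrt(2.0 / 3.0) * sigma_y  # trial yield function
    if f_trial <= 0.0:
        return s_trial_dev, 0.0                      # elastic step: no correction
    dgamma = f_trial / (2.0 * mu + 2.0 * H / 3.0)    # consistency condition
    n = s_trial_dev / s_norm                         # radial return direction
    s_new = s_trial_dev - 2.0 * mu * dgamma * n      # scale back onto yield surface
    return s_new, dgamma
```

After a plastic step the corrected stress satisfies ||s|| = √(2/3)·σ_y exactly (for H = 0), which is what makes the scheme so robust.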

  17. A novel partial volume effects correction technique integrating deconvolution associated with denoising within an iterative PET image reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merlin, Thibaut, E-mail: thibaut.merlin@telecom-bretagne.eu; Visvikis, Dimitris; Fernandez, Philippe

    2015-02-15

Purpose: Partial volume effect (PVE) plays an important role in both qualitative and quantitative PET image accuracy, especially for small structures. A previously proposed voxelwise PVE correction method applied on PET reconstructed images involves the use of Lucy–Richardson deconvolution incorporating wavelet-based denoising to limit the associated propagation of noise. The aim of this study is to incorporate the deconvolution, coupled with the denoising step, directly inside the iterative reconstruction process to further improve PVE correction. Methods: The list-mode ordered subset expectation maximization (OSEM) algorithm has been modified accordingly, with the Lucy–Richardson deconvolution algorithm applied to the current estimation of the image at each reconstruction iteration. Acquisitions of the NEMA NU2-2001 IQ phantom were performed on a GE DRX PET/CT system to study the impact of incorporating the deconvolution inside the reconstruction [with and without the point spread function (PSF) model] in comparison to its application postreconstruction and to standard iterative reconstruction incorporating the PSF model. The impact of the denoising step was also evaluated. Images were semiquantitatively assessed by studying the trade-off between the intensity recovery and the noise level in the background, estimated as relative standard deviation. Qualitative assessments of the developed methods were additionally performed on clinical cases. Results: Incorporating the deconvolution without denoising within the reconstruction achieved superior intensity recovery in comparison to both standard OSEM reconstruction integrating a PSF model and application of the deconvolution algorithm in a postreconstruction process. The addition of the denoising step made it possible to limit the SNR degradation while preserving the intensity recovery.
Conclusions: This study demonstrates the feasibility of incorporating the Lucy–Richardson deconvolution, combined with wavelet-based denoising, in the reconstruction process to better correct for PVE. Future work includes further evaluation of the proposed method on clinical datasets and the use of improved PSF models.

  18. Geometry Calibration of the SVT in the CLAS12 Detector

    NASA Astrophysics Data System (ADS)

    Davies, Peter; Gilfoyle, Gerard

    2016-09-01

A new detector called CLAS12 is being built in Hall B as part of the 12 GeV Upgrade at Jefferson Lab to learn how quarks and gluons form nuclei. The Silicon Vertex Tracker (SVT) is one of the subsystems designed to track the trajectories of charged particles as they are emitted from the target at large angles. The sensors of the SVT consist of long, narrow strips embedded in a silicon substrate. There are 256 strips in a sensor, with stereo angles ranging from 0° to 3°. The location of the strips must be known to a precision of a few microns in order to accurately reconstruct particle tracks with the required resolution of 50-60 microns. Our first step toward achieving this resolution was to validate the nominal geometry relative to the design specification. We also resolved differences between the design and the CLAS12 Geant4-based simulation code GEMC. We developed software to apply alignment shifts to the nominal design geometry from a survey of fiducial points on the structure that supports each sensor. The final geometry will be generated by a common package written in Java to ensure consistency between the simulation and reconstruction codes. The code will be tested by studying the impact of known distortions of the nominal geometry in simulation. Work supported by the University of Richmond and the US Department of Energy.

  19. Grid point extraction and coding for structured light system

    NASA Astrophysics Data System (ADS)

    Song, Zhan; Chung, Ronald

    2011-09-01

A structured light system simplifies three-dimensional reconstruction by illuminating a specially designed pattern onto the target object, thereby generating a distinct texture on it for imaging and further processing. Success of the system hinges upon what features are to be coded in the projected pattern, extracted in the captured image, and matched between the projector's display panel and the camera's image plane. The codes have to be such that they are largely preserved in the image data upon illumination from the projector, reflection from the target object, and projective distortion in the imaging process. The features also need to be reliably extracted in the image domain. In this article, a two-dimensional pseudorandom pattern consisting of rhombic color elements is proposed, and the grid points between the pattern elements are chosen as the feature points. We describe how a type classification of the grid points plus the pseudorandomness of the projected pattern can equip each grid point with a unique label that is preserved in the captured image. We also present a grid point detector that extracts the grid points without the need of segmenting the pattern elements, and that localizes the grid points with subpixel accuracy. Extensive experiments are presented to illustrate that, with the proposed pattern feature definition and feature detector, more feature points can be reconstructed with higher accuracy in comparison with existing pseudorandomly encoded structured light systems.

  20. Molecular Evolution of Aminoacyl tRNA Synthetase Proteins in the Early History of Life

    NASA Astrophysics Data System (ADS)

    Fournier, Gregory P.; Andam, Cheryl P.; Alm, Eric J.; Gogarten, J. Peter

    2011-12-01

    Aminoacyl-tRNA synthetases (aaRS) consist of several families of functionally conserved proteins essential for translation and protein synthesis. Like nearly all components of the translation machinery, most aaRS families are universally distributed across cellular life, being inherited from the time of the Last Universal Common Ancestor (LUCA). However, unlike the rest of the translation machinery, aaRS have undergone numerous ancient horizontal gene transfers, with several independent events detected between domains, and some possibly involving lineages diverging before the time of LUCA. These transfers reveal the complexity of molecular evolution at this early time, and the chimeric nature of genomes within cells that gave rise to the major domains. Additionally, given the role of these protein families in defining the amino acids used for protein synthesis, sequence reconstruction of their pre-LUCA ancestors can reveal the evolutionary processes at work in the origin of the genetic code. In particular, sequence reconstructions of the paralog ancestors of isoleucyl- and valyl- RS provide strong empirical evidence that at least for this divergence, the genetic code did not co-evolve with the aaRSs; rather, both amino acids were already part of the genetic code before their cognate aaRSs diverged from their common ancestor. The implications of this observation for the early evolution of RNA-directed protein biosynthesis are discussed.

  1. ARC integration into the NEAMS Workbench

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stauff, N.; Gaughan, N.; Kim, T.

    2017-01-01

One of the objectives of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Integration Product Line (IPL) is to facilitate the deployment of the high-fidelity codes developed within the program. The Workbench initiative was launched in FY-2017 by the IPL to facilitate the transition from conventional tools to high-fidelity tools. The Workbench provides a common user interface for model creation, real-time validation, execution, output processing, and visualization for integrated codes.

  2. An ECG signals compression method and its validation using NNs.

    PubMed

    Fira, Catalina Monica; Goras, Liviu

    2008-04-01

This paper presents a new algorithm for electrocardiogram (ECG) signal compression based on local extreme extraction, adaptive hysteretic filtering and Lempel-Ziv-Welch (LZW) coding. The algorithm has been verified using eight of the most frequent normal and pathological types of cardiac beats and a multi-layer perceptron (MLP) neural network trained with original cardiac patterns and tested with reconstructed ones. Aspects regarding the possibility of applying principal component analysis (PCA) to cardiac pattern classification have been investigated as well. A new compression measure called "quality score," which takes into account both the reconstruction errors and the compression ratio, is proposed.
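The "quality score" combines compression ratio and reconstruction error; one plausible form, shown here only as a hedged sketch (the authors' exact definition may differ), divides the compression ratio by the percent RMS difference (PRD) of the reconstruction:

```python
import numpy as np

def quality_score(original, reconstructed, n_compressed_bits, n_original_bits):
    """Hypothetical combined compression metric in the spirit of the
    paper's "quality score": compression ratio divided by the percent
    RMS difference (PRD). Illustrative only, not the authors' formula."""
    cr = n_original_bits / n_compressed_bits                  # compression ratio
    prd = 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                          / np.sum(original ** 2))            # percent RMS difference
    return cr / prd
```

A higher score rewards either stronger compression or a more faithful reconstruction.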

  3. Adjustable lossless image compression based on a natural splitting of an image into drawing, shading, and fine-grained components

    NASA Technical Reports Server (NTRS)

    Novik, Dmitry A.; Tilton, James C.

    1993-01-01

    The compression, or efficient coding, of single band or multispectral still images is becoming an increasingly important topic. While lossy compression approaches can produce reconstructions that are visually close to the original, many scientific and engineering applications require exact (lossless) reconstructions. However, the most popular and efficient lossless compression techniques do not fully exploit the two-dimensional structural links existing in the image data. We describe here a general approach to lossless data compression that effectively exploits two-dimensional structural links of any length. After describing in detail two main variants on this scheme, we discuss experimental results.

  4. Tri-Lab Co-Design Milestone: In-Depth Performance Portability Analysis of Improved Integrated Codes on Advanced Architecture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert J.; Hammond, Simon David; Richards, David

    2017-09-01

This milestone is a tri-lab deliverable supporting ongoing Co-Design efforts impacting applications in the Integrated Codes (IC) and Advanced Technology Development and Mitigation (ATDM) program elements. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on migrating the knowledge gained and/or code revisions back into production applications.

  5. Computational tools and lattice design for the PEP-II B-Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Y.; Irwin, J.; Nosochkov, Y.

    1997-02-01

Several accelerator codes were used to design the PEP-II lattices, ranging from matrix-based codes, such as MAD and DIMAD, to symplectic-integrator codes, such as TRACY and DESPOT. In addition to element-by-element tracking, we constructed maps to determine aberration strengths. Furthermore, we have developed a fast and reliable method (nPB tracking) to track particles with a one-turn map. This new technique allows us to evaluate performance of the lattices on the entire tune-plane. Recently, we designed and implemented an object-oriented code in C++ called LEGO which integrates and expands upon TRACY and DESPOT. © 1997 American Institute of Physics.

  7. RAY-RAMSES: a code for ray tracing on the fly in N-body simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barreira, Alexandre; Llinares, Claudio; Bose, Sownak

    2016-05-01

We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
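The cell-by-cell line-of-sight integration can be pictured with a schematic accumulator: each cell a ray crosses contributes its field value times the path length through the cell times a distance-dependent weight (e.g. a lensing kernel). This is a deliberately simplified sketch with our own names, not the RAY-RAMSES routines:

```python
import numpy as np

def integrate_los(cell_values, cell_lengths, weight):
    """Accumulate an integrated observable (e.g. lensing convergence)
    along one line of sight, cell by cell: field value in the cell,
    times the ray's path length through the cell, times a weight
    evaluated at the cell's midpoint distance. Schematic sketch."""
    cell_values = np.asarray(cell_values, dtype=float)
    cell_lengths = np.asarray(cell_lengths, dtype=float)
    chi = np.cumsum(cell_lengths) - 0.5 * cell_lengths  # midpoint distances
    return float(np.sum(weight(chi) * cell_values * cell_lengths))
```

Doing this accumulation inside the simulation loop, rather than on stored snapshots, is what lets the method exploit the full time resolution of the N-body run.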

  8. Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic

    NASA Astrophysics Data System (ADS)

    Li, Yan; Dai, Shifang; Wu, Weiwei

    2016-12-01

Recently, with the soaring of traffic among optical network units (ONUs), network coding (NC) is becoming an appealing technique for improving the performance of passive optical networks (PONs) carrying such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established; such passive, uncertain waiting severely limits the benefit of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
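The coding condition mentioned above is the classic inter-flow network coding opportunity: when the OLT holds a packet from ONU A destined to ONU B and one from B destined to A, it can broadcast their XOR in a single transmission, and each ONU decodes using the packet it originated. A minimal sketch of that primitive (not the paper's scheduling logic):

```python
def xor_code(pkt_a, pkt_b):
    """XOR two equal-length packets byte by byte; XOR-ing the coded
    packet with either original recovers the other one."""
    return bytes(x ^ y for x, y in zip(pkt_a, pkt_b))
```

For example, an ONU that sent `pkt_b` recovers `pkt_a` via `xor_code(coded, pkt_b)`; the scheduling problem the paper addresses is getting such pairs to the OLT at the same time.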

  9. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  10. Ultra-high resolution coded wavefront sensor.

    PubMed

    Wang, Congli; Dun, Xiong; Fu, Qiang; Heidrich, Wolfgang

    2017-06-12

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  11. Photometric Mapping of Two Kepler Eclipsing Binaries: KIC11560447 and KIC8868650

    NASA Astrophysics Data System (ADS)

    Senavci, Hakan Volkan; Özavci, I.; Isik, E.; Hussain, G. A. J.; O'Neal, D. O.; Yilmaz, M.; Selam, S. O.

    2018-04-01

    We present the surface maps of two eclipsing binary systems, KIC11560447 and KIC8868650, using the Kepler light curves covering approximately 4 years. We use the code DoTS, which is based on the maximum entropy method, to reconstruct the surface maps. We also perform numerical tests of DoTS to check the code's ability to track the phase migration of spot clusters. The resulting latitudinally averaged maps of KIC11560447 show that spots drift towards increasing orbital longitudes, while spots on KIC8868650 show an overall drift towards decreasing latitudes.

  12. Adaptive Integration of the Compressed Algorithm of CS and NPC for the ECG Signal Compressed Algorithm in VLSI Implementation

    PubMed Central

    Tseng, Yun-Hua; Lu, Chih-Wen

    2017-01-01

    Compressed sensing (CS) is a promising approach to the compression and reconstruction of electrocardiogram (ECG) signals. It has been shown that following reconstruction, most of the changes between the original and reconstructed signals are distributed in the Q, R, and S waves (QRS) region. Furthermore, any increase in the compression ratio tends to increase the magnitude of the change. This paper presents a novel approach integrating the near-precise compressed (NPC) and CS algorithms. The simulation results show notable improvements in signal-to-noise ratio (SNR) and compression ratio (CR). The efficacy of this approach was verified by fabricating a highly efficient low-cost chip using the Taiwan Semiconductor Manufacturing Company’s (TSMC) 0.18-μm Complementary Metal-Oxide-Semiconductor (CMOS) technology. The proposed core has an operating frequency of 60 MHz and gate counts of 2.69 K. PMID:28991216
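As a generic illustration of the CS reconstruction step (not the paper's NPC+CS pipeline), iterative soft-thresholding (ISTA) can recover a sparse signal from a small number of random linear measurements. All sizes and parameters below are arbitrary choices for the sketch.

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=200):
    """Iterative soft-thresholding: recover sparse x from y = A @ x."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L           # gradient step on 0.5*||Ax - y||^2
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft shrinkage
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)   # 40 measurements of a length-100 signal
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -0.8, 0.6]             # 3-sparse "signal"
x_hat = ista(A, A @ x_true, lam=0.02, n_iter=500)
```

The recovered `x_hat` concentrates on the true support; in a real ECG pipeline the signal is sparse in a transform domain rather than directly.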

  13. Integrated Sachs-Wolfe map reconstruction in the presence of systematic errors

    NASA Astrophysics Data System (ADS)

    Weaverdyck, Noah; Muir, Jessica; Huterer, Dragan

    2018-02-01

    The decay of gravitational potentials in the presence of dark energy leads to an additional, late-time contribution to anisotropies in the cosmic microwave background (CMB) at large angular scales. The imprint of this so-called integrated Sachs-Wolfe (ISW) effect to the CMB angular power spectrum has been detected and studied in detail, but reconstructing its spatial contributions to the CMB map, which would offer the tantalizing possibility of separating the early- from the late-time contributions to CMB temperature fluctuations, is more challenging. Here, we study the technique for reconstructing the ISW map based on information from galaxy surveys and focus in particular on how its accuracy is impacted by the presence of photometric calibration errors in input galaxy maps, which were previously found to be a dominant contaminant for ISW signal estimation. We find that both including tomographic information from a single survey and using data from multiple, complementary galaxy surveys improve the reconstruction by mitigating the impact of spurious power contributions from calibration errors. A high-fidelity reconstruction further requires one to account for the contribution of calibration errors to the observed galaxy power spectrum in the model used to construct the ISW estimator. We find that if the photometric calibration errors in galaxy surveys can be independently controlled at the level required to obtain unbiased dark energy constraints, then it is possible to reconstruct ISW maps with excellent accuracy using a combination of maps from two galaxy surveys with properties similar to Euclid and SPHEREx.

  14. Secure fixation of femoral bone plug with a suspensory button in anatomical anterior cruciate ligament reconstruction with bone-patellar tendon-bone graft

    PubMed Central

    TAKETOMI, SHUJI; INUI, HIROSHI; NAKAMURA, KENSUKE; YAMAGAMI, RYOTA; TAHARA, KEITARO; SANADA, TAKAKI; MASUDA, HIRONARI; TANAKA, SAKAE; NAKAGAWA, TAKUMI

    2015-01-01

    Purpose: The efficacy and safety of using a suspensory button for femoral fixation in anatomical anterior cruciate ligament (ACL) reconstruction with bone-patellar tendon-bone (BPTB) graft have not been established. The purpose of the current study was to evaluate bone plug integration onto the femoral socket and migration of the bone plug and the EndoButton (EB) (Smith & Nephew, Andover, MA, USA) after rectangular tunnel ACL reconstruction with BPTB autograft. Methods: Thirty-four patients who underwent anatomical rectangular ACL reconstruction with BPTB graft using the EB for femoral fixation, and in whom three-dimensional (3D) computed tomography (CT) was performed one week and one year after surgery, were included in this study. Bone plug integration onto the femoral socket, bone plug migration, soft tissue interposition, EB migration and EB rotation were evaluated on 3D CT. The clinical outcome was also assessed and correlated with the imaging outcomes. Results: The bone plug was integrated onto the femoral socket in all cases. The incidence of bone plug migration, soft tissue interposition, EB migration and EB rotation was 15, 15, 9 and 56%, respectively. No significant association was observed between the imaging outcomes. The postoperative mean Lysholm score was 97.1 ± 5.0 points. The postoperative side-to-side difference, evaluated using a KT-2000 arthrometer, averaged 0.5 ± 1.3 mm. There were no complications associated with EB use. Imaging outcomes did not affect the postoperative KT side-to-side difference. Conclusions: The EB is considered a reliable device for femoral fixation in anatomical rectangular tunnel ACL reconstruction with BPTB autograft. Level of evidence: Level IV, therapeutic case series. PMID:26889465

  15. Innovative Moments in Grief Therapy: Reconstructing Meaning Following Perinatal Death

    ERIC Educational Resources Information Center

    Alves, Daniela; Mendes, Ines; Goncalves, Miguel M.; Neimeyer, Robert A.

    2012-01-01

    This article presents an intensive analysis of a good outcome case of constructivist grief therapy with a bereaved mother, using the Innovative Moments Coding System (IMCS). Inspired by M. White and D. Epston's narrative therapy, the IMCS conceptualizes therapeutic change as resulting from the elaboration and expansion of unique outcomes (or as we…

  16. Cognitive Pathways: Analysis of Students' Written Texts for Science Understanding

    ERIC Educational Resources Information Center

    Grimberg, Bruna Irene; Hand, Brian

    2009-01-01

    The purpose of this study was to reconstruct writers' reasoning process as reflected in their written texts. The codes resulting from the text analysis were related to cognitive operations, ranging from simple to more sophisticated ones. The sequence of the cognitive operations as the text unfolded represents the writer's cognitive pathway at the…

  17. Repertoires, Characters and Scenes: Sociolinguistic Difference in Turkish-German Comedy

    ERIC Educational Resources Information Center

    Androutsopoulos, Jannis

    2012-01-01

    This paper examines representations of sociolinguistic difference in a German "ethnic comedy" as a means to contribute to a framework for the sociolinguistic study of film. Three levels of analysis of sociolinguistic difference in film are distinguished: repertoire analysis reconstructs the entirety of codes used in a film and their…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao

    Sparx, a new environment for Cryo-EM image processing. Keywords: Cryo-EM, single-particle reconstruction, principal component analysis. Hardware requirements: PC, Mac, supercomputer, mainframe, workstation (multiplatform). Software requirements: Unix operating system; C++ compiler. File types: source code, object library, executable modules, compilation instructions, sample problem input data. Location/transmission: http://sparx-em.org; user manual and paper: http://sparx-em.org

  19. PARALLELISATION OF THE MODEL-BASED ITERATIVE RECONSTRUCTION ALGORITHM DIRA.

    PubMed

    Örtenberg, A; Magnusson, M; Sandborg, M; Alm Carlsson, G; Malusek, A

    2016-06-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelisation of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelisation of the model-based iterative reconstruction algorithm DIRA with the aim of significantly shortening the code's execution time. Selected routines were parallelised using the OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelisation of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelisation with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause of this is explained.
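The pattern the authors exploit with OpenMP — independent loop iterations distributed over a pool of workers — can be sketched in Python as a language-neutral analogue (DIRA itself is MATLAB/C with OpenMP/OpenCL; the function names below are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def process_row(row):
    """Stand-in for one independent iteration of a reconstruction loop."""
    return sum(v * v for v in row)

def parallel_map(rows, workers=4):
    # Counterpart of an OpenMP 'parallel for': independent iterations are
    # distributed across a worker pool; map() preserves result order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_row, rows))

result = parallel_map([[1, 2], [3, 4], [5, 6]])  # [5, 25, 61]
```

The key precondition is the same in both settings: iterations must not share mutable state, otherwise the loop cannot be parallelised without synchronisation.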

  20. MELCOR simulations of the severe accident at Fukushima Daiichi Unit 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey; Gauntt, Randall; Kalinich, Donald

    In response to the accident at the Fukushima Daiichi nuclear power station in Japan, the U.S. Nuclear Regulatory Commission and U.S. Department of Energy agreed to jointly sponsor an accident reconstruction study as a means of assessing the severe accident modeling capability of the MELCOR code. Objectives of the project included reconstruction of the accident progressions using computer models and accident data, and validation of the MELCOR code and the Fukushima models against plant data. A MELCOR 2.1 model of the Fukushima Daiichi Unit 3 reactor is developed using plant-specific information and accident-specific boundary conditions, which involve considerable uncertainty due to the inherent nature of severe accidents. Publicly available thermal-hydraulic data and radioactivity release estimates have evolved significantly since the accidents. Such data are expected to continually change as the reactors are decommissioned and more measurements are performed. As a result, the MELCOR simulations in this work primarily use boundary conditions that are based on available plant data as of May 2012.

  1. A comparison of semiglobal and local dense matching algorithms for surface reconstruction

    NASA Astrophysics Data System (ADS)

    Dall'Asta, E.; Roncella, R.

    2014-06-01

    Encouraged by the growing interest in automatic 3D image-based reconstruction, the development and improvement of robust stereo matching techniques has been one of the most investigated research topics of recent years in photogrammetry and computer vision. The paper is focused on the comparison of several stereo matching algorithms (local and global) which are very popular in both photogrammetry and computer vision. In particular, the Semi-Global Matching (SGM) algorithm, which performs pixel-wise matching and relies on the application of consistency constraints during the matching cost aggregation, will be discussed. The results of tests performed on real and simulated stereo image datasets, evaluating in particular the accuracy of the obtained digital surface models, will be presented. Several algorithms and different implementations are considered in the comparison, using free software codes such as MICMAC and OpenCV, commercial software (e.g. Agisoft PhotoScan) and proprietary codes implementing Least Squares and Semi-Global Matching algorithms. The comparisons will also consider the completeness and the level of detail within fine structures, and the reliability and repeatability of the obtainable data.
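The SGM cost-aggregation step mentioned above can be sketched for a single path direction. This is the textbook recursion (matching cost plus the cheapest penalised transition from the previous pixel, rebased by the previous minimum); the penalties P1, P2 and the cost volume are illustrative, not taken from any of the compared implementations.

```python
import numpy as np

def aggregate_path(cost, P1=1.0, P2=4.0):
    """SGM-style cost aggregation along one scanline.
    cost: (n_pixels, n_disparities) array of per-pixel matching costs."""
    n, d = cost.shape
    L = np.empty_like(cost, dtype=float)
    L[0] = cost[0]
    for i in range(1, n):
        prev = L[i - 1]
        m = prev.min()
        same = prev                                          # same disparity: no penalty
        up = np.concatenate(([np.inf], prev[:-1])) + P1      # disparity change of +/-1: P1
        down = np.concatenate((prev[1:], [np.inf])) + P1
        jump = np.full(d, m + P2)                            # larger change: P2
        L[i] = cost[i] + np.minimum.reduce([same, up, down, jump]) - m
    return L

# Toy cost volume whose true disparity is index 1 at every pixel:
cost = np.full((5, 3), 10.0)
cost[:, 1] = 0.0
L = aggregate_path(cost)
```

In full SGM these path costs are summed over 8 or 16 directions before the winning disparity is selected per pixel.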

  2. MELCOR simulations of the severe accident at Fukushima Daiichi Unit 3

    DOE PAGES

    Cardoni, Jeffrey; Gauntt, Randall; Kalinich, Donald; ...

    2014-05-01

    In response to the accident at the Fukushima Daiichi nuclear power station in Japan, the U.S. Nuclear Regulatory Commission and U.S. Department of Energy agreed to jointly sponsor an accident reconstruction study as a means of assessing the severe accident modeling capability of the MELCOR code. Objectives of the project included reconstruction of the accident progressions using computer models and accident data, and validation of the MELCOR code and the Fukushima models against plant data. A MELCOR 2.1 model of the Fukushima Daiichi Unit 3 reactor is developed using plant-specific information and accident-specific boundary conditions, which involve considerable uncertainty due to the inherent nature of severe accidents. Publicly available thermal-hydraulic data and radioactivity release estimates have evolved significantly since the accidents. Such data are expected to continually change as the reactors are decommissioned and more measurements are performed. As a result, the MELCOR simulations in this work primarily use boundary conditions that are based on available plant data as of May 2012.

  3. Sparse coding joint decision rule for ear print recognition

    NASA Astrophysics Data System (ADS)

    Guermoui, Mawloud; Melaab, Djamel; Mekhalfi, Mohamed Lamine

    2016-09-01

    Human ear recognition has been promoted as a promising biometric over the past few years. With respect to other modalities, such as the face and iris, that have undergone significant investigation in the literature, the ear pattern is still relatively under-explored. We put forth a sparse-coding-induced decision rule for ear recognition. It jointly involves the reconstruction residuals and the respective reconstruction coefficients pertaining to the input features (co-occurrence of adjacent local binary patterns) for a further fusion. We show in particular that combining both components (i.e., the residuals as well as the coefficients) yields better outcomes than when either of them is used singly. The proposed method has been evaluated on two benchmark datasets, namely IITD1 (125 subjects) and IITD2 (221 subjects). The recognition rates of the suggested scheme amount to 99.5% and 98.95% on the two datasets, respectively, which suggests that our method compares favorably with reference state-of-the-art methodologies. Furthermore, the experiments show that the presented scheme is promisingly robust under large-scale occlusion scenarios.
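The fusion of reconstruction residuals with coefficient strengths can be sketched as follows. This is a simplified stand-in, not the paper's method: per-class least squares replaces true sparse coding, and the fusion is a plain weighted difference with a hypothetical weight `alpha`.

```python
import numpy as np

def joint_decision(y, class_dicts, alpha=0.5):
    """Pick the class whose dictionary both reconstructs y well (low residual)
    and attracts strong coefficients. class_dicts: one matrix per class."""
    scores = []
    for A in class_dicts:
        x, *_ = np.linalg.lstsq(A, y, rcond=None)   # stand-in for sparse coding
        residual = np.linalg.norm(y - A @ x)
        coef_energy = np.abs(x).sum()
        scores.append(residual - alpha * coef_energy)  # low residual, strong coefficients win
    return int(np.argmin(scores))

# Toy example: class 0 spans the first two axes, class 1 the third.
A0 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
A1 = np.array([[0.0], [0.0], [1.0]])
pred = joint_decision(np.array([1.0, 2.0, 0.0]), [A0, A1])  # class 0
```

The point of the joint rule is that residuals and coefficients carry complementary evidence, so the fused score is more discriminative than either alone.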

  4. Determination of the Core of a Minimal Bacterial Gene Set†

    PubMed Central

    Gil, Rosario; Silva, Francisco J.; Peretó, Juli; Moya, Andrés

    2004-01-01

    The availability of a large number of complete genome sequences raises the question of how many genes are essential for cellular life. Trying to reconstruct the core of the protein-coding gene set for a hypothetical minimal bacterial cell, we have performed a computational comparative analysis of eight bacterial genomes. Six of the analyzed genomes are very small due to a dramatic genome size reduction process, while the other two, corresponding to free-living relatives, are larger. The available data from several systematic experimental approaches to define all the essential genes in some completely sequenced bacterial genomes were also considered, and a reconstruction of a minimal metabolic machinery necessary to sustain life was carried out. The proposed minimal genome contains 206 protein-coding genes with all the genetic information necessary for self-maintenance and reproduction in the presence of a full complement of essential nutrients and in the absence of environmental stress. The main features of such a minimal gene set, as well as the metabolic functions that must be present in the hypothetical minimal cell, are discussed. PMID:15353568

  5. Environmental implications of the use of sulfidic back-bay sediments for dune reconstruction — Lessons learned post Hurricane Sandy

    USGS Publications Warehouse

    Plumlee, Geoffrey S.; Benzel, William M.; Hoefen, Todd M.; Hageman, Philip L.; Morman, Suzette A.; Reilly, Timothy J.; Adams, Monique; Berry, Cyrus J.; Fischer, Jeffrey; Fisher, Irene

    2016-01-01

    Some barrier-island dunes damaged or destroyed by Hurricane Sandy's storm surges in October 2012 have been reconstructed using sediments dredged from back bays. These sand-, clay-, and iron sulfide-rich sediments were used to make berm-like cores for the reconstructed dunes, which were then covered by beach sand. In November 2013, we sampled and analyzed partially weathered materials collected from the cores of reconstructed dunes. There are generally low levels of metal toxicants in the reconstructed dune materials. However, oxidation of reactive iron sulfides by percolating rainwater produces acid-sulfate pore waters, which evaporate during dry periods to produce efflorescent gypsum and sodium jarosite salts. The results suggest that the use of sulfidic sediments in dune reconstruction has both drawbacks (e.g., potential to generate acid runoff from dune cores following rainfall, enhanced corrosion of steel bulwarks) and possible benefits (e.g., efflorescent salts may enhance structural integrity).

  6. Image reconstruction for PET/CT scanners: past achievements and future challenges

    PubMed Central

    Tong, Shan; Alessio, Adam M; Kinahan, Paul E

    2011-01-01

    PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, TOF imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions. PMID:21339831

  7. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement-based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smart-phone. The scanned information is first decoded to obtain the location of a web-server which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene, and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidics channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in the presence of immobilized gold nanorods. In this paper, we demonstrate proof-of-concept detection using prototypes of QR encoded FEC biosensors.

  8. A phylogenetic Kalman filter for ancestral trait reconstruction using molecular data.

    PubMed

    Lartillot, Nicolas

    2014-02-15

    Correlation between life history or ecological traits and genomic features such as nucleotide or amino acid composition can be used for reconstructing the evolutionary history of the traits of interest along phylogenies. Thus far, however, such ancestral reconstructions have been done using simple linear regression approaches that do not account for phylogenetic inertia. These reconstructions could instead be seen as a genuine comparative regression problem, as formalized by classical generalized least-squares comparative methods, in which the trait of interest and the molecular predictor are represented as correlated Brownian characters coevolving along the phylogeny. Here, a Bayesian sampler is introduced, representing an alternative and more efficient algorithmic solution to this comparative regression problem, compared with currently existing generalized least-squares approaches. Technically, ancestral trait reconstruction based on a molecular predictor is shown to be formally equivalent to a phylogenetic Kalman filter problem, for which backward and forward recursions are developed and implemented in the context of a Markov chain Monte Carlo sampler. The comparative regression method results in more accurate reconstructions and a more faithful representation of uncertainty, compared with simple linear regression. Application to the reconstruction of the evolution of optimal growth temperature in Archaea, using GC composition in ribosomal RNA stems and amino acid composition of a sample of protein-coding genes, confirms previous findings, in particular, pointing to a hyperthermophilic ancestor for the kingdom. The program is freely available at www.phylobayes.org.

  9. Ensemble Kalman filter for the reconstruction of the Earth's mantle circulation

    NASA Astrophysics Data System (ADS)

    Bocher, Marie; Fournier, Alexandre; Coltice, Nicolas

    2018-02-01

    Recent advances in mantle convection modeling led to the release of a new generation of convection codes, able to self-consistently generate plate-like tectonics at their surface. Those models physically link mantle dynamics to surface tectonics. Combined with plate tectonic reconstructions, they have the potential to produce a new generation of mantle circulation models that use data assimilation methods and where uncertainties in plate tectonic reconstructions are taken into account. We provided a proof of this concept by applying a suboptimal Kalman filter to the reconstruction of mantle circulation (Bocher et al., 2016). Here, we propose to go one step further and apply the ensemble Kalman filter (EnKF) to this problem. The EnKF is a sequential Monte Carlo method particularly adapted to solving high-dimensional data assimilation problems with nonlinear dynamics. We tested the EnKF using synthetic observations consisting of surface velocity and heat flow measurements on a 2-D spherical annulus model and compared it with the method developed previously. The EnKF performs on average better and is more stable than the former method. Fewer than 300 ensemble members are sufficient to reconstruct an evolution. We use covariance adaptive inflation and localization to correct for sampling errors. We show that the EnKF results are robust over a wide range of covariance localization parameters. The reconstruction is associated with an estimation of the error, and provides valuable information on where the reconstruction is to be trusted or not.
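The analysis step of a stochastic EnKF (with perturbed observations) can be sketched as follows. This is the generic textbook update, without the covariance inflation and localization the study adds, and all dimensions and values are illustrative.

```python
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_std, rng):
    """Stochastic EnKF analysis step with perturbed observations.
    ensemble: (n_members, n_state); H: (n_obs, n_state) observation operator."""
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)                 # anomalies
    P = X.T @ X / (n - 1)                                # sample covariance
    R = obs_std**2 * np.eye(H.shape[0])                  # observation error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
    y_pert = y_obs + rng.normal(0.0, obs_std, size=(n, H.shape[0]))
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# Toy example: scalar state observed directly, prior ensemble far from the data.
rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, size=(200, 1))
posterior = enkf_update(prior, np.array([[1.0]]), np.array([5.0]), 0.1, rng)
```

Because the gain is built from the sample covariance, spurious long-range correlations appear at small ensemble sizes, which is exactly what the inflation and localization in the study correct for.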

  10. Current Development Status of an Integrated Tool for Modeling Quasi-static Deformation in the Solid Earth

    NASA Astrophysics Data System (ADS)

    Williams, C. A.; Dicaprio, C.; Simons, M.

    2003-12-01

    With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.

  11. Integration of Plant Metabolomics Data with Metabolic Networks: Progresses and Challenges.

    PubMed

    Töpfer, Nadine; Seaver, Samuel M D; Aharoni, Asaph

    2018-01-01

    In the last decade, plant genome-scale modeling has developed rapidly and modeling efforts have advanced from representing metabolic behavior of plant heterotrophic cell suspensions to studying the complex interplay of cell types, tissues, and organs. A crucial driving force for such developments is the availability and integration of "omics" data (e.g., transcriptomics, proteomics, and metabolomics) which enable the reconstruction, extraction, and application of context-specific metabolic networks. In this chapter, we demonstrate a workflow to integrate gas chromatography coupled to mass spectrometry (GC-MS)-based metabolomics data of tomato fruit pericarp (flesh) tissue, at five developmental stages, with a genome-scale reconstruction of tomato metabolism. This method allows for the extraction of context-specific networks reflecting changing activities of metabolic pathways throughout fruit development and maturation.

  12. 2D image of local density and magnetic fluctuations from line-integrated interferometry-polarimetry measurements.

    PubMed

    Lin, L; Ding, W X; Brower, D L

    2014-11-01

    Combined polarimetry-interferometry capability permits simultaneous measurement of line-integrated density and Faraday effect with fast time response (∼1 μs) and high sensitivity. Faraday effect fluctuations with phase shift of order 0.05° associated with global tearing modes are resolved with an uncertainty ∼0.01°. For physics investigations, local density fluctuations are obtained by inverting the line-integrated interferometry data. The local magnetic and current density fluctuations are then reconstructed using a parameterized fit of the polarimetry data. Reconstructed 2D images of density and magnetic field fluctuations in a poloidal cross section exhibit significantly different spatial structure. Combined with their relative phase, the magnetic-fluctuation-induced particle transport flux and its spatial distribution are resolved.

  13. 2D image of local density and magnetic fluctuations from line-integrated interferometry-polarimetry measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, L., E-mail: lianglin@ucla.edu; Ding, W. X.; Brower, D. L.

    2014-11-15

    Combined polarimetry-interferometry capability permits simultaneous measurement of line-integrated density and Faraday effect with fast time response (∼1 μs) and high sensitivity. Faraday effect fluctuations with phase shift of order 0.05° associated with global tearing modes are resolved with an uncertainty ∼0.01°. For physics investigations, local density fluctuations are obtained by inverting the line-integrated interferometry data. The local magnetic and current density fluctuations are then reconstructed using a parameterized fit of the polarimetry data. Reconstructed 2D images of density and magnetic field fluctuations in a poloidal cross section exhibit significantly different spatial structure. Combined with their relative phase, the magnetic-fluctuation-induced particle transport flux and its spatial distribution are resolved.

  14. Regularized iterative integration combined with non-linear diffusion filtering for phase-contrast x-ray computed tomography.

    PubMed

    Burger, Karin; Koehler, Thomas; Chabior, Michael; Allner, Sebastian; Marschner, Mathias; Fehringer, Andreas; Willner, Marian; Pfeiffer, Franz; Noël, Peter

    2014-12-29

    Phase-contrast x-ray computed tomography has a high potential to become clinically implemented because of its complementarity to conventional absorption contrast. In this study, we investigate noise-reducing but resolution-preserving analytical reconstruction methods to improve differential phase-contrast imaging. We apply the non-linear Perona-Malik filter on phase-contrast data prior to or after filtered-backprojection reconstruction. Secondly, the Hilbert kernel is replaced by regularized iterative integration followed by ramp-filtered backprojection as used for absorption-contrast imaging. Combining the Perona-Malik filter with this integration algorithm successfully reveals relevant sample features, quantitatively confirmed by significantly increased structural similarity indices and contrast-to-noise ratios. With this concept, phase-contrast imaging can be performed at a considerably lower dose.
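A minimal sketch of the Perona-Malik filter applied in the study: a standard explicit 4-neighbour scheme with an exponential edge-stopping function, which smooths weak (noise-scale) gradients while preserving strong edges. The parameter values are illustrative, not those used in the paper.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.5, dt=0.2):
    """Perona-Malik anisotropic diffusion (2D, 4-neighbour explicit scheme)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences to the four neighbours, with zero-flux borders
        dn = np.roll(u, 1, 0) - u;  dn[0] = 0
        ds = np.roll(u, -1, 0) - u; ds[-1] = 0
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0
        # edge-stopping conductance g = exp(-(|grad u| / kappa)^2):
        # near-zero across strong edges, near-one in smooth noisy regions
        u += dt * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u
```

Usage: for stability keep `dt` at or below 0.25; `kappa` sets the gradient magnitude above which structures are treated as edges rather than noise.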

  15. Effect of automated tube voltage selection, integrated circuit detector and advanced iterative reconstruction on radiation dose and image quality of 3rd generation dual-source aortic CT angiography: An intra-individual comparison.

    PubMed

    Mangold, Stefanie; De Cecco, Carlo N; Wichmann, Julian L; Canstein, Christian; Varga-Szemes, Akos; Caruso, Damiano; Fuller, Stephen R; Bamberg, Fabian; Nikolaou, Konstantin; Schoepf, U Joseph

    2016-05-01

    To compare, on an intra-individual basis, the effect of automated tube voltage selection (ATVS), integrated circuit detector and advanced iterative reconstruction on radiation dose and image quality of aortic CTA studies using 2nd and 3rd generation dual-source CT (DSCT). We retrospectively evaluated 32 patients who had undergone CTA of the entire aorta with both 2nd generation DSCT at 120kV using filtered back projection (FBP) (protocol 1) and 3rd generation DSCT using ATVS, an integrated circuit detector and advanced iterative reconstruction (protocol 2). Contrast-to-noise ratio (CNR) was calculated. Image quality was subjectively evaluated using a five-point scale. Radiation dose parameters were recorded. All studies were considered of diagnostic image quality. CNR was significantly higher with protocol 2 (15.0±5.2 vs 11.0±4.2; p<.0001). Subjective image quality analysis revealed no significant differences for evaluation of attenuation (p=0.08501) but image noise was rated significantly lower with protocol 2 (p=0.0005). Mean tube voltage and effective dose were 94.7±14.1kV and 6.7±3.9mSv with protocol 2, versus 120±0kV and 11.5±5.2mSv with protocol 1 (both p<0.0001). Aortic CTA performed with 3rd generation DSCT, ATVS, integrated circuit detector, and advanced iterative reconstruction allows a substantial reduction of radiation exposure while improving image quality in comparison to 120kV imaging with FBP. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
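For reference, a contrast-to-noise ratio of the kind reported above is commonly computed as the attenuation difference between a vessel ROI and a background ROI, divided by the background noise. CNR definitions vary between studies, so this sketch uses one common convention rather than necessarily the paper's exact formula:

```python
import numpy as np

def cnr(roi_vessel, roi_background):
    """Contrast-to-noise ratio: attenuation difference between a vessel ROI
    and a background ROI, normalised by the background standard deviation."""
    contrast = np.mean(roi_vessel) - np.mean(roi_background)
    noise = np.std(roi_background)
    return contrast / noise
```

With ROI values in Hounsfield units, a higher CNR at lower effective dose is the combined effect the study attributes to ATVS, the integrated circuit detector, and iterative reconstruction.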

  16. Simultaneous CT-MRI Reconstruction for Constrained Imaging Geometries using Structural Coupling and Compressive Sensing

    PubMed Central

    Xi, Yan; Zhao, Jun; Bennett, James R.; Stacy, Mitchel R.; Sinusas, Albert J.; Wang, Ge

    2016-01-01

    Objective A unified reconstruction framework is presented for simultaneous CT-MRI reconstruction. Significance Combined CT-MRI imaging has the potential for improved results in existing preclinical and clinical applications, as well as opening novel research directions for future applications. Methods In an ideal CT-MRI scanner, CT and MRI acquisitions would occur simultaneously, and hence would be inherently registered in space and time. Alternatively, separately acquired CT and MRI scans can be fused to simulate an instantaneous acquisition. In this study, structural coupling and compressive sensing techniques are combined to unify CT and MRI reconstructions. A bidirectional image estimation method was proposed to connect images from different modalities. Hence, CT and MRI data serve as prior knowledge to each other for better CT and MRI image reconstruction than could be achieved with separate reconstructions. Results Our integrated reconstruction methodology is demonstrated with numerical-phantom and real-dataset experiments, and has yielded promising results. PMID:26672028
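The compressive-sensing component of such frameworks can be illustrated with the standard iterative soft-thresholding algorithm (ISTA) for a sparse linear inverse problem. This is a generic building block, not the paper's bidirectional structural-coupling scheme:

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=200):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)            # gradient of the data-fit term
        z = x - g / L                    # gradient step
        # Soft threshold: promotes sparsity in the reconstruction.
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

In a joint CT-MRI setting, `A` would be the system matrix of one modality and the sparsifying prior would additionally be informed by the other modality's image; here it is plain L1 sparsity for clarity.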

  17. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. The device of an array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known-public-key attack is also provided.
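The intensity transmittance deduced from the Jones matrix formalism can be sketched as follows; this reproduces only the textbook polarizer algebra, not the paper's full truncation and reconstruction scheme:

```python
import numpy as np

def polarizer_jones(theta):
    """Jones matrix of an ideal linear polarizer with transmission axis at theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]], dtype=complex)

def transmittance(E, theta):
    """Intensity transmittance of a Jones vector E through a linear polarizer."""
    E_out = polarizer_jones(theta) @ E
    return np.vdot(E_out, E_out).real / np.vdot(E, E).real
```

Measuring the transmittance at two polarizer angles constrains the elliptical polarization state, which is the role of the two intensity measurements in the reconstruction described above.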

  18. Tomo3D 2.0--exploitation of advanced vector extensions (AVX) for 3D reconstruction.

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-02-01

    Tomo3D is a program for fast tomographic reconstruction on multicore computers. Its high speed stems from code optimization, vectorization with Streaming SIMD Extensions (SSE), multithreading and optimization of disk access. Recently, Advanced Vector eXtensions (AVX) have been introduced in the x86 processor architecture. Compared to SSE, AVX doubles the number of simultaneous operations, thus pointing to a potential twofold gain in speed. However, in practice, achieving this potential is extremely difficult. Here, we provide a technical description and an assessment of the optimizations included in Tomo3D to take advantage of AVX instructions. Tomo3D 2.0 allows huge reconstructions to be calculated on standard computers in a matter of minutes. Thus, it will be a valuable tool for electron tomography studies with increasing resolution needs. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Kinetic equilibrium reconstruction for the NBI- and ICRH-heated H-mode plasma on EAST tokamak

    NASA Astrophysics Data System (ADS)

    Zhen, ZHENG; Nong, XIANG; Jiale, CHEN; Siye, DING; Hongfei, DU; Guoqiang, LI; Yifeng, WANG; Haiqing, LIU; Yingying, LI; Bo, LYU; Qing, ZANG

    2018-04-01

    Equilibrium reconstruction is important for studying physical processes in tokamak plasmas. To analyze the contribution of fast ions to the equilibrium, the kinetic equilibria at two time-slices in a typical H-mode discharge with different auxiliary heatings are reconstructed using magnetic diagnostics, kinetic diagnostics and the TRANSP code. It is found that the fast-ion pressure can be up to one-third of the plasma pressure, and its contribution is concentrated in the core plasma because the neutral beam injection power is primarily deposited in the core region. The fast-ion current likewise contributes mainly in the core region while contributing little to the pedestal current. A steep pressure gradient in the pedestal is observed, which gives rise to a strong edge current. These results show that fast-ion effects cannot be ignored and should be considered in future studies of EAST.

  20. Experimental quantum compressed sensing for a seven-qubit system

    PubMed Central

    Riofrío, C. A.; Gross, D.; Flammia, S. T.; Monz, T.; Nigg, D.; Blatt, R.; Eisert, J.

    2017-01-01

    Well-controlled quantum devices with their increasing system size face a new roadblock hindering further development of quantum technologies. The effort of quantum tomography, the reconstruction of the states and processes of a quantum device, scales unfavourably: state-of-the-art systems can no longer be fully characterized. Quantum compressed sensing mitigates this problem by reconstructing states from incomplete data. Here we present an experimental implementation of compressed tomography of a seven-qubit system: a topological colour code prepared in a trapped-ion architecture. We operate in a regime that is both highly incomplete (127 Pauli basis measurement settings) and highly noisy (100 repetitions each). Originally, compressed sensing was advocated for states with few non-zero eigenvalues. We argue that low-rank estimates are appropriate in general, since statistical noise enables reliable reconstruction of only the leading eigenvectors. The remaining eigenvectors behave consistently with a random-matrix model that carries no information about the true state. PMID:28513587
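The low-rank estimation idea can be sketched as truncating a noisy Hermitian estimate to its leading eigenvectors and renormalising the trace. This is the generic rank-truncation step, not the paper's full compressed-sensing estimator:

```python
import numpy as np

def low_rank_state(rho_noisy, rank):
    """Project a noisy Hermitian estimate onto rank-r density matrices:
    keep the leading eigenvectors, clip negative weights, renormalise."""
    rho = 0.5 * (rho_noisy + rho_noisy.conj().T)   # enforce Hermiticity
    vals, vecs = np.linalg.eigh(rho)               # eigenvalues ascending
    vals, vecs = vals[::-1], vecs[:, ::-1]         # reorder to descending
    vals = np.clip(vals[:rank], 0.0, None)         # keep leading, drop negatives
    vals /= vals.sum()                             # unit trace
    return (vecs[:, :rank] * vals) @ vecs[:, :rank].conj().T
```

The discarded trailing eigenvectors are exactly the part the abstract argues is dominated by statistical noise and carries no information about the true state.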
