Sample records for reconstructed process facility

  1. 40 CFR 60.560 - Applicability and designation of affected facilities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... section in a polypropylene or polyethylene production process is a potential affected facility for both... constructed, modified, or reconstructed and, in some instances, on the type of production process. (i) The... reconstructed after January 10, 1989, regardless of the type of production process being used, is January 10...

  2. 40 CFR 60.560 - Applicability and designation of affected facilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... section in a polypropylene or polyethylene production process is a potential affected facility for both... constructed, modified, or reconstructed and, in some instances, on the type of production process. (i) The... reconstructed after January 10, 1989, regardless of the type of production process being used, is January 10...

  3. 17 CFR 37.406 - Trade reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Trade reconstruction. 37.406 Section 37.406 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP EXECUTION FACILITIES Monitoring of Trading and Trade Processing § 37.406 Trade reconstruction. The swap execution...

  4. Synthetic Fiber Production Facilities: New Source Performance Standards (NSPS)

    EPA Pesticide Factsheets

    These standards limit emissions of volatile organic compounds (VOC) from new and reconstructed synthetic fiber production facilities that use solvent-spinning processes. Includes rule history and summary.

  5. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  6. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  7. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  8. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  9. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    NASA Astrophysics Data System (ADS)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used here for active-learning purposes: students are helped to understand how physical processes unfold and what kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.
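    The DIC step described above amounts to tracking a reference patch between frames by maximising a similarity score. A minimal sketch using zero-normalised cross-correlation over an integer search window (the function name, patch size and search radius are illustrative assumptions; the authors' actual open-source tooling is not specified in the record):

```python
import numpy as np

def ncc_displacement(ref, cur, y, x, half=8, search=5):
    """Track the patch centred at (y, x) in `ref` by maximising
    zero-normalised cross-correlation over a small integer search
    window in `cur`. Returns the (row, col) displacement in pixels."""
    tpl = ref[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    tpl = tpl - tpl.mean()
    best, disp = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[y + dy - half:y + dy + half + 1,
                      x + dx - half:x + dx + half + 1].astype(float)
            win = win - win.mean()
            denom = np.sqrt((tpl ** 2).sum() * (win ** 2).sum())
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (tpl * win).sum() / denom
            if score > best:
                best, disp = score, (dy, dx)
    return disp
```

    Dividing displacement by the inter-frame time then gives the surface point velocities the abstract mentions; production DIC codes additionally refine the peak to sub-pixel accuracy.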

  10. 40 CFR Table 1 to Subpart Lllll of... - Emission Limitations

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... must meet the following emission limitation— 1. Each blowing still, Group 1 asphalt loading rack, and Group 1 asphalt storage tank at existing, new, and reconstructed asphalt processing facilities; and each Group 1 asphalt storage tank at existing, new, and reconstructed roofing manufacturing lines; and each...

  11. 40 CFR Table 1 to Subpart Lllll of... - Emission Limitations

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... must meet the following emission limitation— 1. Each blowing still, Group 1 asphalt loading rack, and Group 1 asphalt storage tank at existing, new, and reconstructed asphalt processing facilities; and each Group 1 asphalt storage tank at existing, new, and reconstructed roofing manufacturing lines; and each...

  12. RF tomography of metallic objects in free space: preliminary results

    NASA Astrophysics Data System (ADS)

    Li, Jia; Ewing, Robert L.; Berdanier, Charles; Baker, Christopher

    2015-05-01

    RF tomography has great potential in defense and homeland security applications. A distributed sensing research facility is under development at the Air Force Research Lab. To develop an RF tomographic imaging system for the facility, preliminary experiments were performed in an indoor range with 12 radar sensors distributed on a circle of 3 m radius. Ultra-wideband pulses are used to illuminate single and multiple metallic targets. The echoes received by the distributed sensors were processed and combined for tomographic reconstruction. The traditional matched-filter algorithm and the truncated singular value decomposition (SVD) algorithm are compared in terms of their complexity, accuracy, and suitability for distributed processing. A new algorithm is proposed for shape reconstruction, which jointly estimates the object boundary and the scatter points on the waveform's propagation path. The results show that the new algorithm allows accurate reconstruction of object shape, which is not achievable with the matched-filter and truncated-SVD algorithms.

  13. Reconstruction of 3d Objects of Assets and Facilities by Using Benchmark Points

    NASA Astrophysics Data System (ADS)

    Baig, S. U.; Rahman, A. A.

    2013-08-01

    Acquiring and modeling 3D geo-data of building assets and facility objects remains a challenge. A number of methods and technologies are utilized for this purpose; total station, GPS, photogrammetry and terrestrial laser scanning are a few of them. In this paper, points commonly shared by potential facades of assets and facilities modeled from point clouds are identified. These points are useful in the modeling process for reconstructing 3D models of assets and facilities, which are stored for management purposes. These models are segmented along different planes to produce accurate 2D plans. This novel method improves the efficiency and quality of constructing models of assets and facilities, with the aim of utilizing them in 3D management projects such as the maintenance of buildings or of groups of items that need to be replaced or renovated for new services.

  14. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze and evaluate clinical processes and, further, to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independently of the healthcare information system and to execute them in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, as well as in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.

  15. Sustainable data policy for a data production facility: a work in (continual) progress

    NASA Astrophysics Data System (ADS)

    Ketcham, R. A.

    2017-12-01

    The University of Texas High-Resolution X-Ray Computed Tomography Facility (UTCT) has been producing volumetric data and data products of geological and other scientific specimens and engineering materials for over 20 years. Data volumes, both in terms of the size of individual data sets and overall facility production, have progressively grown and fluctuated near the upper boundary of what can be managed by contemporary workstations and lab-scale servers and network infrastructure, making data policy a preoccupation for our entire history. Although all projects have been archived since our first day of operation, policies on which data to keep (raw, reconstructed after corrections, processed) have varied, and have been periodically revisited in consideration of the cost of curation and the likelihood of revisiting and reprocessing data when better techniques become available, such as improved artifact corrections or iterative tomographic reconstruction. Advances in instrumentation regularly make old data obsolete and more advantageous to reacquire, but the simple act of getting a sample to a scanning facility is a practical barrier that cannot be overlooked. In our experience, the main times that raw data have been revisited using improved processing to improve image quality were predictable, high-impact charismatic projects (e.g., Archaeopteryx, A. afarensis "Lucy"). These cases actually provided the impetus for development of the new techniques (ring and beam hardening artifact reduction), which were subsequently incorporated into our data processing pipeline going forward but were rarely if ever retroactively applied to earlier data sets. The only other times raw data have been reprocessed were when reconstruction parameters were inappropriate, due to unnoticed sample features or human error, which are usually recognized fairly quickly. The optimal data retention policy thus remains an open question, although erring on the side of caution remains the default position.

  16. A high-throughput system for high-quality tomographic reconstruction of large datasets at Diamond Light Source

    PubMed Central

    Atwood, Robert C.; Bodey, Andrew J.; Price, Stephen W. T.; Basham, Mark; Drakopoulos, Michael

    2015-01-01

    Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an ‘orthogonal’ fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and ‘facility-independent’: it can run on standard cluster infrastructure at any institution. PMID:25939626

  17. STAR Data Reconstruction at NERSC/Cori, an adaptable Docker container approach for HPC

    NASA Astrophysics Data System (ADS)

    Mustafa, Mustafa; Balewski, Jan; Lauret, Jérôme; Porter, Jefferson; Canon, Shane; Gerhardt, Lisa; Hajdu, Levente; Lukascsyk, Mark

    2017-10-01

    As HPC facilities grow their resources, adaptation of classic HEP/NP workflows becomes a necessity. Linux containers may well offer a way to lower the bar to exploiting such resources and, at the same time, help collaborations reach vast elastic resources on such facilities to address their massive current and future data processing challenges. In this proceeding, we showcase the STAR data reconstruction workflow on the Cori HPC system at NERSC. The STAR software is packaged in a Docker image and runs at Cori in Shifter containers. We highlight two of the typical end-to-end optimization challenges for such pipelines: 1) the data transfer rate, carried over ESnet after optimizing the end points, and 2) scalable deployment of a conditions database in an HPC environment. Our tests demonstrate equally efficient data processing workflows on Cori/HPC, comparable to standard Linux clusters.

  18. 40 CFR 63.1159 - Operational and equipment standards for existing, new, or reconstructed sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Pollutants for Steel Pickling-HCl Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1159... regeneration plant. The owner or operator of an affected plant must operate the affected plant at all times...

  19. 40 CFR 63.1159 - Operational and equipment standards for existing, new, or reconstructed sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants for Steel Pickling-HCl Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1159... regeneration plant. The owner or operator of an affected plant must operate the affected plant at all times...

  20. 40 CFR 63.1159 - Operational and equipment standards for existing, new, or reconstructed sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Pollutants for Steel Pickling-HCl Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1159... regeneration plant. The owner or operator of an affected plant must operate the affected plant at all times...

  1. Use of a corrugated beam pipe as a passive deflector for bunch length measurements

    NASA Astrophysics Data System (ADS)

    Seok, Jimin; Chung, Moses; Kang, Heung-Sik; Min, Chang-Ki; Na, Donghyun

    2018-02-01

    We report the experimental demonstration of bunch length measurements using a corrugated metallic beam pipe as a passive deflector. The corrugated beam pipe has been adopted for reducing longitudinal chirping after the bunch compressors in several XFEL facilities worldwide. In the meantime, there have been attempts to measure the electron bunch's longitudinal current profile using the dipole wakefields generated in the corrugated pipe. Nevertheless, the bunch shape reconstructed from the nonlinearly deflected beam suffers from significant distortion, particularly near the head of the bunch. In this paper, we introduce an iterative process to improve the resolution of the bunch shape reconstruction. The astra and elegant simulations have been performed for pencil beam and cigar beam cases, in order to verify the effectiveness of the reconstruction process. To overcome the undesirable effects of transverse beam spreads, a measurement scheme involving both the corrugated beam pipe and the spectrometer magnet has been employed, neither of which requires a dedicated (and likely very expensive) rf system. A proof-of-principle experiment was carried out at the Pohang Accelerator Laboratory (PAL) Injector Test Facility (ITF), and its results are discussed together with a comparison with the rf deflector measurement.

  2. Evaluation Aspects of Building Structures Reconstructed After a Failure or Catastrophe

    NASA Astrophysics Data System (ADS)

    Krentowski, Janusz R.; Knyziak, Piotr

    2017-10-01

    The article presents the characteristics of several steel structures, among others a modernized industrial dye house, a school sports hall, and a truck repair workshop, that have been rebuilt after a disaster or a catastrophe. The structures were analyzed in detail, and the evaluation and reconstruction processes were described. The emergencies that occurred during the service life of the buildings were the result of multiple mistakes: incorrectly defined intervals between inspections, errors during periodic inspections, and incorrect repair-work recommendations. The concepts of reinforcement work implemented by the authors, enabling long-term failure-free operation of the structures, are presented. Recommendations for monitoring the facilities after reinforcement or reconstruction have been formulated. The methodology for conducting specialized investigations, such as geodetic, optical, geological, and chemical strength tests, both destructive and non-destructive, has been defined. The need to determine limit values of deformations, deflections, damage and other faults of structural elements and of entire rebuilt facilities, as well as to define conditions for withdrawing objects from operation in subsequent exceptional situations, was indicated.

  3. 40 CFR 270.72 - Changes during interim status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reconstruction of the hazardous waste management facility. Reconstruction occurs when the capital investment in the changes to the facility exceeds 50 percent of the capital cost of a comparable entirely new...
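    The excerpt defines "reconstruction" under interim status by a simple arithmetic threshold, which can be stated directly (a sketch only; the function name is mine, and the full regulation governs what counts as capital investment):

```python
def is_reconstruction(capital_investment, comparable_new_cost):
    """Per the 40 CFR 270.72 excerpt above: changes amount to
    'reconstruction' when the capital investment in the changes
    exceeds 50 percent of the capital cost of a comparable
    entirely new facility."""
    return capital_investment > 0.5 * comparable_new_cost
```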

  4. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    PubMed

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line use) and for a new reconstruction of past archived data at user's home institution where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
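    The "essential computational steps of flat fielding and filtered back projection" mentioned above refer to standard pre-processing: normalise each raw projection by dark- and flat-field images, then apply the Beer-Lambert log transform before FBP. A minimal sketch of the flat-fielding step (the function name and epsilon guard are my own, not taken from STP):

```python
import numpy as np

def flat_field(raw, flat, dark, eps=1e-6):
    """Flat-field (white-field) normalisation of a projection image,
    followed by the negative-log transform that converts transmission
    to line integrals of attenuation, the input expected by FBP."""
    trans = (raw - dark) / np.maximum(flat - dark, eps)  # transmission in (0, 1]
    return -np.log(np.clip(trans, eps, None))
```

    The artifact-compensation steps the abstract alludes to (ring removal, phase retrieval, etc.) would sit between this normalisation and the back projection in a custom STP workflow.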

  5. Online Tracking Algorithms on GPUs for the P̅ANDA Experiment at FAIR

    NASA Astrophysics Data System (ADS)

    Bianchi, L.; Herten, A.; Ritman, J.; Stockmanns, T.; Adinetz, A.; Kraus, J.; Pleiter, D.

    2015-12-01

    P̅ANDA is a future hadron and nuclear physics experiment at the FAIR facility under construction in Darmstadt, Germany. In contrast to the majority of current experiments, P̅ANDA's strategy for data acquisition is based on event reconstruction from free-streaming data, performed in real time entirely by software algorithms using global detector information. This paper reports the status of the development of algorithms for the reconstruction of charged-particle tracks, optimized for online data processing applications, using general-purpose graphics processing units (GPUs). Two track-finding algorithms, the Triplet Finder and the Circle Hough, are described, and details of their GPU implementations are highlighted. Average track reconstruction times of less than 100 ns are obtained by running the Triplet Finder on state-of-the-art GPU cards. In addition, a proof-of-concept system for dispatching data to the tracking algorithms using message queues is presented.

  6. Scientific approach and practical experience for reconstruction of waste water treatment plants in Russia

    NASA Astrophysics Data System (ADS)

    Makisha, Nikolay; Gogina, Elena

    2017-11-01

    Protection of water bodies strictly depends on the reliable operation of engineering systems and facilities for water supply and sewage. The majority of these plants and stations were constructed in the 1970s-1980s in accordance with the rules and regulations of that time, so most of them now require reconstruction due to serious physical and/or technological wear. The current condition of water supply and sewage systems and facilities is frequently a hidden source of serious danger for normal life support and the ecological safety of cities and towns. The article describes the experience obtained and modern approaches to the reconstruction of waste water and sludge treatment plants that have proved their efficiency even when applied under constraints such as limited area or limited investment. The main directions of reconstruction are: overhaul repair and partial modernization of existing facilities on the basis of the initial project; restoration and modernization of existing systems on the basis of current documents and their current condition; upgrading the performance of waste water treatment plants (WWTPs) on the basis of modern technologies and methods; and reconstruction of sewage systems and facilities with improvement of treatment quality.

  7. AOF LTAO mode: reconstruction strategy and first test results

    NASA Astrophysics Data System (ADS)

    Oberti, Sylvain; Kolb, Johann; Le Louarn, Miska; La Penna, Paolo; Madec, Pierre-Yves; Neichel, Benoit; Sauvage, Jean-François; Fusco, Thierry; Donaldson, Robert; Soenke, Christian; Suárez Valles, Marcos; Arsenault, Robin

    2016-07-01

    GALACSI is the Adaptive Optics (AO) system serving the instrument MUSE in the framework of the Adaptive Optics Facility (AOF) project. Its Narrow Field Mode (NFM) is a Laser Tomography AO (LTAO) mode delivering high resolution in the visible across a small Field of View (FoV) of 7.5" diameter around the optical axis. From a reconstruction standpoint, GALACSI NFM intends to optimize the correction on axis by estimating the turbulence in volume via a tomographic process, then projecting the turbulence profile onto one single Deformable Mirror (DM) located in the pupil, close to the ground. In this paper, the laser tomographic reconstruction process is described. Several methods (virtual DM, virtual layer projection) are studied, under the constraint of a single matrix vector multiplication. The pseudo-synthetic interaction matrix model and the LTAO reconstructor design are analysed. Moreover, the reconstruction parameter space is explored, in particular the regularization terms. Furthermore, we present here the strategy to define the modal control basis and split the reconstruction between the Low Order (LO) loop and the High Order (HO) loop. Finally, closed loop performance obtained with a 3D turbulence generator will be analysed with respect to the most relevant system parameters to be tuned.

  8. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    NASA Astrophysics Data System (ADS)

    Guo, J.; Bücherl, T.; Zou, Y.; Guo, Z.

    2011-09-01

    Investigations on the fast neutron beam geometry for the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane assuming the area source to be set up of large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, realized and verified by both simulated and measured projection data. The feasibility for improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.
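    The iterative algebraic reconstruction the abstract develops is, in its simplest textbook form, the Kaczmarz/ART scheme: cycle through the projection equations and correct the current image estimate along each one. A minimal NumPy sketch under that assumption (the dense system matrix and relaxation parameter are illustrative, not the NECTAR implementation, which must also model the Boltzmann-shaped beam profile):

```python
import numpy as np

def kaczmarz(A, b, n_iter=200, relax=1.0):
    """Iterative algebraic reconstruction (Kaczmarz/ART).
    A: (n_rays, n_pixels) system matrix of ray-pixel weights.
    b: (n_rays,) measured projection data.
    Each pass projects the estimate onto every row's hyperplane
    A[i] @ x = b[i], scaled by the relaxation factor."""
    x = np.zeros(A.shape[1])
    row_norm2 = (A ** 2).sum(axis=1)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norm2[i] == 0:
                continue  # empty ray, nothing to correct
            x += relax * (b[i] - A[i] @ x) / row_norm2[i] * A[i]
    return x
```

    For consistent (noise-free) data the sweeps converge to a solution of the system; with measured data, relax < 1 and early stopping are commonly used to damp noise amplification.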

  9. Development of digital reconstructed radiography software at new treatment facility for carbon-ion beam scanning of National Institute of Radiological Sciences.

    PubMed

    Mori, Shinichiro; Inaniwa, Taku; Kumagai, Motoki; Kuwae, Tsunekazu; Matsuzaki, Yuka; Furukawa, Takuji; Shirai, Toshiyuki; Noda, Koji

    2012-06-01

    To increase the accuracy of carbon-ion beam scanning therapy, we have developed a graphical user interface-based digitally reconstructed radiograph (DRR) software system for use in routine clinical practice at our center. The DRR software is used in particular scenarios in the new treatment facility to achieve the same level of geometrical accuracy at treatment as at the imaging session. The DRR calculation is implemented simply as the summation of CT image voxel values along the X-ray projection ray. Since we implemented graphics processing unit-based computation, the DRR images are calculated with a speed sufficient for the clinical practice requirements. Since high-spatial-resolution flat panel detector (FPD) images must be registered to the reference DRR images during patient setup in all scenarios, the DRR images also need a spatial resolution close to that of the FPD images. To overcome the limitation imposed by the CT voxel size, we applied image processing to improve the spatial resolution of the calculated DRRs. The DRR software introduced here enables patient positioning with sufficient accuracy for the implementation of carbon-ion beam scanning therapy at our center.
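    The abstract describes the DRR calculation as simply the summation of CT voxel values along the projection ray. A toy parallel-beam sketch of that idea (the function name is mine, and using a volume axis as the ray direction is a simplifying assumption; the actual system uses GPU-based projection geometry and extra resolution-enhancing image processing):

```python
import numpy as np

def drr_parallel(ct, axis=0):
    """Toy digitally reconstructed radiograph: sum CT voxel values
    along parallel rays, with one volume axis standing in for the
    X-ray projection direction. Returns a 2D image whose pixel
    values are the ray sums through the volume."""
    ct = np.asarray(ct, dtype=float)
    return ct.sum(axis=axis)
```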

  10. Scramjet test flow reconstruction for a large-scale expansion tube, Part 1: quasi-one-dimensional modelling

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Jacobs, P. A.; Morgan, R. G.; Chan, W. Y. K.; Gollan, R. J.

    2018-07-01

    Large-scale free-piston driven expansion tubes have uniquely high total pressure capabilities which make them an important resource for development of access-to-space scramjet engine technology. However, many aspects of their operation are complex, and their test flows are fundamentally unsteady and difficult to measure. While computational fluid dynamics methods provide an important tool for quantifying these flows, these calculations become very expensive with increasing facility size and therefore have to be carefully constructed to ensure sufficient accuracy is achieved within feasible computational times. This study examines modelling strategies for a Mach 10 scramjet test condition developed for The University of Queensland's X3 facility. The present paper outlines the challenges associated with test flow reconstruction, describes the experimental set-up for the X3 experiments, and then details the development of an experimentally tuned quasi-one-dimensional CFD model of the full facility. The 1-D model, which accurately captures longitudinal wave processes, is used to calculate the transient flow history in the shock tube. This becomes the inflow to a higher-fidelity 2-D axisymmetric simulation of the downstream facility, detailed in the Part 2 companion paper, leading to a validated, fully defined nozzle exit test flow.

  11. Scramjet test flow reconstruction for a large-scale expansion tube, Part 1: quasi-one-dimensional modelling

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Jacobs, P. A.; Morgan, R. G.; Chan, W. Y. K.; Gollan, R. J.

    2017-11-01

    Large-scale free-piston driven expansion tubes have uniquely high total pressure capabilities which make them an important resource for development of access-to-space scramjet engine technology. However, many aspects of their operation are complex, and their test flows are fundamentally unsteady and difficult to measure. While computational fluid dynamics methods provide an important tool for quantifying these flows, these calculations become very expensive with increasing facility size and therefore have to be carefully constructed to ensure sufficient accuracy is achieved within feasible computational times. This study examines modelling strategies for a Mach 10 scramjet test condition developed for The University of Queensland's X3 facility. The present paper outlines the challenges associated with test flow reconstruction, describes the experimental set-up for the X3 experiments, and then details the development of an experimentally tuned quasi-one-dimensional CFD model of the full facility. The 1-D model, which accurately captures longitudinal wave processes, is used to calculate the transient flow history in the shock tube. This becomes the inflow to a higher-fidelity 2-D axisymmetric simulation of the downstream facility, detailed in the Part 2 companion paper, leading to a validated, fully defined nozzle exit test flow.

  12. TomoBank: a tomographic data repository for computational x-ray science

    NASA Astrophysics Data System (ADS)

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; Joost Batenburg, K.; Ludwig, Wolfgang; Mancini, Lucia; Marone, Federica; Mokso, Rajmund; Pelt, Daniël M.; Sijbers, Jan; Rivers, Mark

    2018-03-01

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology have made sub-second and multi-energy tomographic data collection possible (Gibbs et al 2015 Sci. Rep. 5 11824), but have also increased the demand to develop new reconstruction methods able to handle in situ (Pelt and Batenburg 2013 IEEE Trans. Image Process. 22 5238-51) and dynamic systems (Mohan et al 2015 IEEE Trans. Comput. Imaging 1 96-111) that can be quickly incorporated in beamline production software (Gürsoy et al 2014 J. Synchrotron Radiat. 21 1188-93). The x-ray tomography data bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.

  13. Stepwise training for reconstructive microsurgery: the journey to becoming a confident microsurgeon in Singapore.

    PubMed

    Ramachandran, Savitha; Ong, Yee-Siang; Chin, Andrew Yh; Song, In-Chin; Ogden, Bryan; Tan, Bien-Keem

    2014-05-01

    Microsurgery training in Singapore began in 1980 with the opening of the Experimental Surgical Unit. Since then, the unit has continued to grow and has held microsurgical training courses biannually. The road to becoming a full-fledged reconstructive surgeon requires mastery of both microvascular and flap-raising techniques, as well as time, patience and good training facilities. In Singapore, over the past two decades, we have had the opportunity to develop good training facilities and to refine our surgical education programmes in reconstructive microsurgery. In this article, we share our experience with training in reconstructive microsurgery.

  14. Stepwise Training for Reconstructive Microsurgery: The Journey to Becoming a Confident Microsurgeon in Singapore

    PubMed Central

    Ong, Yee-Siang; Chin, Andrew YH; Song, In-Chin; Ogden, Bryan; Tan, Bien-Keem

    2014-01-01

    Microsurgery training in Singapore began in 1980 with the opening of the Experimental Surgical Unit. Since then, the unit has continued to grow and has held microsurgical training courses biannually. The road to becoming a full-fledged reconstructive surgeon requires mastery of both microvascular and flap-raising techniques, as well as time, patience and good training facilities. In Singapore, over the past two decades, we have had the opportunity to develop good training facilities and to refine our surgical education programmes in reconstructive microsurgery. In this article, we share our experience with training in reconstructive microsurgery. PMID:24883269

  15. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper presents milestones of an optimal mathematical model for a business process related to cost estimate documentation compiled during the construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and deterioration of business strategy. Business process management is presented as business process modeling aimed at improving the studied business process, namely the main optimization criteria and recommendations for improving the above-mentioned business model.

  16. 40 CFR 63.8186 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... December 19, 2006. (b) If you have a new or reconstructed mercury recovery facility and its initial startup..., 2003. (c) If you have a new or reconstructed mercury recovery facility and its initial startup date is... recordkeeping and reporting requirement in this subpart that applies to you upon initial startup. (d) You must...

  17. 40 CFR 63.8186 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... December 19, 2006. (b) If you have a new or reconstructed mercury recovery facility and its initial startup..., 2003. (c) If you have a new or reconstructed mercury recovery facility and its initial startup date is... recordkeeping and reporting requirement in this subpart that applies to you upon initial startup. (d) You must...

  18. 40 CFR 63.8186 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... December 19, 2006. (b) If you have a new or reconstructed mercury recovery facility and its initial startup..., 2003. (c) If you have a new or reconstructed mercury recovery facility and its initial startup date is... recordkeeping and reporting requirement in this subpart that applies to you upon initial startup. (d) You must...

  19. 40 CFR 63.8186 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... December 19, 2006. (b) If you have a new or reconstructed mercury recovery facility and its initial startup..., 2003. (c) If you have a new or reconstructed mercury recovery facility and its initial startup date is... recordkeeping and reporting requirement in this subpart that applies to you upon initial startup. (d) You must...

  20. 40 CFR 63.8186 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... December 19, 2006. (b) If you have a new or reconstructed mercury recovery facility and its initial startup..., 2003. (c) If you have a new or reconstructed mercury recovery facility and its initial startup date is... recordkeeping and reporting requirement in this subpart that applies to you upon initial startup. (d) You must...

  1. A GPU-Based Architecture for Real-Time Data Assessment at Synchrotron Experiments

    NASA Astrophysics Data System (ADS)

    Chilingaryan, Suren; Mirone, Alessandro; Hammersley, Andrew; Ferrero, Claudio; Helfen, Lukas; Kopmann, Andreas; Rolo, Tomy dos Santos; Vagovic, Patrik

    2011-08-01

    Advances in digital detector technology are presently leading to rapidly increasing data rates in imaging experiments. With fast two-dimensional detectors in computed tomography, data acquisition can be much faster than reconstruction if no adequate measures are taken, especially when the high photon flux at synchrotron sources is used. We have optimized the reconstruction software employed at the micro-tomography beamlines of our synchrotron facilities to use the computational power of modern graphics cards. The main paradigm of our approach is the full utilization of all system resources. We use a pipelined architecture, in which the GPUs serve as compute coprocessors that reconstruct slices while the CPUs prepare the next ones. Special attention is devoted to minimizing data transfers between host and GPU memory and to executing memory transfers in parallel with the computations. We were able to reduce the reconstruction time by a factor of 30 and process a typical data set of 20 GB in 40 seconds. The time needed for the first evaluation of the reconstructed sample is reduced significantly, and quasi-real-time visualization is now possible.
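    The pipelined architecture described above, where one stage prepares the next work item while another processes the current one, can be sketched with a bounded queue between two threads. This is a minimal stand-in (our own toy, not the beamline software's API): "prepare" plays the CPU-side staging and host-to-GPU transfer, "reconstruct" plays the GPU kernel.

```python
import queue
import threading

# Minimal sketch of the pipelined pattern: one thread stages the next
# slices (CPU-side preparation / host-to-GPU transfer) while another
# processes the current one (the GPU kernel). The toy "reconstruct"
# step and all names here are illustrative assumptions.

def prepare(n_slices, q):
    for i in range(n_slices):
        q.put(i)            # e.g. filtered projections for slice i
    q.put(None)             # sentinel: no more work

def reconstruct(q, results):
    while (item := q.get()) is not None:
        results.append(item * item)   # stand-in for the backprojection kernel

q = queue.Queue(maxsize=4)  # bounded queue models limited staging buffers
results = []
producer = threading.Thread(target=prepare, args=(8, q))
consumer = threading.Thread(target=reconstruct, args=(q, results))
producer.start(); consumer.start()
producer.join(); consumer.join()
# results holds one reconstructed value per slice, in acquisition order
```

    The bounded queue is the key design choice: it lets the stages overlap while capping how much staged data sits in memory at once, mirroring the paper's goal of overlapping transfers with computation.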

  2. Disposition of elderly patients after head and neck reconstruction.

    PubMed

    Hatcher, Jeanne L; Bell, Elizabeth Bradford; Browne, J Dale; Waltonen, Joshua D

    2013-11-01

    A patient's needs at discharge, particularly the need for nursing facility placement, may affect hospital length of stay and health care costs. The association between age and disposition after microvascular reconstruction of the head and neck has yet to be reported in the literature. To determine whether elderly patients are more likely to be discharged to a nursing or other care facility as opposed to returning home after microvascular reconstruction of the head and neck. From January 1, 2001, through December 31, 2010, patients undergoing microvascular reconstruction at an academic medical center were identified and their medical records systematically reviewed. During the study period, 457 patients were identified by Current Procedural Terminology codes for microvascular free tissue transfer for a head and neck defect regardless of cause. Seven patients were excluded for inadequate data on the postoperative disposition or American Society of Anesthesiologists (ASA) score. A total of 450 were included for analysis. Demographic and surgical data were collected, including the patient age, ASA score, and postoperative length of stay. These variables were then compared between groups of patients discharged to different posthospitalization care facilities. The mean age of participants was 59.1 years. Most patients (n = 386 [85.8%]) were discharged home with or without home health services. The mean age of those discharged home was 57.5 years; discharge to home was the reference for comparison and odds ratio (OR) calculation. For those discharged to a skilled nursing facility, mean age was 67.1 years (OR, 1.055; P < .001). Mean age of those discharged to a long-term acute care facility was 71.5 years (OR, 1.092; P = .002). Length of stay also affected the disposition to a skilled nursing facility (OR, 1.098), as did the ASA score (OR, 2.988). Elderly patients are less likely to be discharged home after free flap reconstruction. 
Age, ASA score, and length of stay are independent factors for discharge to a nursing or other care facility.
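    The age odds ratios reported above are per-year effects from a logistic model, so the corresponding log-odds coefficient is ln(OR) and the effect compounds multiplicatively over an age difference. A small sketch using the reported skilled-nursing OR of 1.055 per year:

```python
import math

# Per-year odds ratio from a logistic model: beta = ln(OR), and the
# effect compounds over an age gap. Uses the reported OR of 1.055 per
# year of age for discharge to a skilled nursing facility.

or_per_year = 1.055
beta = math.log(or_per_year)           # log-odds increase per year of age
or_10_years = math.exp(10.0 * beta)    # odds ratio for a 10-year age gap
# equivalently or_per_year ** 10, about 1.71
```

    In other words, under this model a patient ten years older has roughly 1.7 times the odds of skilled-nursing discharge, holding the other covariates fixed.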

  3. Petabyte Class Storage at Jefferson Lab (CEBAF)

    NASA Technical Reports Server (NTRS)

    Chambers, Rita; Davis, Mark

    1996-01-01

    By 1997, the Thomas Jefferson National Accelerator Facility will collect over one Terabyte of raw information per day of Accelerator operation from three concurrently operating Experimental Halls. When post-processing is included, roughly 250 TB of raw and formatted experimental data will be generated each year. By the year 2000, a total of one Petabyte will be stored on-line. Critical to the experimental program at Jefferson Lab (JLab) is the networking and computational capability to collect, store, retrieve, and reconstruct data on this scale. The design criteria include support of a raw data stream of 10-12 MB/second from Experimental Hall B, which will operate the CEBAF (Continuous Electron Beam Accelerator Facility) Large Acceptance Spectrometer (CLAS). Keeping up with this data stream implies design strategies that provide storage guarantees during accelerator operation, minimize the number of times data is buffered, allow seamless access to specific data sets for the researcher, synchronize data retrievals with the scheduling of post-processing calculations on the data reconstruction CPU farms, and support the site capability to perform data reconstruction and reduction at the same overall rate at which new data is collected. The current implementation employs state-of-the-art StorageTek Redwood tape drives and a robotics library integrated with the Open Storage Manager (OSM) Hierarchical Storage Management software (Computer Associates, International), Fibre Channel RAID disks dual-ported between Sun Microsystems SMP servers, and a network-based interface to a 10,000-SPECint92 data processing CPU farm. Issues of efficiency, scalability, and manageability will become critical to meeting the year 2000 requirement for a Petabyte of near-line storage interfaced to over 30,000 SPECint92 of data processing power.
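    The quoted rates are easy to cross-check (our arithmetic, in decimal units). Sustaining one Terabyte per day works out to roughly 11.6 MB/s, consistent with the 10-12 MB/s Hall B stream, and at 250 TB per year a Petabyte accumulates in four years:

```python
# Consistency check on the rates quoted above (decimal units: 1 TB = 1e12 B).

SECONDS_PER_DAY = 86_400
rate_mb_s = 1e12 / SECONDS_PER_DAY / 1e6   # MB/s needed to sustain 1 TB/day
years_to_petabyte = 1000.0 / 250.0         # 1 PB at 250 TB per year
```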

  4. 40 CFR 63.1346 - Standards for new or reconstructed raw material dryers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Standards for new or reconstructed raw... Industry Emission Standards and Operating Limits § 63.1346 Standards for new or reconstructed raw material dryers. (a) New or reconstructed raw material dryers located at facilities that are major sources can not...

  5. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We've developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardiopulmonary studies (with single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice). The section is then reconstructed through filtered back-projection (FBP). The reconstructed images and volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion and to characterize it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on determining adequate temporal sampling and sinogram regularization techniques. At present, the simulator is limited to the multi-slice tomograph case; extension to cone-beam or area detectors is planned as the next development step.
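    The 360LI method mentioned above can be sketched in simplified form (our notation and numbers, not the simulator's implementation): for a reconstruction plane at position z_r, each projection angle has measurements one gantry rotation apart along z, and 360LI blends them linearly.

```python
# Simplified sketch of 360LI longitudinal interpolation: each projection
# angle is sampled one gantry rotation apart along z; the value at the
# reconstruction plane z_r is the linear blend of the two samples.

def interp_360li(z_r, z_lo, z_hi, p_lo, p_hi):
    """Linearly interpolate projection values measured at z_lo and z_hi."""
    w = (z_r - z_lo) / (z_hi - z_lo)
    return (1.0 - w) * p_lo + w * p_hi

# Hypothetical table feed of 10 mm per rotation: samples at z = 0 and
# 10 mm, reconstruction plane at 2.5 mm.
val = interp_360li(2.5, 0.0, 10.0, 100.0, 60.0)   # 0.75*100 + 0.25*60 = 90
```

    180LI exploits the complementary ray measured half a rotation away, halving the effective z-spacing of the samples; the blending step itself is the same.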

  6. AsteriX: a Web server to automatically extract ligand coordinates from figures in PDF articles.

    PubMed

    Lounnas, V; Vriend, G

    2012-02-27

    Coordinates describing the chemical structures of small molecules that are potential ligands for pharmaceutical targets are used at many stages of the drug design process. The coordinates of the vast majority of ligands can be obtained from either publicly accessible or commercial databases. However, interesting ligands sometimes are available only from the scientific literature, in which case their coordinates must be reconstructed manually, a process that consists of a series of time-consuming steps. We present a Web server that helps reconstruct the three-dimensional (3D) coordinates of ligands for which a two-dimensional (2D) picture is available in a PDF file. The software, called AsteriX, analyses every picture contained in the PDF file and attempts to determine automatically whether or not it contains ligands. Areas in pictures that may contain molecular structures are processed to extract connectivity and atom type information from which coordinates can subsequently be reconstructed. The AsteriX Web server was tested on a series of articles containing a large diversity of graphical representations. In total, 88% of the 3249 ligand structures present in the test set were identified as chemical diagrams. Of these, about half were interpreted correctly as 3D structures, and a further one third required only minor manual corrections. It is impossible in principle to always reconstruct 3D coordinates correctly from pictures, because there are many different protocols for drawing a 2D image of a ligand and, more importantly, a wide variety of semantic annotations are possible. The AsteriX Web server therefore includes facilities that allow users to augment partial or partially correct 3D reconstructions. All 3D reconstructions are submitted, checked, and corrected by the users at the server and are freely available to everybody. The coordinates of the reconstructed ligands are made available in a series of formats commonly used in drug design research. The AsteriX Web server is freely available at http://swift.cmbi.ru.nl/bitmapb/.

  7. Facility cost analysis in outpatient plastic surgery: implications for the academic health center.

    PubMed

    Pacella, Salvatore J; Comstock, Matthew C; Kuzon, William M

    2008-04-01

    The authors examined the economic patterns of outpatient aesthetic and reconstructive plastic surgical procedures performed within an academic health center. For fiscal years 2003 and 2004, the University of Michigan Health System's accounting database was queried to identify all outpatient plastic surgery cases (aesthetic and reconstructive) from four surgical facilities. Total facility charges, cost, revenue, and margin were calculated for each case. Contribution margin (total revenue minus variable direct cost) was compared with total case time to determine average contribution margin per operating suite case minute for subsets of aesthetic and reconstructive procedures. A total of 3603 cases (3457 reconstructive and 146 aesthetic) were identified. Payer mix included Blue Cross (36.7 percent), health maintenance organization (28.7 percent), other commercial payers (17.4 percent), Medicare/Medicaid (13.5 percent), and self-pay (3.7 percent). The most profitable cases were reconstructive laser procedures ($66.20; n = 361), scar revision ($36.01; n = 25), and facial trauma ($32.17; n = 64). The least profitable were hand arthroplasty ($13.93; n = 35), arthroscopy ($17.25; n = 15), and breast reduction ($17.46; n = 210). Aesthetic procedures (n = 144) yielded a significantly higher contribution margin per case minute ($24.21) compared with reconstructive procedures ($22.28; n = 3093) (p = 0.01). Plastic surgical cases performed at dedicated ambulatory surgery centers ($28.60; n = 1477) yielded significantly higher contribution margin per case minute compared with those performed at hospital-based facilities ($25.58; n = 2123) (p < 0.01). Use of standardized accounting (contribution margin per case minute) can be a strategically effective method for determining the most profitable and appropriate case mix. Within academic health centers, aesthetic surgery can be a profitable enterprise; dedicated ambulatory surgery centers yield higher profitability.
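    The study's profitability metric, contribution margin per case minute, is simply (revenue minus variable direct cost) divided by total case time. A minimal sketch with made-up figures (not values from the study):

```python
# Contribution margin per case minute: (revenue - variable direct cost)
# divided by total case time. The inputs below are hypothetical.

def cm_per_minute(revenue, variable_cost, case_minutes):
    return (revenue - variable_cost) / case_minutes

value = cm_per_minute(revenue=6000.0, variable_cost=3600.0, case_minutes=100.0)
# a $2400 contribution margin over a 100-minute case gives $24 per minute
```

    Normalizing by case time is what makes case types of very different lengths comparable when choosing a case mix.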

  8. Craniofacial Reconstruction by a Cost-Efficient Template-Based Process Using 3D Printing

    PubMed Central

    Beiglboeck, Fabian; Honigmann, Philipp; Jaquiéry, Claude; Thieringer, Florian

    2017-01-01

    Summary: Craniofacial defects often result in aesthetic and functional deficits, which affect the patient’s psyche and wellbeing. Patient-specific implants remain the optimal solution, but their use is limited or impractical due to their high costs. This article describes a fast and cost-efficient workflow of in-house manufactured patient-specific implants for craniofacial reconstruction and cranioplasty. As a proof of concept, we present a case of reconstruction of a craniofacial defect with involvement of the supraorbital rim. The following hybrid manufacturing process combines additive manufacturing with silicone molding and an intraoperative, manual fabrication process. A computer-aided design template is 3D printed from thermoplastics by a fused deposition modeling 3D printer and then silicone molded manually. After sterilization of the patient-specific mold, it is used intraoperatively to produce an implant from polymethylmethacrylate. Due to the combination of these 2 straightforward processes, the procedure can be kept very simple, and no advanced equipment is needed, resulting in minimal financial expenses. The whole fabrication of the mold is performed within approximately 2 hours depending on the template’s size and volume. This reliable technique is easy to adopt and suitable for every health facility, especially those with limited financial resources in less privileged countries, enabling many more patients to profit from patient-specific treatment. PMID:29263977

  9. Stereo Cameras for Clouds (STEREOCAM) Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romps, David; Oktem, Rusen

    2017-10-31

    The three pairs of stereo camera setups aim to provide synchronized, stereo-calibrated time series of images that can be used for 3D cloud mask reconstruction. Each camera pair is positioned at approximately 120 degrees from the other pairs, with a 17°-19° pitch angle from the ground, and at 5-6 km distance from the U.S. Department of Energy (DOE) Central Facility at the Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) observatory, covering the region from northeast, northwest, and southern views. Images from both cameras of the same stereo setup can be paired together to obtain a 3D reconstruction by triangulation. 3D reconstructions from the ring of three stereo pairs can be combined to generate a 3D mask from surrounding views. This handbook delivers all stereo reconstruction parameters of the cameras necessary to make 3D reconstructions from the stereo camera images.
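    For a rectified stereo pair, the triangulation step reduces to a one-line relation: depth is focal length times baseline divided by the disparity between matched pixels. The sketch below is an idealized illustration with invented numbers; the handbook's calibration parameters handle the general, non-rectified geometry.

```python
# Idealized triangulation for a rectified stereo pair (illustrative
# numbers only; real setups use the full calibrated geometry).

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth in metres from focal length (px), baseline (m), disparity (px)."""
    return f_px * baseline_m / disparity_px

# A distant cloud feature: long focal length, 0.5 m baseline, tiny disparity.
z = depth_from_disparity(f_px=2000.0, baseline_m=0.5, disparity_px=0.2)
# about 5 km, the scale of the camera-to-facility distances quoted above
```

    The relation also shows why distant clouds demand careful calibration: at these ranges the disparity is a small fraction of a pixel per metre of depth.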

  10. Reconstructing Historical VOC Concentrations in Drinking Water for Epidemiological Studies at a U.S. Military Base: Summary of Results

    PubMed Central

    Maslia, Morris L.; Aral, Mustafa M.; Ruckart, Perri Z.; Bove, Frank J.

    2017-01-01

    A U.S. government health agency conducted epidemiological studies to evaluate whether exposures to drinking water contaminated with volatile organic compounds (VOC) at U.S. Marine Corps Base Camp Lejeune, North Carolina, were associated with increased health risks to children and adults. These health studies required knowledge of contaminant concentrations in drinking water—at monthly intervals—delivered to family housing, barracks, and other facilities within the study area. Because concentration data were limited or unavailable during much of the period of contamination (1950s–1985), the historical reconstruction process was used to quantify estimates of monthly mean contaminant-specific concentrations. This paper integrates many efforts, reports, and papers into a synthesis of the overall approach to, and results from, a drinking-water historical reconstruction study. Results show that at the Tarawa Terrace water treatment plant (WTP) reconstructed (simulated) tetrachloroethylene (PCE) concentrations reached a maximum monthly average value of 183 micrograms per liter (μg/L) compared to a one-time maximum measured value of 215 μg/L and exceeded the U.S. Environmental Protection Agency’s current maximum contaminant level (MCL) of 5 μg/L during the period November 1957–February 1987. At the Hadnot Point WTP, reconstructed trichloroethylene (TCE) concentrations reached a maximum monthly average value of 783 μg/L compared to a one-time maximum measured value of 1400 μg/L during the period August 1953–December 1984. The Hadnot Point WTP also provided contaminated drinking water to the Holcomb Boulevard housing area continuously prior to June 1972, when the Holcomb Boulevard WTP came on line (maximum reconstructed TCE concentration of 32 μg/L) and intermittently during the period June 1972–February 1985 (maximum reconstructed TCE concentration of 66 μg/L). 
Applying the historical reconstruction process to quantify contaminant-specific monthly drinking-water concentrations is advantageous for epidemiological studies when compared to using the classical exposed versus unexposed approach. PMID:28868161
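    The reconstructed peak monthly means above can be put in context of the 5 μg/L maximum contaminant level with simple arithmetic on the reported values (our calculation, not the study's):

```python
# Expressing the reconstructed peak monthly mean concentrations as
# multiples of the EPA maximum contaminant level of 5 ug/L.

MCL_UG_L = 5.0
pce_peak_tarawa_terrace = 183.0   # max monthly mean PCE, ug/L
tce_peak_hadnot_point = 783.0     # max monthly mean TCE, ug/L

pce_factor = pce_peak_tarawa_terrace / MCL_UG_L   # ~37x the MCL
tce_factor = tce_peak_hadnot_point / MCL_UG_L     # ~157x the MCL
```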

  11. Diagnostic delay amongst tuberculosis patients in Jogjakarta Province, Indonesia is related to the quality of services in DOTS facilities.

    PubMed

    Ahmad, Riris Andono; Mahendradhata, Yodi; Utarini, Adi; de Vlas, Sake J

    2011-04-01

    To understand the determinants of care-seeking patterns and diagnostic delay amongst tuberculosis (TB) patients diagnosed at directly observed treatment, short-course (DOTS) facilities in Jogjakarta, Indonesia, a cross-sectional survey was conducted amongst newly diagnosed TB patients in 89 DOTS facilities. Each patient's history of care-seeking was reconstructed through retrospective interviews, recording socio-demographic determinants, the onset of TB symptoms, the types of health facility visited, and the duration of each care-seeking action. Two hundred and fifty-three TB patients were included in the study; their median patient delay was 1 week and their median total diagnostic delay was 5.4 weeks. The median number of visits was 4. Most of the patients' socio-demographic determinants were not associated with care-seeking patterns, and none were associated with the duration of diagnostic delay. More than 60% of TB patients started their care-seeking processes outside DOTS facilities, but the number of visits to DOTS facilities was greater over the course of care-seeking. Surprisingly, a patient's immediate visit to a DOTS facility did not correspond to a shorter diagnostic delay. Diagnostic delay in Jogjakarta province was not associated with patients' socio-demographic factors, but rather with the health system providing DOTS services. This suggests that strengthening the health system and improving diagnostic quality within DOTS services is now a more rational strategy than expanding the TB programme to engage more providers. © 2010 Blackwell Publishing Ltd.

  12. Techniques in helical scanning, dynamic imaging and image segmentation for improved quantitative analysis with X-ray micro-CT

    NASA Astrophysics Data System (ADS)

    Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim

    2014-04-01

    This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.
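    The helical scanning trajectories mentioned above follow a simple parametrization: the source circles the sample at radius R while the sample (or source) advances by one pitch P per full turn. A generic sketch (our own parameters, not the ANU instrument's):

```python
import math

# Generic helical source trajectory: radius R, table advance P per full
# turn, rotation angle theta. Parameters here are illustrative only.

def source_position(theta, R, P):
    """Source position (x, y, z) after rotation angle theta (radians)."""
    return (R * math.cos(theta),
            R * math.sin(theta),
            P * theta / (2.0 * math.pi))

x, y, z = source_position(4.0 * math.pi, R=100.0, P=25.0)  # two full turns
# back at the starting azimuth, advanced two pitches along the axis
```

    Theoretically exact reconstruction methods for this trajectory are what free the instrument to use the large cone angles (beyond 60°) the paper discusses.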

  13. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  14. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  15. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  16. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  17. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  18. Use of Barriers in Rural Open Road Conditions--A Synthesis Study

    DOT National Transportation Integrated Search

    2012-05-01

    The use of wide medians and clear zones that do not require median and roadside barriers is the current design practice for new and reconstructed rural highway facilities. Constructing or reconstructing roads with full-width medians and clear zon...

  19. Use of Barriers in Rural Open Road Condition--A Synthesis Study

    DOT National Transportation Integrated Search

    2012-05-01

    The use of wide medians and clear zones that do not require median and roadside barriers is the current design practice for new and reconstructed rural highway facilities. Constructing or reconstructing roads with full-width medians and clear zon...

  20. 75 FR 16072 - Fisheries Finance Program; Final Program Notice and Announcement of Availability of Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ...NMFS announces the availability of long-term direct loans made under the Fisheries Finance Program (FFP). The FFP provides financing for the purchase of used vessels or the reconstruction of vessels (limited to reconstructions that do not add to fishing capacity); refinancing for existing debt obligations; financing or refinancing fisheries shoreside facilities or aquacultural facilities; and the purchase or refinancing of Individual Fishing Quota (IFQ) in the North Pacific. FFP loans are not issued for purposes which could contribute to over-capitalization of the fishing industry.

  1. 40 CFR 63.9500 - What emission limitations must I meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials Manufacturing..., reconstructed, or existing large solvent mixer at your friction materials manufacturing facility, you must limit...) For each new, reconstructed, or existing small solvent mixer at your friction materials manufacturing...

  2. 40 CFR 63.9500 - What emission limitations must I meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials Manufacturing..., reconstructed, or existing large solvent mixer at your friction materials manufacturing facility, you must limit...) For each new, reconstructed, or existing small solvent mixer at your friction materials manufacturing...

  3. 40 CFR 63.9500 - What emission limitations must I meet?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials Manufacturing..., reconstructed, or existing large solvent mixer at your friction materials manufacturing facility, you must limit...) For each new, reconstructed, or existing small solvent mixer at your friction materials manufacturing...

  4. 40 CFR 63.9500 - What emission limitations must I meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials Manufacturing..., reconstructed, or existing large solvent mixer at your friction materials manufacturing facility, you must limit...) For each new, reconstructed, or existing small solvent mixer at your friction materials manufacturing...

  5. International aid and natural disasters: a pre- and post-earthquake longitudinal study of the healthcare infrastructure in Leogane, Haiti.

    PubMed

    Kligerman, Maxwell; Barry, Michele; Walmer, David; Bendavid, Eran

    2015-02-01

    The reconstruction of healthcare systems in developing countries after natural disasters is poorly understood. Using data collected before and after the 2010 Haiti earthquake, we detail the response of aid agencies and their interaction with local healthcare providers in Leogane, the city closest to the epicenter. We find that the period after the earthquake was associated with an increase in the total number of healthcare facilities, inpatient beds, and surgical facilities and that international aid has been a driving force behind this recovery. Aid has funded 12 of 13 new healthcare facilities that have opened since the earthquake as well as the reconstruction of 7 of 8 healthcare facilities that have been rebuilt. Despite increases in free, aid-financed healthcare, private Haitian healthcare facilities have remained at a constant number. The planned phase-out of several aid-financed facilities, however, will leave Leogane with fewer inpatient beds and healthcare services compared with the pre-earthquake period. © The American Society of Tropical Medicine and Hygiene.

  6. 40 CFR 63.8190 - What emission limitations must I meet?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Mercury Emissions From Mercury Cell... this section that applies to you. (1) New or reconstructed mercury cell chlor-alkali production facility. Emissions of mercury are prohibited from a new or reconstructed mercury cell chlor-alkali...

  7. 40 CFR 63.8190 - What emission limitations must I meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Mercury Emissions From Mercury Cell... this section that applies to you. (1) New or reconstructed mercury cell chlor-alkali production facility. Emissions of mercury are prohibited from a new or reconstructed mercury cell chlor-alkali...

  8. 40 CFR 63.8190 - What emission limitations must I meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Mercury Emissions From Mercury Cell... this section that applies to you. (1) New or reconstructed mercury cell chlor-alkali production facility. Emissions of mercury are prohibited from a new or reconstructed mercury cell chlor-alkali...

  9. 40 CFR 63.8190 - What emission limitations must I meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Mercury Emissions From Mercury Cell... this section that applies to you. (1) New or reconstructed mercury cell chlor-alkali production facility. Emissions of mercury are prohibited from a new or reconstructed mercury cell chlor-alkali...

  10. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device... regenerator under paragraph (b) of this section which commences construction, reconstruction, or modification...

  11. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device... regenerator under paragraph (b) of this section which commences construction, reconstruction, or modification...

  12. Cost-effectiveness analysis of the most common orthopaedic surgery procedures: knee arthroscopy and knee anterior cruciate ligament reconstruction.

    PubMed

    Lubowitz, James H; Appleby, David

    2011-10-01

    The purpose of this study was to determine the cost-effectiveness of knee arthroscopy and anterior cruciate ligament (ACL) reconstruction. Retrospective analysis of prospectively collected data from a single-surgeon, institutional review board-approved outcomes registry included 2 cohorts: surgically treated knee arthroscopy and ACL reconstruction patients. Our outcome measure is cost-effectiveness (cost of a quality-adjusted life-year [QALY]). The QALY is calculated by multiplying the difference in health-related quality of life before and after treatment by life expectancy. Health-related quality of life is measured by use of the Quality of Well-Being scale, which has been validated for cost-effectiveness analysis. Costs are facility charges adjusted by the facility cost-to-charge ratio, plus the surgeon fee. Sensitivity analyses are performed to determine the effect of variations in costs or outcomes. There were 93 knee arthroscopy and 35 ACL reconstruction patients included at a mean follow-up of 2.1 years. Cost per QALY was $5,783 for arthroscopy and $10,326 for ACL reconstruction (2009 US dollars). Sensitivity analysis shows that our results are robust (relatively insensitive) to variations in costs or outcomes. Knee arthroscopy and knee ACL reconstruction are very cost-effective. Copyright © 2011 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
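The cost-per-QALY arithmetic described in this record reduces to a one-line formula. The sketch below is illustrative only: the inputs are placeholder values chosen to reproduce the reported $5,783 figure, not the study's raw data, and the function name is invented.

```python
def cost_per_qaly(cost, qwb_before, qwb_after, life_expectancy_years):
    """Cost per quality-adjusted life-year (QALY).

    QALYs gained = (gain in Quality of Well-Being score) * life expectancy.
    """
    qalys_gained = (qwb_after - qwb_before) * life_expectancy_years
    return cost / qalys_gained

# Placeholder inputs: a 0.05 QWB gain over a 40-year life expectancy is
# 2.0 QALYs; an $11,566 treatment cost then works out to $5,783 per QALY.
print(round(cost_per_qaly(11566, 0.70, 0.75, 40), 2))  # → 5783.0
```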

  13. CHARACTERIZATION OF EXPOSURES TO WORKERS COVERED UNDER THE U.S. ENERGY EMPLOYEES COMPENSATION ACT

    PubMed Central

    Neton, James W.

    2015-01-01

    Since the mid-1940s, hundreds of thousands of workers have been engaged in nuclear weapons-related activities for the U.S. Department of Energy (DOE) and its predecessor agencies. In 2000, Congress promulgated the Energy Employees Occupational Illness Compensation Program Act of 2000 (EEOICPA), which provides monetary compensation and medical benefits to certain energy employees who have developed cancer. Under Part B of EEOICPA, the National Institute for Occupational Safety and Health (NIOSH) is required to estimate radiation doses for those workers who have filed a claim, or whose survivors have filed a claim, under Part B of the Act. To date, over 39,000 dose reconstructions have been completed for workers from more than 200 facilities. These reconstructions have included assessment of both internal and external exposure at all major DOE facilities, as well as at a large number of private companies [known as Atomic Weapons Employer (AWE) facilities in the Act] that engaged in contract work for the DOE and its predecessor agencies. To complete these dose reconstructions, NIOSH has captured and reviewed thousands of historical documents related to site operations and worker/workplace monitoring practices at these facilities. Using the data collected and reviewed pursuant to NIOSH’s role under EEOICPA, this presentation will characterize historical internal and external exposures received by workers at DOE and AWE facilities. To the extent possible, use will be made of facility specific coworker models to highlight changes in exposure patterns over time. In addition, the effects that these exposures have on compensation rates for workers are discussed. PMID:24378500

  14. 18 CFR 153.13 - Emergency reconstruction.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... loss of gas supply or capacity are applicable to facilities subject to section 3 of the Natural Gas Act... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Emergency reconstruction. 153.13 Section 153.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY...

  15. 40 CFR 63.1162 - Monitoring requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Hydrochloric Acid Regeneration Plants § 63.1162 Monitoring requirements. (a) The owner or operator of a new, reconstructed, or existing steel pickling facility or acid regeneration plant subject to this subpart shall: (1... Administrator. (b) The owner or operator of a new, reconstructed, or existing acid regeneration plant subject to...

  16. 40 CFR 63.1162 - Monitoring requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Hydrochloric Acid Regeneration Plants § 63.1162 Monitoring requirements. (a) The owner or operator of a new, reconstructed, or existing steel pickling facility or acid regeneration plant subject to this subpart shall: (1... Administrator. (b) The owner or operator of a new, reconstructed, or existing acid regeneration plant subject to...

  17. 40 CFR 63.1162 - Monitoring requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Hydrochloric Acid Regeneration Plants § 63.1162 Monitoring requirements. (a) The owner or operator of a new, reconstructed, or existing steel pickling facility or acid regeneration plant subject to this subpart shall: (1... Administrator. (b) The owner or operator of a new, reconstructed, or existing acid regeneration plant subject to...

  18. 40 CFR 63.1162 - Monitoring requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Hydrochloric Acid Regeneration Plants § 63.1162 Monitoring requirements. (a) The owner or operator of a new, reconstructed, or existing steel pickling facility or acid regeneration plant subject to this subpart shall: (1... Administrator. (b) The owner or operator of a new, reconstructed, or existing acid regeneration plant subject to...

  19. 40 CFR 63.1162 - Monitoring requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Hydrochloric Acid Regeneration Plants § 63.1162 Monitoring requirements. (a) The owner or operator of a new, reconstructed, or existing steel pickling facility or acid regeneration plant subject to this subpart shall: (1... Administrator. (b) The owner or operator of a new, reconstructed, or existing acid regeneration plant subject to...

  20. Characterization of exposures to workers covered under the U.S. Energy Employees Compensation Act.

    PubMed

    Neton, James W

    2014-02-01

    Since the mid-1940s, hundreds of thousands of workers have been engaged in nuclear weapons-related activities for the U.S. Department of Energy (DOE) and its predecessor agencies. In 2000, Congress promulgated the Energy Employees Occupational Illness Compensation Program Act of 2000 (EEOICPA), which provides monetary compensation and medical benefits to certain energy employees who have developed cancer. Under Part B of EEOICPA, the National Institute for Occupational Safety and Health (NIOSH) is required to estimate radiation doses for those workers who have filed a claim, or whose survivors have filed a claim, under Part B of the Act. To date, over 39,000 dose reconstructions have been completed for workers from more than 200 facilities. These reconstructions have included assessment of both internal and external exposure at all major DOE facilities, as well as at a large number of private companies [known as Atomic Weapons Employer (AWE) facilities in the Act] that engaged in contract work for the DOE and its predecessor agencies. To complete these dose reconstructions, NIOSH has captured and reviewed thousands of historical documents related to site operations and worker/workplace monitoring practices at these facilities. Using the data collected and reviewed pursuant to NIOSH's role under EEOICPA, this presentation will characterize historical internal and external exposures received by workers at DOE and AWE facilities. To the extent possible, use will be made of facility specific coworker models to highlight changes in exposure patterns over time. In addition, the effects that these exposures have on compensation rates for workers are discussed. Introduction of Characterization of Exposures to Workers (Video 1:59, http://links.lww.com/HP/A3).

  1. Shortening and Angulation for Soft-Tissue Reconstruction of Extremity Wounds in a Combat Support Hospital

    DTIC Science & Technology

    2009-08-01

    MILITARY MEDICINE, 174, 8:838, 2009. Shortening and Angulation for Soft-Tissue Reconstruction of Extremity Wounds in a Combat Support...team in theater. Thereafter, they can be rapidly evacuated to treatment facilities in their respective countries for definitive reconstruction of... The manuscript was received for review in November 2008; the revised manuscript was accepted for publication in May 2009.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.

  3. The Generic Data Capture Facility

    NASA Technical Reports Server (NTRS)

    Connell, Edward B.; Barnes, William P.; Stallings, William H.

    1987-01-01

    The Generic Data Capture Facility, which can provide data capture support for a variety of different types of spacecraft while enabling operations costs to be carefully controlled, is discussed. The data capture functions, data protection, isolation of users from data acquisition problems, data reconstruction, and quality and accounting are addressed. The TDM and packet data formats utilized by the system are described, and the development of generic facilities is considered.

  4. Rebuilding Schools after the Wenchuan Earthquake: China Visits OECD, Italy and Turkey

    ERIC Educational Resources Information Center

    CELE Exchange, 2009

    2009-01-01

    As the reconstruction efforts continue in China in the wake of the Great Wenchuan earthquake in May 2008, the China Development Research Foundation, with the support of the OECD Centre for Effective Learning Environments, organised an International Training Programme on the Post-Earthquake Reconstruction of Public Facilities from 1 to 11 December…

  5. 40 CFR 63.9495 - When do I have to comply with this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... October 18, 2005. (b) If you have a new or reconstructed solvent mixer and its initial startup date is... initial startup. (c) If your friction materials manufacturing facility is an area source that increases... reconstructed sources upon startup or no later than October 18, 2002, whichever is later. (2) For any portion of...

  6. 77 FR 6681 - Approval and Promulgation of State Plans for Designated Facilities and Pollutants; State of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... tons per day of municipal solid waste (MSW). This action corrects an error in the regulatory language... per day of municipal solid waste (MSW), and for which construction, reconstruction, or modification... Municipal Waste Combustor (LMWC) Emissions From Existing Facilities; Correction AGENCY: Environmental...

  7. 40 CFR 60.750 - Applicability, designation of affected facility, and delegation of authority.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards of Performance for Municipal Solid Waste Landfills § 60.750 Applicability, designation of affected facility, and delegation of authority. (a) The provisions of this subpart apply to each municipal solid waste landfill that commenced construction, reconstruction or modification on or after May 30, 1991...

  8. 40 CFR 60.750 - Applicability, designation of affected facility, and delegation of authority.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards of Performance for Municipal Solid Waste Landfills § 60.750 Applicability, designation of affected facility, and delegation of authority. (a) The provisions of this subpart apply to each municipal solid waste landfill that commenced construction, reconstruction or modification on or after May 30, 1991...

  9. Myths and realities about the recovery of L'Aquila after the earthquake

    PubMed Central

    Contreras, Diana; Blaschke, Thomas; Kienberger, Stefan; Zeil, Peter

    2014-01-01

    There is a set of myths which are linked to the recovery of L'Aquila, such as: the L'Aquila recovery has come to a halt, it is still in an early recovery phase, and there is economic stagnation. The objective of this paper is threefold: (a) to identify and develop a set of spatial indicators for the case of L'Aquila, (b) to test the feasibility of a numerical assessment of these spatial indicators as a method to monitor the progress of a recovery process after an earthquake and (c) to answer the question whether the recovery process in L'Aquila stagnates or not. We hypothesize that after an earthquake the spatial distribution of expert-defined variables can constitute an index to assess the recovery process more objectively. In this article, we aggregated several indicators of building conditions to characterize the physical dimension, and we developed building use indicators to serve as proxies for the socio-economic dimension while aiming for transferability of this approach. The methodology of this research entailed six steps: (1) fieldwork, (2) selection of a sampling area, (3) selection of the variables and indicators for the physical and socio-economic dimensions, (4) analyses of the recovery progress using spatial indicators by comparing the changes in the restricted core area as well as building use over time; (5) selection and integration of the results through expert weighting; and (6) determining hotspots of recovery in L'Aquila. Eight categories of building conditions and twelve categories of building use were identified. Both indicators, building condition and building use, are aggregated into a recovery index. The reconstruction process in the city center of L'Aquila seems to stagnate, which is reflected by the following five variables: percentage of buildings with on-going reconstruction, partial reconstruction, reconstruction projected, residential building use, and transport facilities. These five factors were still at low levels within the core area in 2012. Nevertheless, we can conclude that the recovery process in L'Aquila did not come to a halt but is still ongoing, albeit slowly. PMID:26779431

  10. An improved schlieren method for measurement and automatic reconstruction of the far-field focal spot

    PubMed Central

    Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye

    2017-01-01

    The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency when the final focal spot is merged manually, thereby reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct the high dynamic-range image of far-field focal spots and improve the reconstruction accuracy and efficiency. First, a detection method based on weak light beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template matching algorithm, a circle the same size as the schlieren ball was dug from the main lobe cutting image and used to change the relative region of the main lobe cutting image within a 100×100 pixel region. The position that had the largest correlation coefficient between the side lobe cutting image and the main lobe cutting image when a circle was dug was identified as the best matching point. Finally, the least squares method was used to fit the center of the side lobe schlieren small ball, and the error was less than 1 pixel. The experimental results show that this method enables the accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than traditional reconstruction methods based on manual splicing, this method is less sensitive to the efficiency of focal-spot reconstruction and thus offers better experimental precision. PMID:28207758
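The final step described above, least-squares fitting of the center of the schlieren ball, can be sketched with the standard algebraic (Kåsa) circle fit. This is a generic sketch under the assumption of noisy points sampled from a circle, not the authors' implementation; the function name is hypothetical.

```python
import numpy as np

def fit_circle_least_squares(x, y):
    """Algebraic (Kasa) least-squares circle fit.

    Solves  x^2 + y^2 = 2*a*x + 2*b*y + c  for the center (a, b) and
    radius r = sqrt(c + a^2 + b^2) in the least-squares sense.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Noisy points on a circle centered at (3, -2) with radius 5
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
x = 3 + 5 * np.cos(t) + rng.normal(scale=0.01, size=t.size)
y = -2 + 5 * np.sin(t) + rng.normal(scale=0.01, size=t.size)
print(fit_circle_least_squares(x, y))  # ≈ (3, -2, 5)
```

The Kåsa fit is linear and has no iteration, which keeps the per-image cost low; its sub-pixel accuracy on well-sampled circles is consistent with the sub-pixel error reported in the abstract.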

  11. 40 CFR Table 2 to Subpart Ooo - Stack Emission Limits for Affected Facilities With Capture Systems

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 60.671) that commenced construction, modification, or reconstruction after August 31, 1983 but before April 22, 2008 0.05 g/dscm (0.022 gr/dscf) a 7 percent for dry control devices b An initial performance....670 and 60.671) that commence construction, modification, or reconstruction on or after April 22, 2008...

  12. 42 CFR 83.13 - How will NIOSH evaluate petitions, other than petitions by claimants covered under § 83.14?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES... of employment at the DOE facility or AWE facility; (6) NIOSH records from epidemiological research on... research, dose reconstructions, medical screening programs, and other related activities conducted to...

  13. 42 CFR 83.13 - How will NIOSH evaluate petitions, other than petitions by claimants covered under § 83.14?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES... of employment at the DOE facility or AWE facility; (6) NIOSH records from epidemiological research on... research, dose reconstructions, medical screening programs, and other related activities conducted to...

  14. 42 CFR 83.13 - How will NIOSH evaluate petitions, other than petitions by claimants covered under § 83.14?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES... of employment at the DOE facility or AWE facility; (6) NIOSH records from epidemiological research on... research, dose reconstructions, medical screening programs, and other related activities conducted to...

  15. 42 CFR 83.13 - How will NIOSH evaluate petitions, other than petitions by claimants covered under § 83.14?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES... of employment at the DOE facility or AWE facility; (6) NIOSH records from epidemiological research on... research, dose reconstructions, medical screening programs, and other related activities conducted to...

  16. Synthetic fiber production facilities: Background information for proposed standards

    NASA Astrophysics Data System (ADS)

    Goodwin, D. R.

    1982-10-01

    Standards of performance to control emissions of volatile organic compounds (VOC) from new, modified, and reconstructed synthetic fiber production facilities are being proposed under section III of the Clean Air Act. This document contains information on the background and authority, regulatory alternatives considered, and environmental and economic impacts of the regulatory alternatives.

  17. Scaling up close-range surveys, a challenge for the generalization of as-built data in industrial applications

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.

    2014-06-01

    As-built CAD data reconstructed from Terrestrial Laser Scanner (TLS) data have been used for more than two decades by Electricité de France (EDF) to prepare maintenance operations in its facilities. But today, the big picture is renewed: "as-built virtual reality" must address a huge scale-up to provide data to an increasing number of applications. In this paper, we first present a wide multi-sensor, multi-purpose scanning campaign performed in a 10-floor building of a power plant in 2013: 1083 TLS stations (about 40×10^9 3D points referenced under a 2 cm tolerance) and 1025 RGB panoramic images (340×10^6 pixels per point of view). As expected, this very large survey of high-precision measurements in a complex environment stressed sensors and tools that were developed for more favourable conditions and smaller data sets. The whole survey process (tools and methods used from acquisition and processing to CAD reconstruction) underwent a detailed follow-up in order to identify the obstacles to generalizing the approach to other buildings. Based on this recent feedback, we highlight some of the current bottlenecks in this paper: sensor denoising, automation of processing, improvements to data-validation tools, and standardization of formats and (meta-)data structures.

  18. Petroleum and hazardous material releases from industrial facilities associated with Hurricane Katrina.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Sengul, Hatice

    2010-04-01

    Hurricane Katrina struck an area dense with industry, causing numerous releases of petroleum and hazardous materials. This study integrates information from a number of sources to describe the frequency, causes, and effects of these releases in order to inform analysis of risk from future hurricanes. Over 200 onshore releases of hazardous chemicals, petroleum, or natural gas were reported. Storm surge was responsible for the majority of petroleum releases and failure of storage tanks was the most common mechanism of release. Of the smaller number of hazardous chemical releases reported, many were associated with flaring from plant startup, shutdown, or process upset. In areas impacted by storm surge, 10% of the facilities within the Risk Management Plan (RMP) and Toxic Release Inventory (TRI) databases and 28% of SIC 1311 facilities experienced accidental releases. In areas subject only to hurricane strength winds, a lower fraction (1% of RMP and TRI and 10% of SIC 1311 facilities) experienced a release while 1% of all facility types reported a release in areas that experienced tropical storm strength winds. Of industrial facilities surveyed, more experienced indirect disruptions such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations or reconstruction (55%), than experienced releases. To reduce the risk of hazardous material releases and speed the return to normal operations under these difficult conditions, greater attention should be devoted to risk-based facility design and improved prevention and response planning.

  19. TomoBank: a tomographic data repository for computational x-ray science

    DOE PAGES

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; ...

    2018-02-08

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.
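Simulated phantom datasets of the kind such a repository hosts are valuable because their ground truth is known exactly. As a toy illustration of that idea (not tomoBank's actual data format; the function name is invented), the parallel-beam sinogram of a centered uniform disk is known in closed form and is identical at every projection angle:

```python
import numpy as np

def disk_sinogram(radius, n_angles=180, n_det=256, fov=1.0):
    """Analytic parallel-beam sinogram of a uniform disk centered at the origin.

    Each row is one projection angle; the line integral through a disk of
    the given radius at detector offset s is 2*sqrt(radius^2 - s^2).
    For a centered disk every row is identical (rotation invariance),
    a handy sanity check for reconstruction codes.
    """
    s = np.linspace(-fov, fov, n_det)            # detector coordinates
    profile = 2.0 * np.sqrt(np.clip(radius**2 - s**2, 0.0, None))
    return np.tile(profile, (n_angles, 1))       # identical at every angle

sino = disk_sinogram(0.5)
print(sino.shape)  # → (180, 256)
```

A reconstruction code run on this sinogram should recover a uniform disk of radius 0.5, giving a simple pass/fail validation without any instrument data.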

  20. Local reconstruction in computed tomography of diffraction enhanced imaging

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Feng; Zhang, Li; Kang, Ke-Jun; Chen, Zhi-Qiang; Zhu, Pei-Ping; Yuan, Qing-Xi; Huang, Wan-Xia

    2007-07-01

    Computed tomography of diffraction enhanced imaging (DEI-CT) based on a synchrotron radiation source has extremely high sensitivity for weakly absorbing low-Z samples in medical and biological fields. The authors propose a modified backprojection filtration (BPF)-type algorithm based on PI-line segments to reconstruct a region of interest from truncated refraction-angle projection data in DEI-CT. The distribution of the refractive index decrement in the sample can be directly estimated from its reconstruction images, which has been proved by experiments at the Beijing Synchrotron Radiation Facility. The algorithm paves the way for local reconstruction of large-size samples by the use of DEI-CT with a small field of view based on a synchrotron radiation source.

  1. Reconstruction of bar {p}p events in PANDA

    NASA Astrophysics Data System (ADS)

    Spataro, S.

    2012-08-01

    The PANDA experiment will study anti-proton proton and anti-proton nucleus collisions in the HESR complex of the FAIR facility, in a beam momentum range from 2 GeV/c up to 15 GeV/c. In preparation for the experiment, a software framework based on ROOT (PandaRoot) is being developed for the simulation, reconstruction and analysis of physics events, also running on a GRID infrastructure. Detailed geometry descriptions and different realistic reconstruction algorithms are implemented, currently used for the realization of the Technical Design Reports. The contribution will report on the reconstruction capabilities of the Panda spectrometer, focusing mainly on the performance of the tracking system and the results of the analysis of physics benchmark channels.

  2. Unlocking the Mystery of Columbia's Tragic Accident Through Materials Characterization

    NASA Technical Reports Server (NTRS)

    Shah, Sandeep; Jerman, Gregory; Coston, James

    2003-01-01

    The wing and underbelly reconstruction of Space Shuttle Columbia took place at the Shuttle Landing Facility Hangar after the accident which destroyed STS-107. Fragments were placed on a grid according to their original location on the orbiter. Some Reinforced Carbon-Carbon (RCC) panels of the left wing leading edge and other parts from both leading edges were recovered and incorporated into the reconstruction. The recovered parts were tracked in a database by number and also on a map of the orbiter. This viewgraph presentation describes the process of failure analysis undertaken by the Materials and Processes (M&P) Problem Resolution Team. The team started with factual observations about the accident and identified the highest-level questions it needed to answer in order to understand where on the orbiter failure occurred, what component(s) failed, and what the sequence of events was. The recovery of Columbia's MADS/OEX data recorder shifted the focus of the team's analysis to the left wing leading edge damage. The team paid particular attention to slag deposits on some of the RCC panels. The presentation lists analysis techniques and lower-level questions for the team to answer.

  3. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  4. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  5. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  6. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  7. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  8. Little Boy replication: justification and construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malenfant, R.E.

    A reconstruction of the Little Boy weapon allowed experiments to evaluate yield, leakage measurements for comparison with calculations, and phenomenological measurements to evaluate various in-situ dosimeters. The reconstructed weapon was operated at sustained delayed critical at the Los Alamos Critical Assembly Facility. The present experiments provide a wealth of information to benchmark calculations and demonstrate that the 1965 measurements on the Ichiban assembly (a spherical mockup of Little Boy) were in error.

  9. 20 CFR 30.318 - Can the FAB consider objections to HHS's reconstruction of a radiation dose or to the guidelines...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... reconstruction of a radiation dose or to the guidelines OWCP uses to determine if a claimed cancer was at least... if a claimed cancer was at least as likely as not related to employment? (a) If the claimant objects... if a claimed cancer was at least as likely as not related to employment at a DOE facility, an atomic...

  10. 20 CFR 30.318 - Can the FAB consider objections to HHS's reconstruction of a radiation dose or to the guidelines...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... reconstruction of a radiation dose or to the guidelines OWCP uses to determine if a claimed cancer was at least... if a claimed cancer was at least as likely as not related to employment? (a) If the claimant objects... if a claimed cancer was at least as likely as not related to employment at a DOE facility, an atomic...

  11. 20 CFR 30.318 - Can the FAB consider objections to HHS's reconstruction of a radiation dose or to the guidelines...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... reconstruction of a radiation dose or to the guidelines OWCP uses to determine if a claimed cancer was at least... if a claimed cancer was at least as likely as not related to employment? (a) If the claimant objects... if a claimed cancer was at least as likely as not related to employment at a DOE facility, an atomic...

  12. 20 CFR 30.318 - Can the FAB consider objections to HHS's reconstruction of a radiation dose or to the guidelines...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... reconstruction of a radiation dose or to the guidelines OWCP uses to determine if a claimed cancer was at least... if a claimed cancer was at least as likely as not related to employment? (a) If the claimant objects... if a claimed cancer was at least as likely as not related to employment at a DOE facility, an atomic...

  13. 20 CFR 30.318 - Can the FAB consider objections to HHS's reconstruction of a radiation dose or to the guidelines...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... reconstruction of a radiation dose or to the guidelines OWCP uses to determine if a claimed cancer was at least... if a claimed cancer was at least as likely as not related to employment? (a) If the claimant objects... if a claimed cancer was at least as likely as not related to employment at a DOE facility, an atomic...

  14. Model-based multi-fringe interferometry using Zernike polynomials

    NASA Astrophysics Data System (ADS)

    Gu, Wei; Song, Weihong; Wu, Gaofeng; Quan, Haiyang; Wu, Yongqian; Zhao, Wenchuan

    2018-06-01

    In this paper, a general phase retrieval method is proposed, which is based on a single interferogram with a small number of fringes (either tilt or power). Zernike polynomials are used to characterize the phase to be measured; the phase distribution is reconstructed by a non-linear least squares method. Experiments show that the proposed method can obtain satisfactory results compared to the standard phase-shifting interferometry technique. Additionally, the retrace errors of the proposed method can be neglected because of the few fringes; it does not need any auxiliary phase-shifting facilities (low cost), and it is easy to implement without the process of phase unwrapping.
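
    A minimal sketch of the polynomial parameterization at the heart of this approach: the phase map is written as a sum of low-order Zernike-like modes and the coefficients are recovered by least squares. The actual method fits the interferogram intensity with a non-linear solver; the four-mode basis and the coefficient values below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: represent a phase map as a sum of low-order Zernike-like
# terms (piston, tilt-x, tilt-y, defocus) and recover the coefficients by
# linear least squares on sampled phase values.

def basis(x, y):
    r2 = x**2 + y**2
    # columns: piston, tilt x, tilt y, defocus (2r^2 - 1)
    return np.stack([np.ones_like(x), x, y, 2 * r2 - 1], axis=-1)

# synthetic "true" phase on a unit-disk grid
n = 64
xv, yv = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
mask = xv**2 + yv**2 <= 1.0
A = basis(xv[mask], yv[mask])             # design matrix, one row per pixel
c_true = np.array([0.1, 0.5, -0.3, 0.2])  # known coefficients
phase = A @ c_true

c_fit, *_ = np.linalg.lstsq(A, phase, rcond=None)
print(np.allclose(c_fit, c_true))  # noise-free fit recovers the coefficients
```

    In the real method the same parameterization is kept, but the residual being minimized is the interferogram intensity model, which makes the problem non-linear in the coefficients.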

  15. Breaking the Mold.

    ERIC Educational Resources Information Center

    Huckabee, Christopher

    2003-01-01

    Using the example of a Texas elementary school, describes how to eliminate mold and mildew from school facilities, including discovering the problem, responding quickly, reconstructing the area, and crisis planning and prevention. (EV)

  16. IRVE-II Post-Flight Trajectory Reconstruction

    NASA Technical Reports Server (NTRS)

    O'Keefe, Stephen A.; Bose, David M.

    2010-01-01

    NASA's Inflatable Re-entry Vehicle Experiment (IRVE) II successfully demonstrated an inflatable aerodynamic decelerator after being launched aboard a sounding rocket from Wallops Flight Facility (WFF). Preliminary day-of-flight data compared well with pre-flight Monte Carlo analysis, and a more complete trajectory reconstruction performed with an Extended Kalman Filter (EKF) approach followed. The reconstructed trajectory and comparisons to an attitude solution provided by NASA Sounding Rocket Operations Contract (NSROC) personnel at WFF are presented. Additional comparisons are made between the reconstructed trajectory and pre- and post-flight Monte Carlo trajectory predictions. Alternative observations of the trajectory are summarized which leverage flight accelerometer measurements, the pre-flight aerodynamic database, and on-board flight video. Finally, analysis of the payload separation and aeroshell deployment events is presented. The flight trajectory is reconstructed to fidelity sufficient to assess overall project objectives related to flight dynamics; overall, IRVE-II flight dynamics are in line with expectations.

  17. Little Boy replication: justification and construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malenfant, R.E.

    A reconstruction of the Little Boy weapon allowed experiments to evaluate yield, leakage measurements for comparison with calculations, and phenomenological measurements to evaluate various in-situ dosimeters. The reconstructed weapon was operated at sustained delayed critical at the Los Alamos Critical Assembly Facility. The present experiments provide a wealth of information to benchmark calculations and demonstrate that the 1965 measurements on the Ichiban assembly (a spherical mockup of Little Boy) were in error. 5 references, 2 figures.

  18. NCDOT : bridge policy

    DOT National Transportation Integrated Search

    1994-11-01

    NCDOT's Bridge Policy establishes controlling design elements for new and reconstructed bridges on the state road system. It includes information to address sidewalks and bicycle facilities on bridges, including minimum handrail heights and sidewal...

  19. Non-rigid Reconstruction of Casting Process with Temperature Feature

    NASA Astrophysics Data System (ADS)

    Lin, Jinhua; Wang, Yanjie; Li, Xin; Wang, Ying; Wang, Lu

    2017-09-01

    Off-line reconstruction of rigid scenes has made great progress in the past decade. However, on-line reconstruction of non-rigid scenes is still a very challenging task. The casting process is a non-rigid reconstruction problem: it is a highly dynamic molding process lacking geometric features. In order to reconstruct the casting process robustly, an on-line fusion strategy is proposed for dynamic reconstruction of the casting process. Firstly, the geometric and flow features of the casting are parameterized as a TSDF (truncated signed distance field), a volumetric block; the parameterized casting guarantees real-time tracking and optimal deformation of the casting process. Secondly, the data structure of the volume grid is extended with a temperature value, and a temperature interpolation function is built to generate the temperature of each voxel. This data structure allows dynamic tracking of the casting's temperature during deformation stages. Then, sparse RGB features are extracted from the casting scene to search for correspondences between the geometric representation and the depth constraint. The extracted color data guarantees robust tracking of the flowing motion of the casting. Finally, the optimal deformation of the target space is cast as a nonlinear regularized variational optimization problem. This optimization step achieves smooth and optimal deformation of the casting process. The experimental results show that the proposed method can reconstruct the casting process robustly and reduce drift in the process of non-rigid reconstruction.
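
    The extended volume grid described above can be sketched as a voxel array carrying a truncated signed distance, a fusion weight, and a temperature value. The field names and the simple weighted-average fusion rule below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hedged sketch: each voxel stores a truncated signed distance (tsdf),
# a fusion weight, and a temperature, so depth and temperature readings
# can be fused into the same grid cell over time.

voxel_dtype = np.dtype([("tsdf", "f4"), ("weight", "f4"), ("temp", "f4")])

def make_grid(n):
    grid = np.zeros((n, n, n), dtype=voxel_dtype)
    grid["tsdf"] = 1.0  # initialize to "far outside" the truncation band
    return grid

def fuse(grid, idx, sdf, temp, w=1.0, trunc=0.1):
    """Fuse one depth/temperature observation into voxel `idx`."""
    d = float(np.clip(sdf / trunc, -1.0, 1.0))  # truncate the signed distance
    v = grid[idx]                               # structured scalar view
    w_new = v["weight"] + w
    tsdf_new = (v["tsdf"] * v["weight"] + d * w) / w_new
    temp_new = (v["temp"] * v["weight"] + temp * w) / w_new
    v["tsdf"], v["temp"], v["weight"] = tsdf_new, temp_new, w_new

grid = make_grid(8)
fuse(grid, (4, 4, 4), sdf=0.02, temp=700.0)
fuse(grid, (4, 4, 4), sdf=0.04, temp=650.0)
print(float(grid[4, 4, 4]["temp"]))  # running weighted average of the readings
```

    A per-voxel temperature fused this way is what a trilinear interpolation function, as mentioned in the abstract, would then sample between grid cells.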

  20. Study of GLAO-corrected PSF evolution for the MUSE Wide Field Mode. Expected performance and requirements for PSF reconstruction

    NASA Astrophysics Data System (ADS)

    Fusco, T.; Villecroze, R.; Jarno, A.; Bacon, R.

    2011-09-01

    The second-generation instrument MUSE for the VLT has been designed to profit from the ESO Adaptive Optics Facility (AOF). The two Adaptive Optics (AO) modes (GLAO in Wide Field Mode [WFM] and LTAO in Narrow Field Mode [NFM]) will be used. To achieve its key science goals, MUSE will require information on the full-system (atmosphere, AO, telescope, and instrument) image quality and its variation with field position and wavelength. For example, optimal summation of a large number of deep-field exposures in WFM will require a good knowledge of the PSF. In this paper, we present an exhaustive analysis of the MUSE Wide Field Mode PSF evolution, both spatially and spectrally. For that purpose we have coupled a complete AO simulation tool developed at ONERA with the MUSE instrumental PSF simulation. The relative impact of atmospheric and system parameters (seeing, Cn^2, LGS and NGS positions, etc.) with respect to differential MUSE aberrations per channel (i.e., slicer and IFU) is analysed. The results allow us (in close collaboration with astronomers) to define pertinent parameters (fit parameters using a Moffat function) for a PSF reconstruction process (estimation of these parameters using GLAO telemetry) and to propose an efficient and robust algorithm to be implemented in the MUSE pipeline. The extension of the spatial and spectral PSF analysis to the NFM case is discussed and preliminary results are given. Some specific requirements for the generalisation of the GLAO PSF reconstruction process to the LTAO case are derived from these early results.
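
    The Moffat function mentioned as the PSF fit parameterization has a convenient closed form. A minimal sketch with illustrative parameter values (not MUSE results): I(r) = I0 * (1 + (r/alpha)^2)^(-beta), whose FWHM follows analytically from alpha and beta.

```python
import numpy as np

# Hedged sketch: evaluate a Moffat radial profile and its closed-form FWHM,
# then verify that the intensity at r = FWHM/2 is half the peak.

def moffat(r, i0, alpha, beta):
    return i0 * (1.0 + (r / alpha) ** 2) ** (-beta)

def moffat_fwhm(alpha, beta):
    return 2.0 * alpha * np.sqrt(2.0 ** (1.0 / beta) - 1.0)

alpha, beta = 1.2, 2.5          # illustrative values, not fitted MUSE PSFs
fwhm = moffat_fwhm(alpha, beta)
half = moffat(fwhm / 2.0, 1.0, alpha, beta)
print(np.isclose(half, 0.5))    # half maximum reached exactly at FWHM/2
```

    In a PSF reconstruction pipeline, alpha and beta (and hence the FWHM) would be the quantities estimated per field position and wavelength from AO telemetry.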

  1. Propeller flap reconstruction of abdominal defects: review of the literature and case report.

    PubMed

    Scaglioni, Mario F; Giuseppe, Alberto Di; Chang, Edward I

    2015-01-01

    The abdominal wall is perfused anteriorly by the superior and deep epigastric vessels with a smaller contribution from the superficial system. The lateral abdominal wall is perfused predominantly from perforators arising from the intercostal vessels. Reconstruction of soft tissue defects involving the abdomen presents a difficult challenge for reconstructive surgeons. Pedicle perforator propeller flaps can be used to reconstruct defects of the abdomen, and here we present a thorough review of the literature as well as a case illustrating the perforasome propeller flap concept. A patient underwent resection for dermatofibrosarcoma protuberans resulting in a large defect of the epigastric soft tissue. A propeller flap was designed based on a perforator arising from the superior deep epigastric vessels and was rotated 90° into the defect allowing primary closure of the donor site. The patient healed uneventfully and was without recurrent disease 37 months following reconstruction. Perforator propeller flaps can be used successfully in reconstruction of abdominal defects and should be incorporated into the armamentarium of reconstructive microsurgeons already facile with perforator dissections.

  2. Integrating dynamic and distributed compressive sensing techniques to enhance image quality of the compressive line sensing system for unmanned aerial vehicles application

    NASA Astrophysics Data System (ADS)

    Ouyang, Bing; Hou, Weilin; Caimi, Frank M.; Dalgleish, Fraser R.; Vuorenkoski, Anni K.; Gong, Cuiling

    2017-07-01

    The compressive line sensing imaging system adopts distributed compressive sensing (CS) to acquire data and reconstruct images. Dynamic CS uses Bayesian inference to capture the correlated nature of the adjacent lines. An image reconstruction technique that incorporates dynamic CS in the distributed CS framework was developed to improve the quality of reconstructed images. The effectiveness of the technique was validated using experimental data acquired in an underwater imaging test facility. Results that demonstrate contrast and resolution improvements will be presented. The improved efficiency is desirable for unmanned aerial vehicles conducting long-duration missions.

  3. 40 CFR Table 4 to Subpart Ggg of... - [Reserved

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... FOR DESIGNATED FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May...

  4. Reconstruction dynamics of recorded holograms in photochromic glass.

    PubMed

    Mihailescu, Mona; Pavel, Eugen; Nicolae, Vasile B

    2011-06-20

    We have investigated the dynamics of the record-erase process of holograms in photochromic glass using continuous-wave Nd:YVO₄ laser radiation (λ=532 nm). A bidimensional microgrid pattern was formed and visualized in photochromic glass, and its diffraction efficiency decay versus time (during the reconstruction step) gave us information (D, Δn) about the diffusion process inside the material. The recording and reconstruction processes were carried out in an off-axis setup, and the images of the reconstructed object were recorded by a CCD camera. Measurements realized on reconstructed object images using holograms recorded at different incident laser powers have shown a two-stage process involved in silver atom kinetics.

  5. 75 FR 68200 - Medical Devices; Radiology Devices; Reclassification of Full-Field Digital Mammography System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... exposure control, image processing and reconstruction programs, patient and equipment supports, component..., acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and... may include was revised by adding automatic exposure control, image processing and reconstruction...

  6. High-rise construction in historical cities through the example of Saint Petersburg

    NASA Astrophysics Data System (ADS)

    Granstrem, Maria; Zolotareva, Milena; Slavina, Tatyana

    2018-03-01

    The article sets forth results of the landscape-visual analysis of the interaction of high-rise construction facilities with the environment of historical urban spaces. A conflicting relationship between high-rise construction facilities and the established urban landscape was analyzed and recorded. One of the latest stages of the reconstruction of historical cities, which spread through many European countries at the end of the 20th century, also started at the beginning of the 21st century in Russia, where the reconstruction of historical facilities and territories became one of the leading trends of architectural activity. Therefore, problems of the interaction between the old city and new high-rise construction near historical centers are extremely relevant for Russian architects. Specific features of Russian high-rise construction within the visual borders of historical cities, developed at the turn of the 20th-21st centuries, repeat past urban-planning mistakes spread across Europe in the second half of the 20th century. High-rise construction in close proximity to the historical centers of cities violates the established scale and destroys the historical city silhouette.

  7. Breast reconstruction after mastectomy at a comprehensive cancer center.

    PubMed

    Connors, Shahnjayla K; Goodman, Melody S; Myckatyn, Terence; Margenthaler, Julie; Gehlert, Sarah

    2016-01-01

    Breast reconstruction after mastectomy is an integral part of breast cancer treatment that positively impacts quality of life in breast cancer survivors. Although breast reconstruction rates have increased over time, African American women remain less likely to receive breast reconstruction compared to Caucasian women. National Cancer Institute-designated Comprehensive Cancer Centers, specialized institutions with more standardized models of cancer treatment, report higher breast reconstruction rates than primary healthcare facilities. Whether breast reconstruction disparities are reduced for women treated at comprehensive cancer centers is unclear. The purpose of this study was to further investigate breast reconstruction rates and determinants at a comprehensive cancer center in St. Louis, Missouri. Sociodemographic and clinical data were obtained for women who received mastectomy for definitive surgical treatment for breast cancer between 2000 and 2012. Logistic regression was used to identify factors associated with the receipt of breast reconstruction. We found a breast reconstruction rate of 54 % for the study sample. Women who were aged 55 and older, had public insurance, received unilateral mastectomy, and received adjuvant radiation therapy were significantly less likely to receive breast reconstruction. African American women were 30 % less likely to receive breast reconstruction than Caucasian women. These findings suggest that racial disparities in breast reconstruction persist in comprehensive cancer centers. Future research should further delineate the determinants of breast reconstruction disparities across various types of healthcare institutions. Only then can we develop interventions to ensure all eligible women have access to breast reconstruction and the improved quality of life it affords breast cancer survivors.

  8. Bayesian Abel Inversion in Quantitative X-Ray Radiography

    DOE PAGES

    Howard, Marylesa; Fowler, Michael; Luttman, Aaron; ...

    2016-05-19

    A common image formation process in high-energy X-ray radiography is to have a pulsed power source that emits X-rays through a scene, a scintillator that absorbs X-rays and fluoresces in the visible spectrum in response to the absorbed photons, and a CCD camera that images the visible light emitted from the scintillator. The intensity image is related to areal density, and, for an object that is radially symmetric about a central axis, the Abel transform then gives the object's volumetric density. Two of the primary drawbacks to classical variational methods for Abel inversion are their sensitivity to the type and scale of regularization chosen and the lack of natural methods for quantifying the uncertainties associated with the reconstructions. In this work we cast the Abel inversion problem within a statistical framework in order to compute volumetric object densities from X-ray radiographs and to quantify uncertainties in the reconstruction. A hierarchical Bayesian model is developed with a likelihood based on a Gaussian noise model and with priors placed on the unknown density profile, the data precision matrix, and two scale parameters. This allows the data to drive the localization of features in the reconstruction and results in a joint posterior distribution for the unknown density profile, the prior parameters, and the spatial structure of the precision matrix. Results of the density reconstructions and pointwise uncertainty estimates are presented for both synthetic signals and real data from a U.S. Department of Energy X-ray imaging facility.
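
    The deterministic core of Abel inversion can be sketched with the classic "onion peeling" discretization: a linear operator maps shell densities of a radially symmetric object to line-of-sight areal densities, and inverting it recovers the density. The hierarchical Bayesian priors and uncertainty quantification of the paper are omitted; this only shows the forward model being inverted.

```python
import numpy as np

# Hedged sketch: build the onion-peeling forward matrix A, where A[j, i]
# is the chord length of ray j (offset y[j]) through concentric shell i,
# then invert a noise-free synthetic projection.

n = 20
edges = np.linspace(0.0, 1.0, n + 1)       # shell boundaries
y = 0.5 * (edges[:-1] + edges[1:])         # chord offsets, one ray per shell

A = np.zeros((n, n))
for j in range(n):
    for i in range(n):
        ro, ri = edges[i + 1], edges[i]
        if ro > y[j]:
            A[j, i] = 2.0 * (np.sqrt(ro**2 - y[j]**2)
                             - np.sqrt(max(ri**2 - y[j]**2, 0.0)))

rho_true = np.exp(-4.0 * y**2)             # radially symmetric density
proj = A @ rho_true                        # synthetic areal densities
rho_rec = np.linalg.solve(A, proj)         # noise-free inversion
print(np.allclose(rho_rec, rho_true))
```

    With noisy data this direct solve becomes ill-conditioned, which is exactly where the regularization sensitivity criticized above, and the Bayesian alternative proposed by the paper, enter.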

  9. Three-dimensional reconstruction of indoor whole elements based on mobile LiDAR point cloud data

    NASA Astrophysics Data System (ADS)

    Gong, Yuejian; Mao, Wenbo; Bi, Jiantao; Ji, Wei; He, Zhanjun

    2014-11-01

    Ground-based LiDAR is one of the most effective city modeling tools at present and has been widely used for three-dimensional reconstruction of outdoor objects. However, for indoor objects there are some technical bottlenecks due to the lack of GPS signal. In this paper, based on high-precision indoor point cloud data obtained by advanced indoor mobile LiDAR measuring equipment, high-precision models were built for all indoor ancillary facilities. The point cloud data we employed also contain a color feature, which is extracted by fusion with CCD images. Thus, the data have both spatial geometric features and spectral information, which can be used for constructing objects' surfaces and restoring the color and texture of the geometric model. Based on the Autodesk CAD platform and with the help of the PointSence plug-in, three-dimensional reconstruction of indoor whole elements was realized. Specifically, Pointools Edit Pro was adopted to edit the point cloud, and different types of indoor point cloud data were processed, including data format conversion, outline extraction, and texture mapping of the point cloud model. Finally, three-dimensional visualization of the real-world indoor scene was completed. Experimental results showed that high-precision 3D point cloud data obtained by indoor mobile measuring equipment can be used for 3D reconstruction of indoor whole elements and that the methods proposed in this paper can efficiently realize this reconstruction. Moreover, the modeling precision could be controlled within 5 cm, which proved to be a satisfactory result.

  10. An improved ring removal procedure for in-line x-ray phase contrast tomography

    NASA Astrophysics Data System (ADS)

    Massimi, Lorenzo; Brun, Francesco; Fratini, Michela; Bukreeva, Inna; Cedola, Alessia

    2018-02-01

    The suppression of ring artifacts in x-ray computed tomography (CT) is a required step in practical applications; it can be addressed by introducing refined digital low pass filters within the reconstruction process. However, these filters may introduce additional ringing artifacts when simultaneously imaging pure phase objects and elements having a non-negligible absorption coefficient. Ringing originates at sharp interfaces, due to the truncation of spatial high frequencies, and severely affects qualitative and quantitative analysis of the reconstructed slices. In this work, we discuss the causes of ringing artifacts, and present a general compensation procedure to account for it. The proposed procedure has been tested with CT datasets of the mouse central nervous system acquired at different synchrotron radiation facilities. The results demonstrate that the proposed method compensates for ringing artifacts induced by low pass ring removal filters. The effectiveness of the ring suppression filters is not altered; the proposed method can thus be considered as a framework to improve the ring removal step, regardless of the specific filter adopted or the imaged sample.
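
    As a toy illustration of the class of low pass ring removal filters discussed above (not the authors' compensation procedure): ring artifacts in the reconstructed slice correspond to constant vertical stripes in the sinogram, so one common filter smooths the per-detector mean profile and subtracts the residual stripe pattern.

```python
import numpy as np

# Hedged sketch of sinogram-domain stripe removal: low-pass filter the
# detector-wise mean and subtract the high-frequency (stripe) residual.

def remove_stripes(sino, kernel=5):
    col = sino.mean(axis=0)                    # per-detector mean profile
    k = np.ones(kernel) / kernel
    smooth = np.convolve(col, k, mode="same")  # low pass along detector axis
    return sino - (col - smooth)               # subtract stripe residual

angles, dets = 90, 64
rng = np.random.default_rng(0)
sino = rng.normal(1.0, 0.01, (angles, dets))
sino[:, 20] += 0.5                             # defective detector -> stripe
clean = remove_stripes(sino)
# the stripe's deviation from the background is strongly suppressed
print(abs(clean[:, 20].mean() - clean.mean())
      < abs(sino[:, 20].mean() - sino.mean()))
```

    It is precisely the truncation of high spatial frequencies by filters of this kind that, at sharp absorbing interfaces, produces the ringing the paper sets out to compensate.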

  11. Reconstructing the Auditory Apparatus of Therapsids by Means of Neutron Tomography

    NASA Astrophysics Data System (ADS)

    Laaß, Michael; Schillinger, Burkhard

    The internal cranial structure of mammalian ancestors, i.e. the therapsids or 'mammal-like reptiles', is crucial for understanding early mammalian evolution. In the past, therapsid skulls were investigated by mechanical sectioning or serial grinding, which was a very time-consuming and destructive process and could only be applied to non-valuable or poorly preserved specimens. As most therapsid skulls are embedded in terrestrial iron-rich sediments of Late Permian or Triassic age, i.e. so-called 'Red beds', a successful investigation with X-rays is often not possible. We successfully investigated therapsid skulls by means of neutron tomography at the ANTARES facility at FRM II in Munich using cold neutron radiation. This kind of radiation is able to penetrate iron-rich substances in the range between 5 and 15 cm and produces a good contrast between matrix and bones, which enables segmentation of internal cranial structures such as bones, cavities, and canals of nerves and blood vessels. In particular, neutron tomography combined with methods of 3D modeling was used here for the investigation and reconstruction of the auditory apparatus of therapsids.

  12. Pointing History Engine for the Spitzer Space Telescope

    NASA Technical Reports Server (NTRS)

    Bayard, David; Ahmed, Asif; Brugarolas, Paul

    2007-01-01

    The Pointing History Engine (PHE) is a computer program that provides the mathematical transformations needed to reconstruct, from downlinked telemetry data, the attitude of the Spitzer Space Telescope (formerly known as the Space Infrared Telescope Facility) as a function of time. The PHE also serves as an example for the development of similar pointing reconstruction software for future space telescopes. The transformations implemented in the PHE take account of the unique geometry of the Spitzer telescope-pointing chain, including all data on relative alignments of components and all information available from attitude-determination instruments. The PHE makes it possible to coordinate attitude data with observational data acquired at the same time, so that any observed astronomical object can be located for future reference and re-observation. The PHE is implemented as a subroutine used in conjunction with telemetry-formatting services of the Mission Image Processing Laboratory of NASA's Jet Propulsion Laboratory to generate the Boresight Pointing History File (BPHF). The BPHF is an archival database designed to serve as Spitzer's primary astronomical reference documenting where the telescope was pointed at any time during its mission.
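
    A hedged sketch of the kind of transformation a pointing history engine chains together: rotating a boresight vector by an attitude quaternion. The actual PHE composes many alignment and attitude transforms; this shows a single quaternion rotation step with illustrative values.

```python
import numpy as np

# Hedged sketch: rotate a vector v by a unit quaternion q = (w, x, y, z)
# using the identity v' = v + 2*u x (u x v + w*v), with u = (x, y, z).

def quat_rotate(q, v):
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# 90-degree rotation about the z axis applied to the x axis
theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
v = np.array([1.0, 0.0, 0.0])
print(np.allclose(quat_rotate(q, v), [0.0, 1.0, 0.0]))
```

    Composing such rotations for each alignment in the pointing chain, per telemetry timestamp, is what turns raw attitude telemetry into a boresight pointing history.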

  13. An improved ring removal procedure for in-line x-ray phase contrast tomography.

    PubMed

    Massimi, Lorenzo; Brun, Francesco; Fratini, Michela; Bukreeva, Inna; Cedola, Alessia

    2018-02-12

    The suppression of ring artifacts in x-ray computed tomography (CT) is a required step in practical applications; it can be addressed by introducing refined digital low pass filters within the reconstruction process. However, these filters may introduce additional ringing artifacts when simultaneously imaging pure phase objects and elements having a non-negligible absorption coefficient. Ringing originates at sharp interfaces, due to the truncation of spatial high frequencies, and severely affects qualitative and quantitative analysis of the reconstructed slices. In this work, we discuss the causes of ringing artifacts, and present a general compensation procedure to account for it. The proposed procedure has been tested with CT datasets of the mouse central nervous system acquired at different synchrotron radiation facilities. The results demonstrate that the proposed method compensates for ringing artifacts induced by low pass ring removal filters. The effectiveness of the ring suppression filters is not altered; the proposed method can thus be considered as a framework to improve the ring removal step, regardless of the specific filter adopted or the imaged sample.

  14. Emittance measurements in low energy ion storage rings

    NASA Astrophysics Data System (ADS)

    Hunt, J. R.; Carli, C.; Resta-López, J.; Welsch, C. P.

    2018-07-01

    The development of the next generation of ultra-low energy antiproton and ion facilities requires precise information about the beam emittance to guarantee optimum performance. In the Extra-Low ENergy Antiproton storage ring (ELENA) the transverse emittances will be measured by scraping. However, this diagnostic measurement faces several challenges: non-zero dispersion, non-Gaussian beam distributions due to effects of the electron cooler and various systematic errors such as closed orbit offsets and inaccurate rms momentum spread estimation. In addition, diffusion processes, such as intra-beam scattering might lead to emittance overestimates. Here, we present algorithms to efficiently address the emittance reconstruction in presence of the above effects, and present simulation results for the case of ELENA.
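
    The quantity these scraping measurements aim to recover is the statistical (rms) emittance, which can be sketched directly from particle coordinates. A minimal illustration assuming an uncorrelated Gaussian beam; the dispersion correction and scraping geometry discussed above are omitted.

```python
import numpy as np

# Hedged sketch: rms emittance eps = sqrt(<x^2><x'^2> - <x x'>^2)
# computed from a sampled particle distribution.

def rms_emittance(x, xp):
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(x.var() * xp.var() - np.mean(x * xp) ** 2)

rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(0.0, 1e-3, n)    # position spread: 1 mm rms
xp = rng.normal(0.0, 1e-4, n)   # uncorrelated angle spread: 0.1 mrad rms
eps = rms_emittance(x, xp)      # expect about 1e-7 m*rad
print(abs(eps - 1e-7) / 1e-7 < 0.02)
```

    Non-Gaussian tails from electron cooling and diffusion processes such as intra-beam scattering distort exactly these second moments, which is why the paper's reconstruction algorithms must go beyond this simple estimator.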

  15. 36 CFR 223.111 - Administration of contracts in designated disaster areas.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... road or facility, the United States shall bear such increased construction cost if, as determined by... damages are so great that restoration, reconstruction, or construction is not practical under the cost...

  16. Understanding reconstructed Dante spectra using high resolution spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M. J., E-mail: may13@llnl.gov; Widmann, K.; Kemp, G. E.

    2016-11-15

    The Dante is an 18-channel filtered diode array used at the National Ignition Facility (NIF) to measure the spectrally and temporally resolved radiation flux between 50 eV and 20 keV from various targets. The absolute flux is determined from the radiometric calibration of the x-ray diodes, filters, and mirrors and a reconstruction algorithm applied to the recorded voltages from each channel. The reconstructed spectra are very low resolution, with features consistent with the instrument response and not necessarily consistent with the spectral emission features from the plasma. Errors may exist between the reconstructed spectra and the actual emission features due to assumptions in the algorithm. Recently, a high-resolution convex crystal spectrometer, VIRGIL, has been installed at NIF with the same line of sight as the Dante. Spectra from L-shell Ag and Xe have been recorded by both VIRGIL and Dante. Comparisons of these two spectroscopic measurements yield insights into the accuracy of the Dante reconstructions.

  17. Kuwaiti reconstruction project unprecedented in size, complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tippee, B.

    1993-03-15

    There had been no challenge like it: a desert emirate ablaze; its main city sacked; the economically crucial oil industry devastated; countryside shrouded in smoke from oil well fires and littered with unexploded ordnance, disabled military equipment, and unignited crude oil. Like the well-documented effort that brought 749 burning wells under control in less than 7 months, Kuwaiti reconstruction had no precedent. Unlike the firefight, reconstruction is nowhere complete. It nevertheless has placed two of three refineries back on stream, restored oil production to preinvasion levels, and repaired or rebuilt 17 of 26 oil field gathering stations. Most of the progress has come since the last well fire went out on Nov. 6, 1991. Expatriates in Kuwait since the days of Al-Awda ('the return' in Arabic) attribute much of the rapid progress under Al-Tameer ('the reconstruction') to decisions and preparations made while the well fires still raged. The article describes the planning for Al-Awda, reentering the country, drilling plans, facilities reconstruction, and special problems.

  18. Real-time implementing wavefront reconstruction for adaptive optics

    NASA Astrophysics Data System (ADS)

    Wang, Caixia; Li, Mei; Wang, Chunhong; Zhou, Luchun; Jiang, Wenhan

    2004-12-01

    The capability of real-time wavefront reconstruction is important for an adaptive optics (AO) system. The bandwidth of the system and the real-time processing ability of the wavefront processor are mainly determined by the speed of calculation. The system requires a sufficient number of subapertures and a high sampling frequency to compensate for atmospheric turbulence, and the number of reconstruction operations increases accordingly. Since the performance of an AO system improves as calculation latency decreases, it is necessary to study how to increase the speed of wavefront reconstruction. There are two ways to improve real-time performance: one is to transform the wavefront reconstruction matrix, for example by wavelet or FFT methods; the other is to enhance the performance of the processing element. Analysis shows that the former method cuts latency at the cost of reconstruction precision, so the latter method is adopted in this article. Based on the characteristics of the wavefront reconstruction algorithm, a systolic array implemented in an FPGA is designed for real-time wavefront reconstruction. The system delay is greatly reduced through pipelining and parallel processing; the minimum reconstruction latency is the reconstruction calculation for a single subaperture.
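    The reconstruction step described above is, at its core, a matrix-vector product, which is what makes a pipelined, subaperture-by-subaperture systolic evaluation natural. A minimal NumPy sketch of the idea (my own illustration with hypothetical dimensions; the paper's FPGA design is not reproduced here):

```python
import numpy as np

# Zonal wavefront reconstruction reduces to phi_hat = R @ s, where R is a
# precomputed reconstructor and s the measured slopes. A systolic array
# pipelines this product one slope (subaperture measurement) at a time.
rng = np.random.default_rng(0)

n_phase = 16    # hypothetical number of phase points
n_slopes = 32   # hypothetical number of slope measurements

G = rng.standard_normal((n_slopes, n_phase))  # interaction (geometry) matrix
R = np.linalg.pinv(G)                         # precomputed reconstructor

phi_true = rng.standard_normal(n_phase)       # "true" wavefront
s = G @ phi_true                              # noise-free slope measurements

# Full-matrix reconstruction in one shot ...
phi_hat = R @ s

# ... and the same result accumulated subaperture by subaperture,
# mimicking the pipelined evaluation order of a systolic array.
phi_acc = np.zeros(n_phase)
for j in range(n_slopes):
    phi_acc += R[:, j] * s[j]

assert np.allclose(phi_hat, phi_true)
assert np.allclose(phi_acc, phi_hat)
```

    Because each slope's contribution is independent, the accumulation can begin as soon as the first subaperture is read out, which is why the minimum latency is a single subaperture's worth of computation.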

  19. Computed tomography of x-ray images using neural networks

    NASA Astrophysics Data System (ADS)

    Allred, Lloyd G.; Jones, Martin H.; Sheats, Matthew J.; Davis, Anthony W.

    2000-03-01

    Traditional CT reconstruction is done using the technique of Filtered Backprojection (FB). While this technique is widely employed in industrial and medical applications, it is not generally understood that FB has a fundamental flaw: the Gibbs phenomenon states that any Fourier reconstruction will produce errors in the vicinity of all discontinuities, and that the error will equal 28 percent of the discontinuity. A number of years back, one of the authors proposed a biological perception model whereby biological neural networks perceive 3D images from stereo vision. The perception model posits an internal hard-wired neural network which emulates the external physical process: erroneous, unknown internal values are used to generate an emulated signal, which is compared to externally sensed data to produce an error signal. Feedback from the error signal is then used to update the erroneous internal values, and the process is repeated until the error signal no longer decreases. It was soon realized that the same method could be used to obtain CT from x-rays without having to do Fourier transforms. Neural networks have the additional potential for handling non-linearities and missing data. The technique has been applied to some coral images collected at the Los Alamos high-energy x-ray facility. The initial images show considerable promise, in some instances showing more detail than the FB images obtained from the same data. Although routine production using this new method would require a massively parallel computer, the method shows promise, especially where refined detail is required.
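    The emulate-compare-feedback loop described above can be sketched in a few lines. The following NumPy toy is my own construction, not the authors' network: a small linear forward model stands in for the emulated physical process, and the error signal repeatedly corrects the internal values (here for a fixed number of iterations rather than an explicit stopping test):

```python
import numpy as np

# Toy feedback reconstruction: internal values x are corrected by the error
# between the emulated signal A @ x and the sensed data y. No Fourier
# transform is involved, so no Gibbs overshoot is introduced.
rng = np.random.default_rng(1)

A = rng.standard_normal((40, 16))     # hypothetical forward (projection) model
x_true = rng.standard_normal(16)      # unknown internal values
y = A @ x_true                        # sensed (projection) data

x = np.zeros(16)                      # start from erroneous internal values
step = 1.0 / np.linalg.norm(A, 2)**2  # small gain keeps the loop stable

for _ in range(3000):
    error = y - A @ x                 # emulated signal vs. sensed data
    x = x + step * (A.T @ error)      # feed the error back into the estimate

assert np.allclose(x, x_true, atol=1e-6)
```

    With a nonlinear forward model the same loop structure still applies, which is the flexibility the abstract attributes to the neural approach.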

  20. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson, and blind deconvolution, this approach is brand new. In this method, the coded-aperture processing is, for the first time, independent of the point spread function of the imaging diagnostic system; in this way, the technical obstacles in traditional coded-pinhole image processing caused by the uncertainty of that point spread function are overcome. Following the theoretical study, simulations of penumbral imaging and image reconstruction were carried out and provided fairly good results. In a visible-light experiment, a point source was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and penumbral images were recorded with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was applied and yielded a fairly good reconstruction result.

  1. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital Tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition, so it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed in which the reconstruction processes are accelerated by a graphics processing unit (GPU). DTS application requires two reconstruction and two registration processes, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage produces two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the actual patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS: the target shifts along the lateral and longitudinal axes are obtained from the match in the coronal view, while the target shifts along the longitudinal and vertical axes are obtained from the match in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of the software tool was performed, covering geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRRs/DTS generated by the GPU-based and CPU-based algorithms is 0.99. Based on measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm along all three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm along all three axes. Compared with the CPU-based algorithms, the performance of the DRR and DTS reconstructions is improved by a factor of 15 to 20. In summary, a GPU-based software tool was developed for DTS-based patient positioning in radiotherapy. Its geometric and registration accuracy meets the clinical requirements for patient setup, and the high performance of the DRR and DTS reconstruction algorithms was achieved through GPU-based computation. It is a useful software tool for researchers and clinicians evaluating DTS for patient positioning in radiotherapy.
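    The matching step that extracts per-axis target shifts from a pair of DTS views can be illustrated with integer-pixel phase correlation. This is an assumption on my part (the abstract does not name its registration algorithm), and the images are synthetic stand-ins:

```python
import numpy as np

def estimate_shift(ref, mov):
    """Integer-pixel (row, col) translation of `mov` relative to `ref`,
    estimated by phase correlation (normalized cross-power spectrum)."""
    F = np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peaks in the upper half of each axis to negative shifts
    return tuple(int(p) - n if p > n // 2 else int(p)
                 for p, n in zip(peak, corr.shape))

rng = np.random.default_rng(2)
rdts = rng.random((64, 64))                  # stand-in coronal RDTS slice
odts = np.roll(rdts, (3, -5), axis=(0, 1))   # "patient" shifted by (3, -5) px

assert estimate_shift(rdts, odts) == (3, -5)
```

    Run once on the coronal view and once on the sagittal view, such a matcher yields the lateral/longitudinal and longitudinal/vertical couch corrections described above.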

  2. Three-dimensional reconstruction of neutron, gamma-ray, and x-ray sources using spherical harmonic decomposition

    NASA Astrophysics Data System (ADS)

    Volegov, P. L.; Danly, C. R.; Fittinghoff, D.; Geppert-Kleinrath, V.; Grim, G.; Merrill, F. E.; Wilde, C. H.

    2017-11-01

    Neutron, gamma-ray, and x-ray imaging are important diagnostic tools at the National Ignition Facility (NIF) for measuring the two-dimensional (2D) size and shape of the neutron-producing region, for probing the remaining ablator, and for measuring the extent of the DT plasma during the stagnation phase of Inertial Confinement Fusion implosions. Due to the difficulty and expense of building these imagers, at most only a few 2D projection images will be available to reconstruct the three-dimensional (3D) sources. In this paper, we present a technique that has been developed for the 3D reconstruction of neutron, gamma-ray, and x-ray sources from a minimal number of 2D projections using spherical harmonic decomposition. We present the detailed algorithms used for this characterization and the results of reconstructed sources from experimental neutron and x-ray data collected at OMEGA and NIF.
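    For an axisymmetric source, the spherical-harmonic description reduces to Legendre modes of the boundary, r(θ) = Σ_l a_l P_l(cos θ). The sketch below is my own reduced 1D illustration of that decomposition with hypothetical numbers, not the authors' algorithm: it recovers the mode coefficients of a synthetic contour via the orthogonality of the Legendre polynomials.

```python
import numpy as np
from numpy.polynomial import legendre

def trapezoid(y, x):
    # simple trapezoidal rule (avoids version-dependent numpy helpers)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def legendre_modes(mu, r, lmax):
    """Project r(mu) onto P_l(mu) using orthogonality:
    a_l = (2l+1)/2 * integral_{-1}^{1} r(mu) P_l(mu) dmu."""
    coeffs = []
    for l in range(lmax + 1):
        c = np.zeros(l + 1)
        c[l] = 1.0                       # select the single polynomial P_l
        Pl = legendre.legval(mu, c)
        coeffs.append((2 * l + 1) / 2.0 * trapezoid(r * Pl, mu))
    return np.array(coeffs)

mu = np.linspace(-1.0, 1.0, 20001)                        # mu = cos(theta)
r = 50.0 + 5.0 * legendre.legval(mu, [0.0, 0.0, 1.0])     # P0 = 50, P2 = 5

a = legendre_modes(mu, r, 4)
assert np.allclose(a, [50.0, 0.0, 5.0, 0.0, 0.0], atol=1e-3)
```

    The full 3D problem replaces P_l(cos θ) with spherical harmonics Y_lm and fits the coefficients jointly to the several available 2D projections.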

  3. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  4. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  5. 77 FR 36460 - Endangered and Threatened Wildlife and Plants; Removing the Magazine Mountain Shagreen From the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-19

    ... geographic range of Magazine Mountain shagreen that may affect or benefit the species. (5) The draft post... picnic facilities on the north slopes, additional hiking trails, and a reconstructed homestead. However...

  6. 23 CFR 771.117 - Categorical exclusions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... management systems, electronic payment equipment, automatic vehicle locaters, automated passenger counters..., reconstruction, adding shoulders, or adding auxiliary lanes (e.g., parking, weaving, turning, climbing). (2... fringe parking facilities. (5) Construction of new truck weigh stations or rest areas. (6) Approvals for...

  7. Performance of a liquid argon time projection chamber exposed to the CERN West Area Neutrino Facility neutrino beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arneodo, F.; Cavanna, F.; Mitri, I. De

    2006-12-01

    We present the results of the first exposure of a Liquid Argon TPC to a multi-GeV neutrino beam. The data have been collected with a 50-liter ICARUS-like chamber located between the CHORUS and NOMAD experiments at the CERN West Area Neutrino Facility (WANF). We discuss both the instrumental performance of the detector and its capability to identify and reconstruct low-multiplicity neutrino interactions.

  8. sTools - a data reduction pipeline for the GREGOR Fabry-Pérot Interferometer and the High-resolution Fast Imager at the GREGOR solar telescope

    NASA Astrophysics Data System (ADS)

    Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.

    2017-10-01

    A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated, as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.

  9. ESTIMATION OF INTERNAL EXPOSURE TO URANIUM WITH UNCERTAINTY FROM URINALYSIS DATA USING THE InDEP COMPUTER CODE

    PubMed Central

    Anderson, Jeri L.; Apostoaei, A. Iulian; Thomas, Brian A.

    2015-01-01

    The National Institute for Occupational Safety and Health (NIOSH) is currently studying mortality in a cohort of 6409 workers at a former uranium processing facility. As part of this study, over 220,000 urine samples were used to reconstruct organ doses due to internal exposure to uranium. Most of the available computational programs designed for analysis of bioassay data handle a single case at a time, and thus require a significant outlay of time and resources for the exposure assessment of a large cohort. NIOSH is currently supporting the development of a computer program, InDEP (Internal Dose Evaluation Program), to facilitate internal radiation exposure assessment as part of epidemiological studies of both uranium- and plutonium-exposed cohorts. A novel feature of InDEP is its batch-processing capability, which allows for the evaluation of multiple study subjects simultaneously. InDEP analyses bioassay data and derives intakes and organ doses with uncertainty estimates using least-squares regression techniques or Bayes' theorem as applied to internal dosimetry (the Bayesian method). This paper describes the application of the current version of InDEP to formulate assumptions about the characteristics of exposure at the study facility that were used in a detailed retrospective intake and organ dose assessment of the cohort. PMID:22683620
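    The least-squares branch of such an intake assessment has a simple closed form once the excretion function is known. The sketch below is a deliberately simplified toy (the single-exponential excretion curve and all numbers are hypothetical; InDEP's biokinetic models and uncertainty treatment are far more detailed):

```python
import numpy as np

# With a known excretion function e(t) per unit intake, the bioassay model
# is y_i = I * e(t_i) + noise, and the least-squares intake estimate is
# I_hat = sum(e * y) / sum(e^2).
rng = np.random.default_rng(3)

def e(t):                       # hypothetical per-unit-intake excretion curve
    return 0.1 * np.exp(-0.05 * t)

t = np.array([10.0, 30.0, 60.0, 120.0, 250.0, 400.0])  # days after intake
I_true = 500.0                                         # true intake (Bq)
y = I_true * e(t) + rng.normal(0.0, 0.2, t.size)       # noisy urine results

I_hat = np.sum(e(t) * y) / np.sum(e(t) ** 2)           # closed-form estimate
assert abs(I_hat - I_true) / I_true < 0.05
```

    Batch processing a cohort then amounts to applying this fit (or its Bayesian counterpart) to each worker's bioassay series in turn.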

  10. Neutron Radiography and Computed Tomography at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raine, Dudley A. III; Hubbard, Camden R.; Whaley, Paul M.

    1997-12-31

    The capability to perform neutron radiography and computed tomography is being developed at Oak Ridge National Laboratory. The facility will be located at the High Flux Isotope Reactor (HFIR), which has the highest steady state neutron flux of any reactor in the world. The Monte Carlo N-Particle transport code (MCNP), versions 4A and 4B, has been used extensively in the design phase of the facility to predict and optimize the operating characteristics, and to ensure the safety of personnel working in and around the blockhouse. Neutrons are quite penetrating in most engineering materials and can be useful to detect internal flaws and features. Hydrogen atoms, such as in a hydrocarbon fuel, lubricant or a metal hydride, are relatively opaque to neutron transmission. Thus, neutron based tomography or radiography is ideal to image their presence. The source flux also provides unparalleled flexibility for future upgrades, including real time radiography where dynamic processes can be observed. A novel tomography detector has been designed using optical fibers and digital technology to provide a large dynamic range for reconstructions. Film radiography is also available for high resolution imaging applications. This paper summarizes the results of the design phase of this facility and the potential benefits to science and industry.

  11. First results from the commissioning of the BGO-OD experiment at ELSA

    NASA Astrophysics Data System (ADS)

    Bella, Andreas

    2014-11-01

    The BGO-OD experiment at the ELSA accelerator facility in Bonn combines a highly segmented BGO calorimeter with a particle-tracking magnetic spectrometer at forward angles. An extensive physics program using an energy-tagged bremsstrahlung photon beam is planned. The commissioning phase of the experiment was recently completed, although enhancements to the experiment are still in development. Recent results from the analysis of the commissioning data are presented, including particle track reconstruction in the forward spectrometer and momentum reconstruction with the BGO calorimeter.

  12. Neutron Tomography of a Fuel Cell: Statistical Learning Implementation of a Penalized Likelihood Method

    NASA Astrophysics Data System (ADS)

    Coakley, Kevin J.; Vecchia, Dominic F.; Hussey, Daniel S.; Jacobson, David L.

    2013-10-01

    At the NIST Neutron Imaging Facility, we collect neutron projection data for both the dry and wet states of a Proton-Exchange-Membrane (PEM) fuel cell. Transmitted thermal neutrons captured in a scintillator doped with lithium-6 produce scintillation light that is detected by an amorphous silicon detector. Based on joint analysis of the dry and wet state projection data, we reconstruct a residual neutron attenuation image with a Penalized Likelihood method with an edge-preserving Huber penalty function that has two parameters that control how well jumps in the reconstruction are preserved and how well noisy fluctuations are smoothed out. The choice of these parameters greatly influences the resulting reconstruction. We present a data-driven method that objectively selects these parameters, and study its performance for both simulated and experimental data. Before reconstruction, we transform the projection data so that the variance-to-mean ratio is approximately one. For both simulated and measured projection data, the Penalized Likelihood method reconstruction is visually sharper than a reconstruction yielded by a standard Filtered Back Projection method. In an idealized simulation experiment, we demonstrate that the cross validation procedure selects regularization parameters that yield a reconstruction that is nearly optimal according to a root-mean-square prediction error criterion.
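    The Huber penalty at the center of the method can be written down directly. In its standard form (the paper's exact parameterization may differ), one parameter, delta, sets the transition between a quadratic regime that smooths small, noisy fluctuations and a linear regime that penalizes large jumps only mildly, so edges are preserved; a second parameter weights the penalty against the likelihood term.

```python
import numpy as np

def huber(t, delta):
    """Edge-preserving Huber penalty rho(t) (standard form)."""
    t = np.abs(t)
    return np.where(t <= delta,
                    0.5 * t ** 2,                # quadratic: smooths noise
                    delta * (t - 0.5 * delta))   # linear: preserves edges

assert huber(np.array([0.5]), 1.0)[0] == 0.125   # quadratic regime
assert huber(np.array([10.0]), 1.0)[0] == 9.5    # linear regime
```

    The data-driven selection described in the abstract amounts to choosing delta and the penalty weight by cross validation rather than by hand.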

  13. 76 FR 2944 - Notice of Passenger Facility Charge (PFC) Approvals and Disapprovals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... equipment. Rehabilitate airfield guidance signs. Rehabilitate runway 16/34 (design only). Rehabilitate parallel and connecting taxiways (design only). Rehabilitate terminal building. Conduct wildlife hazard assessment. Terminal building expansion (design only). PFC administrative costs. Reconstruct west aircraft...

  14. The Future School: Designing for Student Success.

    ERIC Educational Resources Information Center

    Ruck, Gary

    1993-01-01

    Three themes of change in school planning are the future school, outsourcing, and the reconstruction of existing facilities to accommodate technological and philosophical potential. Describes the technology and the house concept at a middle school and renovations at an elementary school. (MLF)

  15. 40 CFR Table 2 to Subpart Ggg of... - States That Submitted a Negative Declaration Letter a

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991 Pt. 62, Subpt...

  16. 40 CFR Table 2 to Subpart Ggg of... - States That Submitted a Negative Declaration Letter a

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991 Pt. 62, Subpt...

  17. Workflows and the Role of Images for Virtual 3D Reconstruction of No Longer Extant Historic Objects

    NASA Astrophysics Data System (ADS)

    Münster, S.

    2013-07-01

    3D reconstruction technologies have gained importance during the last decade as tools for the research and visualization of no longer extant historic objects. Within such reconstruction processes, visual media assumes several important roles: as the most important source, especially for reconstructions of no longer extant objects; as a tool for communication and cooperation within the production process; and as a means of communicating and visualizing results. While there are many discourses about theoretical issues of depiction as a source and as a visualization outcome of such projects, there is no systematic, empirically grounded research on the importance of depiction during a 3D reconstruction process. Moreover, from a methodological perspective, it would be necessary to understand which roles visual media plays during the production process and how those roles are affected by disciplinary boundaries and by challenges specific to historic topics. The research includes an analysis of published work and case studies investigating reconstruction projects, using methods from the social sciences to gain a grounded view of how production processes take place in practice and which functions and roles images play within them. To investigate these topics, a content analysis of 452 conference proceedings and journal articles related to 3D reconstruction modeling in the humanities was completed. Most of the projects described in those publications dealt with data acquisition and model building for existing objects; only a small number focused on structures that no longer, or never, physically existed. It is especially this type of project that is interesting for a study of the importance of pictures as sources and as tools for interdisciplinary cooperation during the production process. 
In the course of the examination, the authors applied a qualitative content analysis to a sample of 26 previously published project reports to identify strategies and types, and conducted three case studies of 3D reconstruction projects to evaluate how such projects evolve. The research showed that reconstructions of no longer existing historic structures are most commonly created for presentation or research purposes, typically of large buildings or city models. Additionally, they are often realized by interdisciplinary workgroups that use images both as the most important source for the reconstruction and as important media for communication and quality control during the reconstruction process.

  18. The Fringe Reading Facility at the Max-Planck-Institut fuer Stroemungsforschung

    NASA Astrophysics Data System (ADS)

    Becker, F.; Meier, G. E. A.; Wegner, H.; Timm, R.; Wenskus, R.

    1987-05-01

    A Mach-Zehnder interferometer is used for optical flow measurements in a transonic wind tunnel. Holographic interferograms are reconstructed by illumination with a He-Ne-laser and viewed by a video camera through wide angle optics. This setup was used for investigating industrial double exposure holograms of truck tires in order to develop methods of automatic recognition of certain manufacturing faults. Automatic input is achieved by a transient recorder digitizing the output of a TV camera and transferring the digitized data to a PDP11-34. Interest centered around sequences of interferograms showing the interaction of vortices with a profile and subsequent emission of sound generated by this process. The objective is the extraction of quantitative data which relates to the emission of noise.

  19. The Fringe Reading Facility at the Max-Planck-Institut fuer Stroemungsforschung

    NASA Technical Reports Server (NTRS)

    Becker, F.; Meier, G. E. A.; Wegner, H.; Timm, R.; Wenskus, R.

    1987-01-01

    A Mach-Zehnder interferometer is used for optical flow measurements in a transonic wind tunnel. Holographic interferograms are reconstructed by illumination with a He-Ne-laser and viewed by a video camera through wide angle optics. This setup was used for investigating industrial double exposure holograms of truck tires in order to develop methods of automatic recognition of certain manufacturing faults. Automatic input is achieved by a transient recorder digitizing the output of a TV camera and transferring the digitized data to a PDP11-34. Interest centered around sequences of interferograms showing the interaction of vortices with a profile and subsequent emission of sound generated by this process. The objective is the extraction of quantitative data which relates to the emission of noise.

  20. The Big Bang: Facial Trauma Caused by Recreational Fireworks

    PubMed Central

    Molendijk, Josher; Vervloet, Bob; Wolvius, Eppo B.; Koudstaal, Maarten J.

    2015-01-01

    In the Netherlands, it is a tradition of setting off fireworks to celebrate the turn of the year. In our medical facility, each year patients with severe skeletal maxillofacial trauma inflicted by recreational fireworks are encountered. We present two cases of patients with severe blast injury to the face, caused by direct impact of rockets, and thereby try to contribute to the limited literature on facial blast injuries, their treatment, and clinical outcome. These patients require multidisciplinary treatment, involving multiple reconstructive surgeries, and the overall recovery process is long. The severity of these traumas raises questions about the firework traditions and legislations not only in the Netherlands but also worldwide. Therefore, the authors support restrictive laws on personal use of fireworks in the Netherlands. PMID:27162578

  1. The Big Bang: Facial Trauma Caused by Recreational Fireworks.

    PubMed

    Molendijk, Josher; Vervloet, Bob; Wolvius, Eppo B; Koudstaal, Maarten J

    2016-06-01

    In the Netherlands, it is a tradition of setting off fireworks to celebrate the turn of the year. In our medical facility, each year patients with severe skeletal maxillofacial trauma inflicted by recreational fireworks are encountered. We present two cases of patients with severe blast injury to the face, caused by direct impact of rockets, and thereby try to contribute to the limited literature on facial blast injuries, their treatment, and clinical outcome. These patients require multidisciplinary treatment, involving multiple reconstructive surgeries, and the overall recovery process is long. The severity of these traumas raises questions about the firework traditions and legislations not only in the Netherlands but also worldwide. Therefore, the authors support restrictive laws on personal use of fireworks in the Netherlands.

  2. A zero-error operational video data compression system

    NASA Technical Reports Server (NTRS)

    Kutz, R. L.

    1973-01-01

    A data compression system has been operating since February 1972, using ATS spin-scan cloud cover data. With the launch of ITOS 3 in October 1972, this data compression system has become the only source of near-realtime very high resolution radiometer image data at the data processing facility. The VHRR image data are compressed and transmitted over a 50 kilobit per second wideband ground link. The goal of the data compression experiment was to send data quantized to six bits at twice the rate possible when no compression is used, while maintaining zero error between the transmitted and reconstructed data. All objectives of the data compression experiment were met, and thus a capability of doubling the data throughput of the system has been achieved.
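    An illustrative zero-error (i.e. lossless) scheme in the spirit of the experiment's goal is predictive delta coding of the quantized pixels. This is my own toy construction, not the actual ATS/ITOS coder: differences between neighboring 6-bit pixels concentrate near zero in smooth imagery, so they entropy-code far better than the raw values, while decoding remains bit-exact.

```python
import numpy as np

def encode(pixels):
    """6-bit delta coding: difference from the previous pixel, mod 64."""
    d = np.diff(pixels, prepend=0)
    return d % 64                    # wrap differences into 6-bit symbols

def decode(symbols):
    """Exact inverse: running sum of the wrapped differences, mod 64."""
    return np.cumsum(symbols) % 64

pixels = np.array([10, 11, 11, 13, 12, 63, 0, 5], dtype=np.int64)
assert np.array_equal(decode(encode(pixels)), pixels)  # zero-error round trip
```

    Doubling the throughput then comes from assigning shorter code words to the frequent small differences, while the round trip above guarantees zero error between transmitted and reconstructed data.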

  3. Capturing molecular multimode relaxation processes in excitable gases based on decomposition of acoustic relaxation spectra

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng

    2017-08-01

    Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
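    The single-relaxation building block of such a reconstruction can be made concrete. Assuming a Debye-like form for the absorption per wavelength, mu(f) = 2*mu_m*f*f_m/(f_m**2 + f**2) (notation and numbers are mine, not the paper's), measurements at two frequencies determine the relaxation frequency f_m (hence the relaxation time) and the strength mu_m in closed form; the paper's method repeats this for N interior processes using measurements at 2N frequencies.

```python
import numpy as np

def mu(f, fm, mu_m):
    """Debye-like absorption per wavelength for one relaxation process."""
    return 2.0 * mu_m * f * fm / (fm ** 2 + f ** 2)

def solve_single_relaxation(f1, mu1, f2, mu2):
    """Invert two (frequency, absorption) measurements for (f_m, mu_m)."""
    fm2 = f1 * f2 * (mu2 * f2 - mu1 * f1) / (mu1 * f2 - mu2 * f1)
    fm = np.sqrt(fm2)
    mu_m = mu1 * (fm2 + f1 ** 2) / (2.0 * f1 * fm)
    return fm, mu_m

fm_true, mu_true = 40e3, 0.02       # hypothetical relaxation at 40 kHz
f1, f2 = 10e3, 100e3                # two measurement frequencies
fm_est, mu_est = solve_single_relaxation(f1, mu(f1, fm_true, mu_true),
                                         f2, mu(f2, fm_true, mu_true))
assert np.allclose([fm_est, mu_est], [fm_true, mu_true])
```

    With noisy data the same relations become a least-squares fit over all 2N measured frequencies rather than an exact inversion.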

  4. [Application of joint reconstruction with autogenous coronoid process graft to treat temporomandibular joint ankylosis].

    PubMed

    Xie, Qing-tiao; Huang, Xuan-ping; Jiang, Xian-fang; Yang, Yuan-yuan; Li, Hua; Lin, Xi

    2013-08-01

    To evaluate the clinical effect of joint reconstruction using an autogenous coronoid process graft to treat temporomandibular joint (TMJ) ankylosis, nine cases of TMJ ankylosis seen from September 2008 to September 2010 were surgically treated by joint reconstruction with an autogenous coronoid process graft, using the autogenous articular disc or prosthodontic membrane as interpositional material. Mouth opening, occlusion, and cone-beam CT (CBCT) were used for evaluation before and after surgery. Satisfactory mouth opening was achieved in all patients, and none developed occlusal changes or re-ankylosis during follow-up. CBCT showed that the coronoid process graft achieved bony union with the ramus and became rounded. Joint reconstruction with an autogenous coronoid process graft is an effective treatment for TMJ ankylosis.

  5. Plastic Surgery Challenges in War Wounded I: Flap-Based Extremity Reconstruction

    PubMed Central

    Sabino, Jennifer M.; Slater, Julia; Valerio, Ian L.

    2016-01-01

    Scope and Significance: Reconstruction of traumatic injuries requiring tissue transfer begins with aggressive resuscitation and stabilization. Systematic advances in acute casualty care at the point of injury have improved survival and allowed for increasingly complex treatment before definitive reconstruction at tertiary medical facilities outside the combat zone. As a result, the complexity of the limb salvage algorithm has increased over 14 years of combat activities in Iraq and Afghanistan. Problem: Severe poly-extremity trauma in combat casualties has led to a large number of extremity salvage cases. Advanced reconstructive techniques coupled with regenerative medicine applications have played a critical role in the restoration, recovery, and rehabilitation of functional limb salvage. Translational Relevance: The past 14 years of war trauma have increased our understanding of tissue transfer for extremity reconstruction in the treatment of combat casualties. Injury patterns, flap choice, and reconstruction timing are critical variables to consider for optimal outcomes. Clinical Relevance: Subacute reconstruction with specifically chosen flap tissue and donor site location based on individual injuries result in successful tissue transfer, even in critically injured patients. These considerations can be combined with regenerative therapies to optimize massive wound coverage and limb salvage form and function in previously active patients. Summary: Traditional soft tissue reconstruction is integral in the treatment of war extremity trauma. Pedicle and free flaps are a critically important part of the reconstructive ladder for salvaging extreme extremity injuries that are seen as a result of the current practice of war. PMID:27679751

  6. A first look at reconstructed data from the GlueX detector

    NASA Astrophysics Data System (ADS)

    Taylor, Simon; GlueX Collaboration

    2015-10-01

    Construction of the GlueX detector in Hall D at the Thomas Jefferson National Accelerator Facility has recently been completed as part of the 12 GeV Upgrade to the facility. The detector consists of a barrel region containing devices for tracking charged particles and a lead-scintillator calorimeter for detecting photons, and a forward region consisting of two layers of scintillator paddles for time-of-flight measurements and a lead-glass electromagnetic calorimeter. The electron beam from the accelerator is converted into a photon beam by inserting a diamond radiator, thereby producing a coherent bremsstrahlung spectrum of photons impinging on a 30 cm-long LH2 target. The energy of the photon beam is determined using a tagging spectrometer. A commissioning run took place in Spring of 2015 during which all of the detector components were read out. Preliminary calibrations have been determined to a level sufficient to allow reconstruction of final states with several charged tracks and neutral particles. A first look at results of reconstruction of events using the GlueX detector will be presented. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics under Contract DE-AC05-06OR23177.

  7. The PixFEL project: Progress towards a fine pitch X-ray imaging camera for next generation FEL facilities

    NASA Astrophysics Data System (ADS)

    Rizzo, G.; Batignani, G.; Benkechkache, M. A.; Bettarini, S.; Casarosa, G.; Comotti, D.; Dalla Betta, G.-F.; Fabris, L.; Forti, F.; Grassi, M.; Lodola, L.; Malcovati, P.; Manghisoni, M.; Mendicino, R.; Morsani, F.; Paladino, A.; Pancheri, L.; Paoloni, E.; Ratti, L.; Re, V.; Traversi, G.; Vacchi, C.; Verzellesi, G.; Xu, H.

    2016-07-01

    The INFN PixFEL project is developing the fundamental building blocks for a large-area X-ray imaging camera to be deployed at next generation free electron laser (FEL) facilities with unprecedented intensity. Improvements in performance beyond the state of the art in imaging instrumentation will be explored by adopting advanced technologies like active edge sensors, a 65 nm node CMOS process and vertical integration. These are the key ingredients of the PixFEL project to realize a seamless large-area focal plane instrument composed of a matrix of multilayer four-side buttable tiles. In order to minimize the dead area and reduce ambiguities in image reconstruction, a fine-pitch active edge thick sensor is being optimized to cope with a very high intensity photon flux, up to 10^4 photons per pixel, in the range from 1 to 10 keV. A low-noise analog front-end channel with this wide dynamic range and a novel dynamic compression feature, together with low-power 10 bit analog-to-digital conversion at up to 5 MHz, has been realized at a 110 μm pitch in a 65 nm CMOS process. Vertical interconnection of two CMOS tiers will also be explored in the future to build a four-side buttable readout chip with high-density memories. In the long run the objective of the PixFEL project is to build a flexible X-ray imaging camera for operation both in burst mode, as at the European X-FEL, and in continuous mode at the high frame rates anticipated for future FEL facilities.

  8. The drift chamber array at the external target facility in HIRFL-CSR

    NASA Astrophysics Data System (ADS)

    Sun, Y. Z.; Sun, Z. Y.; Wang, S. T.; Duan, L. M.; Sun, Y.; Yan, D.; Tang, S. W.; Yang, H. R.; Lu, C. G.; Ma, P.; Yu, Y. H.; Zhang, X. H.; Yue, K.; Fang, F.; Su, H.

    2018-06-01

    A drift chamber array at the External Target Facility in HIRFL-CSR has been constructed for three-dimensional particle tracking in high-energy radioactive ion beam experiments. The design, readout, track reconstruction program and calibration procedures for the detector are described. The drift chamber array was tested in a 311 AMeV 40Ar beam experiment. The detector performance based on the measurements of the beam test is presented. A spatial resolution of 230 μm is achieved.

  9. Challenges in scaling NLO generators to leadership computers

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even when using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  10. Image reconstruction: an overview for clinicians.

    PubMed

    Hansen, Michael S; Kellman, Peter

    2015-03-01

    Image reconstruction plays a critical role in the clinical use of magnetic resonance imaging (MRI). The MRI raw data is not acquired in image space and the role of the image reconstruction process is to transform the acquired raw data into images that can be interpreted clinically. This process involves multiple signal processing steps that each have an impact on the image quality. This review explains the basic terminology used for describing and quantifying image quality in terms of signal-to-noise ratio and point spread function. In this context, several commonly used image reconstruction components are discussed. The image reconstruction components covered include noise prewhitening for phased array data acquisition, interpolation needed to reconstruct square pixels, raw data filtering for reducing Gibbs ringing artifacts, Fourier transforms connecting the raw data with image space, and phased array coil combination. The treatment of phased array coils includes a general explanation of parallel imaging as a coil combination technique. The review is aimed at readers with no signal processing experience and should enable them to understand what role basic image reconstruction steps play in the formation of clinical images and how the resulting image quality is described. © 2014 Wiley Periodicals, Inc.
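
    The basic chain the review describes (Fourier transform from raw k-space to image space, then phased-array coil combination) can be sketched in a few lines. The root-sum-of-squares combination and the synthetic two-coil data below are a minimal illustration, not code from the review:

```python
import numpy as np

def reconstruct_sos(kspace):
    """Basic Cartesian MRI reconstruction: per-coil inverse FFT of the raw
    k-space data followed by root-sum-of-squares (RSS) coil combination.

    kspace: complex array of shape (ncoils, ny, nx), fully sampled.
    Returns a real-valued magnitude image of shape (ny, nx).
    """
    # Transform each coil's raw data from k-space to image space.
    coil_images = np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(kspace, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1),
    )
    # Combine coils: magnitude-only RSS, the simplest phased-array combination.
    return np.sqrt(np.sum(np.abs(coil_images) ** 2, axis=0))

# Synthetic check: a point object seen by two coils with uniform sensitivities.
obj = np.zeros((8, 8)); obj[4, 4] = 1.0
sens = np.stack([np.full((8, 8), 0.6), np.full((8, 8), 0.8)])
ks = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(sens * obj, axes=(-2, -1)),
                                 axes=(-2, -1)), axes=(-2, -1))
img = reconstruct_sos(ks)
# RSS recovers sqrt(0.6^2 + 0.8^2) = 1.0 at the point location.
print(round(img[4, 4], 6))
```

    In practice, noise prewhitening, raw data filtering, and parallel-imaging coil combination (as discussed in the review) would be inserted into this same pipeline.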

  11. Textbook Writing and Creativity: The Case of Mendeleev.

    ERIC Educational Resources Information Center

    Graham, Loren R.

    1983-01-01

    Historical reconstruction of Dmitrii Mendeleev's part in the creation of the Periodic Table of Elements illustrates how important the process of textbook writing was in this scientific development. A clear difference is seen between logical reconstruction of the discovery process and the insights provided by historical reconstruction of the same…

  12. 40 CFR 62.8871 - Identification of sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS Ohio Landfill Gas Emissions from Existing Municipal Solid Waste Landfills § 62.8871 Identification of sources. The plan applies to all existing municipal solid waste landfills for which construction, reconstruction or...

  13. 40 CFR 62.7101 - Identification of sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) APPROVAL AND PROMULGATION OF STATE PLANS FOR DESIGNATED FACILITIES AND POLLUTANTS Nevada Landfill Gas Emissions from Existing Municipal Solid Waste Landfills § 62.7101 Identification of sources. The plan applies to all existing municipal solid waste landfills for which construction, reconstruction, or...

  14. Winning the Peace: Institutionalizing Provincial Reconstruction Teams in the United States Military

    DTIC Science & Technology

    2012-06-15

    dental and veterinary care; the construction of rudimentary surface transportation systems and public facilities; and...rather than at the tactical unit level. Provided that this change occurs, the entire calculus of the PRT could correspondingly change. Assuming that

  15. CT head-scan dosimetry in an anthropomorphic phantom and associated measurement of ACR accreditation-phantom imaging metrics under clinically representative scan conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunner, Claudia C.; Stern, Stanley H.; Chakrabarti, Kish

    2013-08-15

    Purpose: To measure radiation absorbed dose and its distribution in an anthropomorphic head phantom under clinically representative scan conditions in three widely used computed tomography (CT) scanners, and to relate those dose values to metrics such as high-contrast resolution, noise, and contrast-to-noise ratio (CNR) in the American College of Radiology CT accreditation phantom. Methods: By inserting optically stimulated luminescence dosimeters (OSLDs) in the head of an anthropomorphic phantom specially developed for CT dosimetry (University of Florida, Gainesville), we measured dose with three commonly used scanners (GE Discovery CT750 HD, Siemens Definition, Philips Brilliance 64) at two different clinical sites (Walter Reed National Military Medical Center, National Institutes of Health). The scanners were set to operate with the same data-acquisition and image-reconstruction protocols as used clinically for typical head scans, respective of the practices of each facility for each scanner. We also analyzed images of the ACR CT accreditation phantom with the corresponding protocols. While the Siemens Definition and the Philips Brilliance protocols utilized only conventional, filtered back-projection (FBP) image-reconstruction methods, the GE Discovery also employed its particular version of an adaptive statistical iterative reconstruction (ASIR) algorithm that can be blended in desired proportions with the FBP algorithm. We did an objective image-metrics analysis evaluating the modulation transfer function (MTF), noise power spectrum (NPS), and CNR for images reconstructed with FBP.
For images reconstructed with ASIR, we only analyzed the CNR, since MTF and NPS results are expected to depend on the object for iterative reconstruction algorithms. Results: The OSLD measurements showed that the Siemens Definition and the Philips Brilliance scanners (located at two different clinical facilities) yield average absorbed doses in tissue of 42.6 and 43.1 mGy, respectively. The GE Discovery delivers about the same amount of dose (43.7 mGy) when run under similar operating and image-reconstruction conditions, i.e., without tube current modulation and ASIR. The image-metrics analysis likewise showed that the MTF, NPS, and CNR associated with the reconstructed images are mutually comparable when the three scanners are run with similar settings, and differences can be attributed to different edge-enhancement properties of the applied reconstruction filters. Moreover, when the GE scanner was operated with the facility's scanner settings for routine head exams, which apply 50% ASIR and use only approximately half of the 100%-FBP dose, the CNR of the images showed no significant change. Even though the CNR alone is not sufficient to characterize the image quality and justify any dose reduction claims, it can be useful as a constancy test metric. Conclusions: This work presents a straightforward method to connect direct measurements of CT dose with objective image metrics such as high-contrast resolution, noise, and CNR. It demonstrates that OSLD measurements in an anthropomorphic head phantom allow a realistic and locally precise estimation of magnitude and spatial distribution of dose in tissue delivered during a typical CT head scan. Additional objective analysis of the images of the ACR accreditation phantom can be used to relate the measured doses to high contrast resolution, noise, and CNR.

  16. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
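
    The analytic/iterative distinction above can be made concrete with a minimal iterative scheme. The sketch below uses a Landweber-style update (a generic iterative method, not an algorithm from the paper); the projection matrix and data are toy assumptions:

```python
import numpy as np

def landweber(A, b, n_iter=200, lam=None):
    """Minimal iterative reconstruction (Landweber iteration): the image
    estimate x is repeatedly corrected by back-projecting the residual
    between measured projections b and the model's forward projection A @ x.
    """
    if lam is None:
        lam = 1.0 / np.linalg.norm(A, 2) ** 2  # step size ensuring convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x += lam * A.T @ (b - A @ x)
    return x

# Toy "scan": six projection measurements of a four-pixel image, each row
# of A summing a different pair of pixels.
A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0, 0.0]])
x_true = np.array([1.0, 0.0, 2.0, -1.0])
b = A @ x_true                       # simulated noise-free projections
x_rec = landweber(A, b)
print(np.allclose(x_rec, x_true, atol=1e-6))
```

    An analytic method would instead invert the imaging equations in closed form (as filtered back-projection does for the Radon transform); the iterative form above works for any forward model A one can apply.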

  17. On the development of an underground geoscience laboratory at Boulby in NE England (Invited)

    NASA Astrophysics Data System (ADS)

    Petley, D. N.; Rosser, N.; Barlow, J.; Brain, M. J.; Lim, M.; Sapsford, M.; Pybus, D.

    2009-12-01

    The Boulby Mine is a major potash extraction facility located in NE England. Opened in 1973, the mine extracts both potash and rock salt from Zechstein deposits located at a depth of about 1100 m below the land surface. For the last 20 years the mine has housed an important laboratory built to provide a base for Dark Matter research. However, in the last ten years the mine has progressively become the site of research into geophysical and geological processes, primarily through a strategic partnership between the mine operators, Cleveland Potash Ltd, and the University of Durham. The site is now the base for an initial proof-of-concept project, funded by the Regional Development Agency One Northeast, to explore the viability of establishing a permanent geosciences research facility at Boulby. The vision is a facility that provides access for researchers to the range of geological environments at Boulby, extending from the coastal cliffs at the surface, through the access shafts, to the deepest potash seams. The facility is designed to host research in geophysics, hydrology, geomorphology, geochemistry, microbiology, rock mechanics, mining engineering, petrology and related fields. This proof-of-concept study has three key strategic aims: 1. To establish the range of uses of a research laboratory at Boulby and to determine the nature of the facilities required; 2. To initiate research programmes into: a. palaeoenvironmental reconstruction of the Zechstein deposits; b. the mechanics of the potash and halite rocks; and c. the mechanisms of failure of the coastal cliffs; 3. To construct an initial set of four serviced research caverns within the mine. The proof-of-concept stage of the project is intended to run until September 2010, with development of the facility being completed by 2015. However, the facility is currently in a position to host research projects across a wide range of disciplines.

  18. Facilities for High Resolution Imaging of the Sun

    NASA Astrophysics Data System (ADS)

    von der Lühe, Oskar

    2018-04-01

    The Sun is the only star where physical processes can be observed at their intrinsic spatial scales. Even though the Sun is a mere 150 million km from Earth, it is difficult to resolve fundamental processes in the solar atmosphere, because they occur at scales of the order of the kilometer. They can be observed only with telescopes which have apertures of several meters. The current state of the art is represented by solar telescopes with apertures of 1.5 m, which resolve 50 km on the solar surface, soon to be superseded by telescopes with 4 m apertures and 20 km resolution. The US American 4 m Daniel K. Inouye Solar Telescope (DKIST) is currently under construction on Maui, Hawaii, and is expected to have first light in 2020. The European solar community collaborates intensively to pursue the 4 m European Solar Telescope, with a construction start in the Canaries early in the next decade. Solar telescopes with slightly smaller apertures are also being planned by the Russian, Indian and Chinese communities. In order to achieve a resolution which approaches the diffraction limit, all modern solar telescopes use adaptive optics, which can lock on virtually any scene on the solar disk. Multi-conjugate adaptive optics designed to compensate fields of the order of one minute of arc have been demonstrated and will become a facility feature of the new telescopes. The requirements for high-precision spectro-polarimetry (about one part in 10^4) make continuous monitoring of (MC)AO performance and post-processing image reconstruction methods a necessity.
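
    The 50 km and 20 km figures quoted above follow from the diffraction limit projected over the Earth-Sun distance. A short check, assuming observation at 500 nm (the text does not state a wavelength):

```python
# Diffraction-limited resolution on the solar surface for the quoted
# telescope apertures. Rayleigh criterion: theta = 1.22 * lambda / D,
# projected over the 150 million km Earth-Sun distance.
WAVELENGTH_M = 500e-9    # assumed visible-light wavelength
SUN_DISTANCE_M = 150e9   # Earth-Sun distance from the text

def solar_resolution_km(aperture_m):
    theta = 1.22 * WAVELENGTH_M / aperture_m  # angular resolution, radians
    return theta * SUN_DISTANCE_M / 1e3       # linear scale on the Sun, km

for d in (1.5, 4.0):
    print(f"{d} m aperture -> {solar_resolution_km(d):.0f} km on the Sun")
```

    The 1.5 m and 4 m apertures come out at roughly 61 km and 23 km, consistent with the approximate 50 km and 20 km figures in the text.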

  19. A software tool of digital tomosynthesis application for patient positioning in radiotherapy

    PubMed Central

    Dai, Jian‐Rong

    2016-01-01

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition, so it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). DTS application requires two reconstruction and two registration processes, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage produces two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts along the lateral and longitudinal axes are obtained from matching ODTS and RDTS in the coronal view, while the shifts along the longitudinal and vertical axes are obtained from matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of the software tool was performed, covering geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS images generated by the GPU-based and CPU-based algorithms is 0.99. Based on measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm in all three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm in all three axes. Compared with the CPU-based algorithms, the performance of DRR and DTS reconstruction is improved by a factor of 15 to 20. A GPU-based software tool was developed for DTS application in patient positioning for radiotherapy. The geometric and registration accuracy met the clinical requirements for patient setup in radiotherapy. The high performance of the DRR and DTS reconstruction algorithms was achieved by the GPU-based computation environment. It is a useful software tool for researchers and clinicians evaluating DTS application in patient positioning for radiotherapy. PACS number(s): 87.57.nf PMID:27074482

  20. Metal-induced streak artifact reduction using iterative reconstruction algorithms in x-ray computed tomography image of the dentoalveolar region.

    PubMed

    Dong, Jian; Hayakawa, Yoshihiko; Kannenberg, Sven; Kober, Cornelia

    2013-02-01

    The objective of this study was to reduce metal-induced streak artifact on oral and maxillofacial x-ray computed tomography (CT) images by developing a fast statistical image reconstruction system using iterative reconstruction algorithms. Adjacent CT images often depict similar anatomical structures in thin slices. So, first, images were reconstructed using the same projection data of an artifact-free image. Second, images were processed by the successive iterative restoration method, where projection data were generated from the reconstructed image in sequence. Besides the maximum likelihood-expectation maximization algorithm, the ordered subset-expectation maximization algorithm (OS-EM) was examined. Also, small region of interest (ROI) setting and reverse processing were applied to improve performance. Both algorithms reduced artifacts instead of slightly decreasing gray levels. The OS-EM and small ROI reduced the processing duration without apparent detriments. Sequential and reverse processing did not show apparent effects. Two alternatives in iterative reconstruction methods were effective for artifact reduction. The OS-EM algorithm and small ROI setting improved the performance. Copyright © 2012 Elsevier Inc. All rights reserved.
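
    The two algorithms examined (ML-EM and OS-EM) share a standard multiplicative update; the sketch below shows their generic textbook form on a toy system. The matrix and data are invented for illustration and are unrelated to the study's CT data:

```python
import numpy as np

def mlem(A, b, n_iter=500):
    """Maximum likelihood-expectation maximization (ML-EM): the image
    estimate is corrected multiplicatively by the back-projected ratio of
    measured to forward-projected data."""
    x = np.ones(A.shape[1])            # positive initial estimate
    sens = A.T @ np.ones(A.shape[0])   # sensitivity image (column sums)
    for _ in range(n_iter):
        x *= (A.T @ (b / (A @ x))) / sens
    return x

def osem(A, b, n_subsets=2, n_iter=250):
    """Ordered subset EM (OS-EM): the same multiplicative update applied to
    subsets of the projection rows, cycling through every subset in each
    iteration, which speeds convergence roughly by the subset count."""
    x = np.ones(A.shape[1])
    subsets = [np.arange(s, A.shape[0], n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:
            As, bs = A[rows], b[rows]
            x *= (As.T @ (bs / (As @ x))) / (As.T @ np.ones(len(rows)))
    return x

# Toy nonnegative system with consistent, noise-free data.
A = np.array([[1.0, 2.0, 0.5],
              [0.5, 1.0, 2.0],
              [2.0, 0.5, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([3.0, 1.0, 2.0])
b = A @ x_true
print(np.round(mlem(A, b), 3))
print(np.round(osem(A, b), 3))
```

    Both converge toward the same solution; OS-EM reaches it in fewer passes over the data, which is the speed-up the study exploits.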

  1. 77 FR 14584 - Notice of Passenger Facility Charge (PFC) Approvals and Disapprovals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... square yards). Modify supplemental wind cones. Runway 8/26 pavement rejuvenation. Purchase snow removal equipment--high-speed snow plow. Master plan/land use. Design passenger terminal remodel. Install runway... lighting and cable rehabilitation. Construct improvements of terminal building. Design reconstruction of...

  2. Calculated organ doses for Mayak production association central hall using ICRP and MCNP.

    PubMed

    Choe, Dong-Ok; Shelkey, Brenda N; Wilde, Justin L; Walk, Heidi A; Slaughter, David M

    2003-03-01

    As part of an ongoing dose reconstruction project, equivalent organ dose rates from photons and neutrons were estimated using the energy spectra measured in the central hall above the graphite reactor core located in the Russian Mayak Production Association facility. Reconstruction of the work environment was necessary due to the lack of personal dosimeter data for neutrons in the time period prior to 1987. A typical worker scenario for the central hall was developed for the Monte Carlo Neutron Photon-4B (MCNP) code. The resultant equivalent dose rates for neutrons and photons were compared with the equivalent dose rates derived from calculations using the conversion coefficients in the International Commission on Radiological Protection Publications 51 and 74 in order to validate the model scenario for this Russian facility. The MCNP results were in good agreement with the results of the ICRP publications indicating the modeling scenario was consistent with actual work conditions given the spectra provided. The MCNP code will allow for additional orientations to accurately reflect source locations.

  3. Simple model for the reconstruction of radionuclide concentrations and radiation exposures along the Techa River

    NASA Technical Reports Server (NTRS)

    Vorobiova, M. I.; Degteva, M. O.; Neta, M. O. (Principal Investigator)

    1999-01-01

    The Techa River (Southern Urals, Russia) was contaminated in 1949-1956 by liquid radioactive wastes from the Mayak complex, the first Russian facility for the production of plutonium. The measurements of environmental contamination were started in 1951. A simple model describing radionuclide transport along the free-flowing river and the accumulation of radionuclides by bottom sediments is presented. This model successfully correlates the rates of radionuclide releases as reconstructed by the Mayak experts, hydrological data, and available environmental monitoring data for the early period of contamination (1949-1951). The model was developed to reconstruct doses for people who lived in the riverside communities during the period of the releases and who were chronically exposed to external and internal irradiation. The model fills the data gaps and permits reconstruction of external gamma-exposure rates in air on the river bank and radionuclide concentrations in river water used for drinking and other household needs in 1949-1951.
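
    The kind of model described here (release concentration attenuated during downstream transport, with removal to bottom sediments) can be sketched in its simplest first-order form. All numbers below are invented placeholders, not the authors' reconstructed Techa River parameters:

```python
import math

def downstream_concentration(c0, distance_km, velocity_km_per_day, k_per_day):
    """Steady-state relative concentration in free-flowing river water,
    assuming first-order removal (sediment uptake plus decay) at rate k
    while a water parcel travels downstream at constant velocity v:
    C(x) = C0 * exp(-k * x / v)."""
    travel_time_days = distance_km / velocity_km_per_day
    return c0 * math.exp(-k_per_day * travel_time_days)

# Relative concentration at several downstream distances (toy parameters).
for x_km in (0, 50, 100, 200):
    c = downstream_concentration(1.0, x_km, 25.0, 0.2)
    print(f"{x_km:3d} km downstream: {c:.3f} of the release concentration")
```

    Fitting the release rates and removal constants of such a model against monitoring data is what lets the authors fill gaps in the 1949-1951 record.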

  4. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  5. Aperture tolerances for neutron-imaging systems in inertial confinement fusion.

    PubMed

    Ghilea, M C; Sangster, T C; Meyerhofer, D D; Lerche, R A; Disdier, L

    2008-02-01

    Neutron-imaging systems are being considered as an ignition diagnostic for the National Ignition Facility (NIF) [Hogan et al., Nucl. Fusion 41, 567 (2001)]. Given the importance of these systems, a neutron-imaging design tool is being used to quantify the effects of aperture fabrication and alignment tolerances on reconstructed neutron images for inertial confinement fusion. The simulations indicate that alignment tolerances of more than 1 mrad would introduce measurable features in a reconstructed image for both pinholes and penumbral aperture systems. These simulations further show that penumbral apertures are several times less sensitive to fabrication errors than pinhole apertures.

  6. Improvement of the High Fluence Irradiation Facility at the University of Tokyo

    NASA Astrophysics Data System (ADS)

    Murakami, Kenta; Iwai, Takeo; Abe, Hiroaki; Sekimura, Naoto

    2016-08-01

    This paper reports the modification of the High Fluence Irradiation Facility at the University of Tokyo (HIT). The HIT facility was severely damaged during the 2011 earthquake off the Pacific coast of Tohoku. The damaged 1.0 MV tandem Cockcroft-Walton accelerator was replaced with a 1.7 MV accelerator formerly used on another campus of the university. A decision was made to maintain dual-beam irradiation capability by repairing the 3.75 MV single-ended Van de Graaff accelerator and reconstructing the related beamlines. A new beamline was connected to a 200 kV transmission electron microscope (TEM) to perform in-situ TEM observation under ion irradiation.

  7. 3D reconstruction techniques made easy: know-how and pictures.

    PubMed

    Luccichenti, Giacomo; Cademartiri, Filippo; Pezzella, Francesca Romana; Runza, Giuseppe; Belgrano, Manuel; Midiri, Massimo; Sabatini, Umberto; Bastianello, Stefano; Krestin, Gabriel P

    2005-10-01

    Three-dimensional reconstructions represent a visual-based tool for illustrating the basis of three-dimensional post-processing, such as interpolation, ray-casting, segmentation, percentage classification, gradient calculation, shading and illumination. Knowledge of the optimal scanning and reconstruction parameters facilitates the use of three-dimensional reconstruction techniques in clinical practice. The aim of this article is to explain the principles of multidimensional image processing in a pictorial way, along with the advantages and limitations of the different possibilities of 3D visualisation.

  8. SU-F-207-02: Use of Postmortem Subjects for Subjective Image Quality Assessment in Abdominal CT Protocols with Iterative Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mench, A; Lipnharski, I; Carranza, C

    Purpose: New radiation dose reduction technologies are emerging constantly in the medical imaging field. The latest of these technologies, iterative reconstruction (IR) in CT, presents the ability to reduce dose significantly and hence provides great opportunity for CT protocol optimization. However, without effective analysis of image quality, the reduction in radiation exposure becomes irrelevant. This work explores the use of postmortem subjects as an image quality assessment medium for protocol optimizations in abdominal CT. Methods: Three female postmortem subjects were scanned using the Abdomen-Pelvis (AP) protocol at reduced minimum tube current and target noise index (SD) settings of 12.5, 17.5, 20.0, and 25.0. Images were reconstructed using two strengths of iterative reconstruction. Radiologists and radiology residents from several subspecialties were asked to evaluate 8 AP image sets including the current facility default scan protocol and 7 scans with the parameters varied as listed above. Images were viewed in the soft tissue window and scored on a 3-point scale as acceptable, borderline acceptable, and unacceptable for diagnosis. The facility default AP scan was identified to the reviewer while the 7 remaining AP scans were randomized and de-identified of acquisition and reconstruction details. The observers were also asked to comment on the subjective image quality criteria they used for scoring images. This included visibility of specific anatomical structures and tissue textures. Results: Radiologists scored images as acceptable or borderline acceptable for target noise index settings of up to 20. Due to the postmortem subjects’ close representation of living human anatomy, readers were able to evaluate images as they would those of actual patients. Conclusion: Postmortem subjects have already been proven useful for direct CT organ dose measurements.
This work illustrates the validity of their use for the crucial evaluation of image quality during CT protocol optimization, especially when investigating the effects of new technologies.

  9. First principles investigation of the initial stage of H-induced missing-row reconstruction of Pd(110) surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padama, Allan Abraham B.; Kasai, Hideaki, E-mail: kasai@dyn.ap.eng.osaka-u.ac.jp; Center for Atomic and Molecular Technologies, Osaka University, Suita, Osaka 565-0871

    2014-06-28

    The pathway of H diffusion that induces the migration of a Pd atom is investigated by employing first-principles calculations based on density functional theory to explain the origin of the missing-row reconstruction of Pd(110). The calculated activation barrier and the H-induced reconstruction energy reveal that the long bridge-to-tetrahedral configuration is the energetically favored process for the initial stage of the reconstruction phenomenon. While the H diffusion triggers the migration of the Pd atom, it is the latter process that significantly contributes to the activated missing-row reconstruction of Pd(110). Nonetheless, the strong interaction between the diffusing H and the Pd atoms dictates the occurrence of the reconstructed surface.

  10. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  11. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  12. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  13. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  14. 40 CFR 232.3 - Activities not requiring permits.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... buildings, roads, and other discrete structures and the installation of support facilities necessary for... structures used to effect such conversion. A conversion of section 404 wetland to a non-wetland is a change... emergency reconstruction of recently damaged parts, of currently serviceable structures such as dikes, dams...

  15. 40 CFR 232.3 - Activities not requiring permits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... buildings, roads, and other discrete structures and the installation of support facilities necessary for... structures used to effect such conversion. A conversion of section 404 wetland to a non-wetland is a change... emergency reconstruction of recently damaged parts, of currently serviceable structures such as dikes, dams...

  16. 40 CFR 232.3 - Activities not requiring permits.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... buildings, roads, and other discrete structures and the installation of support facilities necessary for... structures used to effect such conversion. A conversion of section 404 wetland to a non-wetland is a change... emergency reconstruction of recently damaged parts, of currently serviceable structures such as dikes, dams...

  17. 40 CFR 232.3 - Activities not requiring permits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... buildings, roads, and other discrete structures and the installation of support facilities necessary for... structures used to effect such conversion. A conversion of section 404 wetland to a non-wetland is a change... emergency reconstruction of recently damaged parts, of currently serviceable structures such as dikes, dams...

  18. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device...) and (d) of this section. (c) Any fluid catalytic cracking unit catalyst regenerator under paragraph (b...

  19. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device...) and (d) of this section. (c) Any fluid catalytic cracking unit catalyst regenerator under paragraph (b...

  20. 40 CFR 52.870 - Identification of plan.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 66219; at EPA Air and Radiation Docket and Information Center, EPA West Building, 1301 Constitution... (202) 566-1742. For information on the availability of this material at NARA, call (202) 741-6030, or... definition of the terms “building, structure, facility, or installation”; “installation”; and “reconstruction...

  1. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device...) and (d) of this section. (c) Any fluid catalytic cracking unit catalyst regenerator under paragraph (b...

  2. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
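The core geometric step in the photogrammetric reconstruction described above, projecting a 2D light-sheet image point into 3D space from known camera and sheet locations, reduces to a ray-plane intersection. A minimal sketch, assuming an idealized pinhole camera at the origin and a known sheet plane; the function and parameter names are illustrative, not taken from the NASA system:

```python
import numpy as np

def pixel_to_sheet_point(pixel, K, plane_n, plane_d):
    """Back-project an image pixel onto the light-sheet plane.

    pixel:   (u, v) image coordinates
    K:       3x3 camera intrinsic matrix (pinhole model, camera at origin)
    plane_n: light-sheet plane normal in the camera frame
    plane_d: plane offset; sheet points x satisfy plane_n . x = plane_d
    """
    # Ray direction through the pixel.
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    # Intersect the ray t * ray with the plane n . x = d.
    t = plane_d / (plane_n @ ray)
    return t * ray

# Example: unit-focal-length camera, light sheet in the plane z = 2.
K = np.eye(3)
p = pixel_to_sheet_point((0.5, -0.25), K, np.array([0.0, 0.0, 1.0]), 2.0)
```

In a real system, K and the sheet plane come from the camera and light-sheet calibration, and the resulting 3D points are transformed into the model's coordinate frame before being overlaid on the surface geometry.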

  3. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  5. The spatial resolution of a rotating gamma camera tomographic facility.

    PubMed

    Webb, S; Flower, M A; Ott, R J; Leach, M O; Inamdar, R

    1983-12-01

    An important feature determining the spatial resolution in transverse sections reconstructed by convolution and back-projection is the frequency filter corresponding to the convolution kernel. Equations have been derived giving the theoretical spatial resolution, for a perfect detector and noise-free data, using four filter functions. Experiments have shown that physical constraints will always limit the resolution that can be achieved with a given system. The experiments indicate that the region of the frequency spectrum between K_N/2 and K_N, where K_N is the Nyquist frequency, does not contribute significantly to resolution. In order to investigate the physical effect of these filter functions, the spatial resolution of reconstructed images obtained with a GE 400T rotating gamma camera has been measured. The results obtained serve as an aid to choosing appropriate reconstruction filters for use with a rotating gamma camera system.
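The effect of truncating the reconstruction filter at K_N versus K_N/2 can be sketched with a simple ramp (Ram-Lak style) frequency filter; the cutoff-fraction parameterization here is an illustration, not one of the four filter functions studied in the paper:

```python
import numpy as np

def ramp_filter(n, cutoff_frac=1.0):
    """Frequency-domain ramp filter for filtered back-projection.

    n:           number of samples per projection
    cutoff_frac: cutoff as a fraction of the Nyquist frequency K_N
    """
    freqs = np.fft.fftfreq(n)                        # cycles/sample; K_N = 0.5
    filt = np.abs(freqs)                             # ramp |k|
    filt[np.abs(freqs) > 0.5 * cutoff_frac] = 0.0    # truncate above the cutoff
    return filt

full = ramp_filter(256, 1.0)   # retain frequencies up to K_N
half = ramp_filter(256, 0.5)   # retain frequencies up to K_N / 2
```

A projection would be filtered by multiplying its FFT by `filt` and transforming back, before back-projection; comparing reconstructions with `full` and `half` probes exactly the K_N/2-to-K_N band discussed above.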

  6. Methods of reconstruction of multi-particle events in the new coordinate-tracking setup

    NASA Astrophysics Data System (ADS)

    Vorobyev, V. S.; Shutenko, V. V.; Zadeba, E. A.

    2018-01-01

    At the Unique Scientific Facility NEVOD (MEPhI), a large coordinate-tracking detector based on drift chambers is being developed for investigations of muon bundles generated by ultrahigh-energy primary cosmic rays. One of the main characteristics of a bundle is its muon multiplicity. Three methods of reconstructing multiple events were investigated: the sequential search method, the straight-line finding method, and the histogram method. The last determines the number of tracks with the same zenith angle in an event, which makes it the most suitable for determining muon multiplicity: because of the large distance to the point where the muons are generated, their trajectories are quasiparallel. The paper presents the results of applying the three reconstruction methods to experimental data, as well as the first results of the detector's operation.
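The histogram method described above can be sketched as binning the reconstructed track zenith angles and reading the multiplicity off the most populated bin; the bin width and angle values below are illustrative assumptions, not the detector's actual parameters:

```python
import numpy as np

def muon_multiplicity(zenith_angles_deg, bin_width_deg=1.0):
    """Estimate muon-bundle multiplicity by histogramming track zenith angles.

    Quasi-parallel bundle tracks share (nearly) the same zenith angle,
    so the most populated angular bin gives the number of bundle tracks.
    """
    bins = np.arange(0.0, 90.0 + bin_width_deg, bin_width_deg)
    counts, _ = np.histogram(zenith_angles_deg, bins=bins)
    return int(counts.max())

# Five quasi-parallel tracks near 30 degrees plus two background tracks.
angles = [30.1, 30.3, 30.6, 30.4, 30.2, 55.0, 12.0]
m = muon_multiplicity(angles)
```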

  7. AMS data production facilities at science operations center at CERN

    NASA Astrophysics Data System (ADS)

    Choutko, V.; Egorov, A.; Eline, A.; Shan, B.

    2017-10-01

    The Alpha Magnetic Spectrometer (AMS) is a high energy physics experiment on board the International Space Station (ISS). This paper presents the hardware and software facilities of the Science Operations Center (SOC) at CERN. Data production is built around a production server, a scalable distributed service which links together a set of different programming modules for science data transformation and reconstruction. The server has the capacity to manage 1000 parallel job producers, i.e. up to 32K logical processors. A monitoring and management tool with a production GUI is also described.

  8. Blob-enhanced reconstruction technique

    NASA Astrophysics Data System (ADS)

    Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso

    2016-09-01

    A method to enhance the quality of the tomographic reconstruction and, consequently, the 3D velocity measurement accuracy, is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first-guess intensity distribution is produced with a standard algebraic method; the distribution is then rebuilt as a sum of Gaussian blobs, based on the location, intensity and size of agglomerates of light intensity surrounding local maxima. The blob substitution regularizes the particle shape, allowing a reduction of the particle discretization errors and of the particles' elongation in the depth direction. The performance of the blob-enhanced reconstruction technique (BERT) is assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes have been tested, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as a predictor for the following iterations). The results confirm the enhancement in velocity measurement accuracy, demonstrating a reduction of the bias error due to ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blob distributions as a predictor further improves the convergence of the reconstruction algorithm, the more so when the blobs are substituted more than once during the process. The BERT process is also applied to multi-resolution (MR) CSMART reconstructions, simultaneously achieving remarkable improvements in the flow field measurements and benefiting from the reduced computational time of the MR approach. Finally, BERT is also tested on experimental data, yielding an increase of the signal-to-noise ratio in the reconstructed flow field and a higher correlation factor in the velocity measurements with respect to the reconstruction in which the particles are not replaced.
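The blob-substitution step at the heart of BERT, rebuilding the intensity field as a sum of Gaussian blobs centred on local maxima, can be sketched in 1D; the real technique operates on 3D agglomerates with per-blob intensity and size estimates, so the fixed width and threshold below are simplifications, not the published algorithm:

```python
import numpy as np

def blob_substitution(field, sigma=1.0, threshold=0.1):
    """Rebuild an intensity field as a sum of Gaussian blobs centred on
    local maxima (1D sketch of the blob-substitution idea)."""
    x = np.arange(field.size)
    rebuilt = np.zeros_like(field, dtype=float)
    for i in range(1, field.size - 1):
        # A local maximum above threshold seeds one Gaussian blob whose
        # amplitude is the peak intensity.
        if field[i] > threshold and field[i] >= field[i - 1] and field[i] >= field[i + 1]:
            rebuilt += field[i] * np.exp(-0.5 * ((x - i) / sigma) ** 2)
    return rebuilt

# Two irregular agglomerates become two regular Gaussian blobs.
raw = np.array([0.0, 0.2, 1.0, 0.3, 0.0, 0.0, 0.4, 0.8, 0.5, 0.0])
smooth = blob_substitution(raw)
```

Replacing the raw agglomerates by analytic blobs is what regularizes the particle shape; in the predictor variant, `smooth` would feed back into the next algebraic iteration.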

  9. Region of interest processing for iterative reconstruction in x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Kopp, Felix K.; Nasirudin, Radin A.; Mei, Kai; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Noël, Peter B.

    2015-03-01

    The recent advancements in graphics card technology have raised the performance of parallel computing and contributed to the introduction of iterative reconstruction methods for x-ray computed tomography in clinical CT scanners. Iterative maximum likelihood (ML) based reconstruction methods are known to reduce image noise and to improve the diagnostic quality of low-dose CT. However, iterative reconstruction of a region of interest (ROI), especially ML based, is challenging, yet for some clinical procedures, such as cardiac CT, only a ROI is needed for diagnosis. A high-resolution reconstruction of the full field of view (FOV) consumes unnecessary computational effort and results in a reconstruction slower than clinically acceptable. In this work, we present an extension and evaluation of an existing ROI processing algorithm, proposing in particular improvements to the equalization between regions inside and outside a ROI. The evaluation was done on data collected from a clinical CT scanner, and the performance of the different algorithms is qualitatively and quantitatively assessed. Our solution to the ROI problem provides an increase in signal-to-noise ratio and leads to visually less noise in the final reconstruction. The reconstruction speed of our technique was observed to be comparable with that of previously proposed techniques. The development of ROI processing algorithms in combination with iterative reconstruction will provide higher diagnostic quality in the near future.

  10. Studies of carbon incorporation on the diamond [100] surface during chemical vapor deposition using density functional theory.

    PubMed

    Cheesman, Andrew; Harvey, Jeremy N; Ashfold, Michael N R

    2008-11-13

    Accurate potential energy surface calculations are presented for many of the key steps involved in diamond chemical vapor deposition on the [100] surface (in its 2 x 1 reconstructed and hydrogenated form). The growing diamond surface was described by using a large (approximately 1500 atoms) cluster model, with the key atoms involved in chemical steps being described by using a quantum mechanical (QM, density functional theory, DFT) method and the bulk of the atoms being described by molecular mechanics (MM). The resulting hybrid QM/MM calculations are more systematic and/or at a higher level of theory than previous work on this growth process. The dominant process for carbon addition, in the form of methyl radicals, is predicted to be addition to a surface radical site, opening of the adjacent C-C dimer bond, insertion, and ultimate ring closure. Other steps such as insertion across the trough between rows of dimer bonds or addition to a neighboring dimer leading to formation of a reconstruction on the next layer may also contribute. Etching of carbon can also occur; the most likely mechanism involves loss of a two-carbon moiety in the form of ethene. The present higher-level calculations confirm that migration of inserted carbon along both dimer rows and chains should be relatively facile, with barriers of approximately 150 kJ mol (-1) when starting from suitable diradical species, and that this step should play an important role in establishing growth of smooth surfaces.

  11. Restoration of singularities in reconstructed phase of crystal image in electron holography.

    PubMed

    Li, Wei; Tanji, Takayoshi

    2014-12-01

    Off-axis electron holography can be used to measure the inner potential of a specimen from its reconstructed phase image and is thus a powerful technique for materials scientists. However, abrupt reversals of contrast from white to black may sometimes occur in a digitally reconstructed phase image, which results in inaccurate information. Such phase distortion is mainly due to the digital reconstruction process and weak electron wave amplitude in some areas of the specimen. Therefore, digital image processing can be applied to the reconstruction and restoration of phase images. In this paper, fringe reconnection processing is applied to phase image restoration of a crystal structure image. The disconnection and wrong connection of interference fringes in the hologram that directly cause a 2π phase jump imperfection are correctly reconnected. Experimental results show that the phase distortion is significantly reduced after the processing. The quality of the reconstructed phase image was improved by the removal of imperfections in the final phase. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
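The 2π phase jumps that such fringe disconnections produce can be illustrated, and in simple one-dimensional cases removed, with standard phase unwrapping; this is a generic sketch of the 2π-jump problem, not the fringe-reconnection processing described in the paper:

```python
import numpy as np

# A smooth reconstructed phase profile with a spurious 2*pi jump, as can
# occur where interference fringes are disconnected in the hologram.
true_phase = np.linspace(0.0, 1.5, 50)
distorted = true_phase.copy()
distorted[30:] -= 2.0 * np.pi   # abrupt white-to-black contrast reversal

# Unwrapping removes jumps larger than pi by adding multiples of 2*pi.
restored = np.unwrap(distorted)
```

For 2D crystal-lattice phase images the problem is harder, since jumps can occur along arbitrary paths, which is why dedicated fringe-level processing is needed there.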

  12. Implementation of GPU accelerated SPECT reconstruction with Monte Carlo-based scatter correction.

    PubMed

    Bexelius, Tobias; Sohlberg, Antti

    2018-06-01

    Statistical SPECT reconstruction can be very time-consuming, especially when compensations for collimator and detector response, attenuation, and scatter are included. This work proposes an accelerated SPECT reconstruction algorithm based on graphics processing unit (GPU) processing. An ordered subset expectation maximization (OSEM) algorithm with CT-based attenuation modelling, depth-dependent Gaussian convolution-based collimator-detector response modelling, and Monte Carlo-based scatter compensation was implemented using OpenCL. The OpenCL implementation was compared against the existing multi-threaded OSEM implementation running on a central processing unit (CPU) in terms of scatter-to-primary ratios, standardized uptake values (SUVs), and processing speed using mathematical phantoms and clinical multi-bed bone SPECT/CT studies. The differences in scatter-to-primary ratios, visual appearance, and SUVs between the GPU and CPU implementations were minor. On the other hand, at its best, the GPU implementation was 24 times faster than the multi-threaded CPU version on a normal 128 × 128 matrix size, 3-bed bone SPECT/CT data set with compensations for collimator and detector response, attenuation, and scatter included. GPU SPECT reconstruction shows great promise as an everyday clinical reconstruction tool.
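The OSEM iteration underlying such reconstructions cycles the multiplicative EM update over ordered subsets of the projection data. A toy sketch on a dense system matrix, with no attenuation, collimator-response, or scatter modelling and with illustrative names, just to show the update structure:

```python
import numpy as np

def osem(A, y, n_subsets=2, n_iter=50):
    """Ordered-subset EM sketch for y ~ Poisson(A @ x), A the system matrix."""
    x = np.ones(A.shape[1])
    # Interleaved projection subsets (rows 0,2,4,... and 1,3,5,...).
    subsets = [np.arange(s, A.shape[0], n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:
            As = A[rows]
            # Multiplicative EM update restricted to this subset.
            ratio = y[rows] / np.maximum(As @ x, 1e-12)
            x = x * (As.T @ ratio) / np.maximum(As.T @ np.ones(len(rows)), 1e-12)
    return x

# Tiny 4-measurement, 2-pixel system with noise-free data.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
x_true = np.array([3.0, 2.0])
x_rec = osem(A, A @ x_true)
```

Each pass over all subsets costs roughly one EM iteration but advances the image several times, which is the source of OSEM's speed-up; the GPU work above parallelizes the forward and back projections (`As @ x` and `As.T @ ratio`) that dominate the cost.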

  13. Parallel image reconstruction for 3D positron emission tomography from incomplete 2D projection data

    NASA Astrophysics Data System (ADS)

    Guerrero, Thomas M.; Ricci, Anthony R.; Dahlbom, Magnus; Cherry, Simon R.; Hoffman, Edward T.

    1993-07-01

    The problem of excessive computational time in 3D positron emission tomography (3D PET) reconstruction is defined, and we present an approach for solving it through the construction of an inexpensive parallel processing system and the adoption of the FAVOR algorithm. Currently, the 3D reconstruction of the 610 images of a total body procedure would require 80 hours, and the 3D reconstruction of the 620 images of a dynamic study would require 110 hours. An inexpensive parallel processing system for 3D PET reconstruction is constructed by integrating board-level products from multiple vendors. The system achieves its computational performance through the use of 6U VME boards carrying four i860 processors each; the processor boards from five manufacturers are discussed from our perspective. The new 3D PET reconstruction algorithm FAVOR (FAst VOlume Reconstructor), which promises a substantial speed improvement, is adopted. Preliminary results from parallelizing FAVOR are used to formulate architectural improvements for this problem. In summary, we address the problem of excessive computational time in 3D PET image reconstruction through the construction of an inexpensive parallel processing system and the parallelization of a 3D reconstruction algorithm that uses the incomplete data set produced by current PET systems.

  14. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report proposes an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, suitable processing of them yields the information needed to create cryptographic keys. The processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics but also for protecting transmitted data in telemedicine complexes.
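One way to realize the idea of paired symmetric keys from reconstructed model parameters is to quantize the parameter estimates and hash them, so that sensor and receiver, reconstructing the same model from diagnostically equivalent signals, derive identical keys. This construction is an illustration of the idea, not the authors' scheme:

```python
import hashlib
import struct

def key_from_model(params, precision=4):
    """Derive a symmetric key from reconstructed model parameters.

    Rounding to a fixed precision makes slightly different parameter
    estimates on the two ends quantize to identical byte strings.
    """
    rounded = [round(p, precision) for p in params]
    blob = b"".join(struct.pack(">d", p) for p in rounded)  # canonical encoding
    return hashlib.sha256(blob).hexdigest()

# Slightly different parameter estimates quantize to the same key.
k_sensor = key_from_model([0.123412, 1.999999, -3.5000004])
k_receiver = key_from_model([0.123448, 2.000001, -3.4999996])
```

The quantization step is the crux of any such scheme: the precision must be coarse enough to absorb estimation noise between the two reconstructions, yet fine enough to keep the key space large.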

  15. 40 CFR 63.9490 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials...) This subpart applies to each new, reconstructed, or existing affected source at your friction materials... solvent mixer (as defined in § 63.9565) at your friction materials manufacturing facility. (c) A solvent...

  16. 40 CFR 63.9490 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials...) This subpart applies to each new, reconstructed, or existing affected source at your friction materials... solvent mixer (as defined in § 63.9565) at your friction materials manufacturing facility. (c) A solvent...

  17. 40 CFR 63.9490 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials...) This subpart applies to each new, reconstructed, or existing affected source at your friction materials... solvent mixer (as defined in § 63.9565) at your friction materials manufacturing facility. (c) A solvent...

  18. 40 CFR 63.9490 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials...) This subpart applies to each new, reconstructed, or existing affected source at your friction materials... solvent mixer (as defined in § 63.9565) at your friction materials manufacturing facility. (c) A solvent...

  19. 24 CFR 58.35 - Categorical exclusions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... use is not changed, and the footprint of the building is not increased in a floodplain or in a wetland..., reconstruction, or rehabilitation of public facilities and improvements (other than buildings) when the...) Rehabilitation of buildings and improvements when the following conditions are met: (i) In the case of a building...

  20. 78 FR 57208 - Notice of Passenger Facility Charge (PFC) Approvals and Disapprovals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... LEVEL: End of runway deicing program--phase 1, runway 34R. Taxiway S pavement reconstruction. Replace carousel 9 and oversized bag belt TU3. Terminal redevelopment program--design and associated technical... APPROVED FOR COLLECTION AND USE: PFC program administration. Design taxiways A, L and B. BRIEF DESCRIPTION...

  1. 78 FR 11593 - Environmental Impact and Related Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... reconstruction activity is in the same location with the same capacity, dimensions, and design as the original... capacity, dimensions, and design as the original road, highway, or bridge as before the declaration, and (B... and design changes to a damaged facility to meet current design standards; (2) repair and...

  2. 40 CFR Table 1 to Subpart Cccccc... - Applicability Criteria and Management Practices for Gasoline Dispensing Facilities With Monthly...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Gallons of Gasoline or More If you own or operate Then you must 1. A new, reconstructed, or existing GDF... fill tube is used, it shall be provided with a submerged drop tube that extends the same distance from...

  3. 40 CFR 63.9490 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Friction Materials...) This subpart applies to each new, reconstructed, or existing affected source at your friction materials... solvent mixer (as defined in § 63.9565) at your friction materials manufacturing facility. (c) A solvent...

  4. 40 CFR 63.7182 - What parts of my facility does this subpart cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE... subpart applies to each new, reconstructed, or existing affected source that you own or operate that manufactures semiconductors. (b) An affected source subject to this subpart is the collection of all...

  5. 40 CFR 52.870 - Identification of plan.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 66101; at the EPA, Air and Radiation Docket and Information Center, Room Number 3334, EPA West Building... Docket at (202) 566-1742. For information on the availability of this material at NARA, call (202) 741... definition of the terms “building, structure, facility, or installation”; “installation”; and “reconstruction...

  6. Reconstruction of dynamical systems from resampled point processes produced by neuron models

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Pavlov, Alexey N.

    2018-04-01

    Characterization of the dynamical features of chaotic oscillations from point processes is based on embedding theorems for non-uniformly sampled signals such as sequences of interspike intervals (ISIs). This theoretical background confirms that attractors can be reconstructed from ISIs generated by chaotically driven neuron models. The quality of such reconstruction depends on the available length of the analyzed dataset. We discuss how data resampling improves the reconstruction for short datasets and show that this effect is observed for different types of spike-generation mechanisms.
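The attractor reconstruction from ISI sequences referred to here is a delay embedding of the interval series into vectors (i_k, i_{k+tau}, ..., i_{k+(m-1)tau}). A minimal sketch, with the embedding dimension, delay, and ISI values chosen purely for illustration:

```python
import numpy as np

def embed(isis, dim=3, delay=1):
    """Delay-embed a sequence of interspike intervals (ISIs) into
    reconstruction vectors (i_k, i_{k+tau}, ..., i_{k+(dim-1)tau})."""
    n = len(isis) - (dim - 1) * delay
    return np.array([isis[k:k + (dim - 1) * delay + 1:delay] for k in range(n)])

# A short illustrative ISI sequence (arbitrary units).
isis = np.array([1.0, 1.2, 0.9, 1.1, 1.3, 0.8, 1.05])
vectors = embed(isis, dim=3, delay=1)
```

Resampling, in the sense discussed above, would lengthen or regularize the `isis` sequence before embedding, increasing the number of reconstruction vectors available from a short recording.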

  7. Reconstruction of audio waveforms from spike trains of artificial cochlea models

    PubMed Central

    Zai, Anja T.; Bhargava, Saurabh; Mesgarani, Nima; Liu, Shih-Chii

    2015-01-01

    Spiking cochlea models describe the analog processing and spike generation process within the biological cochlea. Reconstructing the audio input from the artificial cochlea spikes is therefore useful for understanding the fidelity of the information preserved in the spikes. The reconstruction process is challenging particularly for spikes from the mixed signal (analog/digital) integrated circuit (IC) cochleas because of multiple non-linearities in the model and the additional variance caused by random transistor mismatch. This work proposes an offline method for reconstructing the audio input from spike responses of both a particular spike-based hardware model called the AEREAR2 cochlea and an equivalent software cochlea model. This method was previously used to reconstruct the auditory stimulus based on the peri-stimulus histogram of spike responses recorded in the ferret auditory cortex. The reconstructed audio from the hardware cochlea is evaluated against an analogous software model using objective measures of speech quality and intelligibility; and further tested in a word recognition task. The reconstructed audio under low signal-to-noise (SNR) conditions (SNR < –5 dB) gives a better classification performance than the original SNR input in this word recognition task. PMID:26528113

  8. Time-of-flight PET image reconstruction using origin ensembles.

    PubMed

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-07

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  9. Time-of-flight PET image reconstruction using origin ensembles

    NASA Astrophysics Data System (ADS)

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-01

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  10. Combined analysis of job and task benzene air exposures among workers at four US refinery operations

    PubMed Central

    Shin, Jennifer (Mi); Unice, Ken M; Gaffney, Shannon H; Kreider, Marisa L; Gelatt, Richard H; Panko, Julie M

    2016-01-01

    Workplace air samples analyzed for benzene at four US refineries from 1976 to 2007 were pooled into a single dataset to characterize similarities and differences between job titles, tasks and refineries, and to provide a robust dataset for exposure reconstruction. Approximately 12,000 non-task (>180 min) personal samples associated with 50 job titles and 4000 task (<180 min) samples characterizing 24 tasks were evaluated. Personal air sample data from four individual refineries were pooled based on a number of factors including (1) the consistent sampling approach used by refinery industrial hygienists over time, (2) the use of similar exposure controls, (3) the comparability of benzene content of process streams and end products, (4) the ability to assign uniform job titles and task codes across all four refineries, and (5) our analysis of variance (ANOVA) of the distribution of benzene air concentrations for select jobs/tasks across all four refineries. The jobs and tasks most frequently sampled included those with highest potential contact with refinery product streams containing benzene, which reflected the targeted sampling approach utilized by the facility industrial hygienists. Task and non-task data were analyzed to identify and account for significant differences within job-area, task-job, and task-area categories. This analysis demonstrated that in general, areas with benzene containing process streams were associated with greater benzene air concentrations compared to areas with process streams containing little to no benzene. For several job titles and tasks analyzed, there was a statistically significant decrease in benzene air concentration after 1990. This study provides a job and task-focused analysis of occupational exposure to benzene during refinery operations, and it should be useful for reconstructing refinery workers’ exposures to benzene over the past 30 years. PMID:26862134
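
    The pooling decision in factor (5) rests on a one-way ANOVA of benzene air concentrations across refineries; with toy numbers, the F statistic such a test computes looks like this (the data below are hypothetical, not from the study):

```python
import numpy as np

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square, for k groups and n total observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical benzene measurements (ppm) for one job at three refineries
a = [0.10, 0.12, 0.09, 0.11]
b = [0.11, 0.10, 0.12, 0.13]
c = [0.30, 0.28, 0.33, 0.31]
F = one_way_anova_F([a, b, c])   # large F: refinery c differs, so don't pool it
```

    A small F (relative to the F distribution's critical value) supports pooling the job across refineries; a large one argues for keeping that refinery separate.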

  11. Combined analysis of job and task benzene air exposures among workers at four US refinery operations.

    PubMed

    Burns, Amanda; Shin, Jennifer Mi; Unice, Ken M; Gaffney, Shannon H; Kreider, Marisa L; Gelatt, Richard H; Panko, Julie M

    2017-03-01

    Workplace air samples analyzed for benzene at four US refineries from 1976 to 2007 were pooled into a single dataset to characterize similarities and differences between job titles, tasks and refineries, and to provide a robust dataset for exposure reconstruction. Approximately 12,000 non-task (>180 min) personal samples associated with 50 job titles and 4000 task (<180 min) samples characterizing 24 tasks were evaluated. Personal air sample data from four individual refineries were pooled based on a number of factors including (1) the consistent sampling approach used by refinery industrial hygienists over time, (2) the use of similar exposure controls, (3) the comparability of benzene content of process streams and end products, (4) the ability to assign uniform job titles and task codes across all four refineries, and (5) our analysis of variance (ANOVA) of the distribution of benzene air concentrations for select jobs/tasks across all four refineries. The jobs and tasks most frequently sampled included those with highest potential contact with refinery product streams containing benzene, which reflected the targeted sampling approach utilized by the facility industrial hygienists. Task and non-task data were analyzed to identify and account for significant differences within job-area, task-job, and task-area categories. This analysis demonstrated that in general, areas with benzene containing process streams were associated with greater benzene air concentrations compared to areas with process streams containing little to no benzene. For several job titles and tasks analyzed, there was a statistically significant decrease in benzene air concentration after 1990. This study provides a job and task-focused analysis of occupational exposure to benzene during refinery operations, and it should be useful for reconstructing refinery workers' exposures to benzene over the past 30 years.

  12. Attractor reconstruction for non-linear systems: a methodological note

    USGS Publications Warehouse

    Nichols, J.M.; Nichols, J.D.

    2001-01-01

    Attractor reconstruction is an important step in the process of making predictions for non-linear time-series and in the computation of certain invariant quantities used to characterize the dynamics of such series. The utility of computed predictions and invariant quantities is dependent on the accuracy of attractor reconstruction, which in turn is determined by the methods used in the reconstruction process. This paper suggests methods by which the delay and embedding dimension may be selected for a typical delay coordinate reconstruction. A comparison is drawn between the use of the autocorrelation function and mutual information in quantifying the delay. In addition, a false nearest neighbor (FNN) approach is used in minimizing the number of delay vectors needed. Results highlight the need for an accurate reconstruction in the computation of the Lyapunov spectrum and in prediction algorithms.
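
    One common autocorrelation-based recipe for choosing the delay, which the record compares against mutual information, can be sketched as follows; the 1/e threshold and the test signal are illustrative choices:

```python
import numpy as np

def delay_from_autocorr(x):
    """Choose the embedding delay as the first lag at which the (biased)
    autocorrelation drops below 1/e; mutual information is the record's
    alternative criterion."""
    x = np.asarray(x, float) - np.mean(x)
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]   # lags 0..N-1
    ac = ac / ac[0]                                     # normalize to ac[0] = 1
    below = np.nonzero(ac < 1.0 / np.e)[0]
    return int(below[0]) if below.size else len(x) - 1

# illustrative signal: a sine with period 50 samples
t = np.arange(500)
tau = delay_from_autocorr(np.sin(2 * np.pi * t / 50))   # roughly period/5
```

    The embedding dimension would then be chosen separately, e.g. by the false nearest neighbor test the record describes.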

  13. An iterative reduced field-of-view reconstruction for periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) MRI.

    PubMed

    Lin, Jyh-Miin; Patterson, Andrew J; Chang, Hing-Chiu; Gillard, Jonathan H; Graves, Martin J

    2015-10-01

    To propose a new reduced field-of-view (rFOV) strategy for iterative reconstructions in a clinical environment. Iterative reconstructions can incorporate regularization terms to improve the image quality of periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) MRI. However, the large amount of calculations required for full FOV iterative reconstructions has posed a huge computational challenge for clinical usage. By subdividing the entire problem into smaller rFOVs, the iterative reconstruction can be accelerated on a desktop with a single graphic processing unit (GPU). This rFOV strategy divides the iterative reconstruction into blocks, based on the block-diagonal dominant structure. A near real-time reconstruction system was developed for the clinical MR unit, and parallel computing was implemented using the object-oriented model. In addition, the Toeplitz method was implemented on the GPU to reduce the time required for full interpolation. Using the data acquired from the PROPELLER MRI, the reconstructed images were then saved in the digital imaging and communications in medicine format. The proposed rFOV reconstruction reduced the gridding time by 97%, as the total iteration time was 3 s even with multiple processes running. A phantom study showed that the structure similarity index for rFOV reconstruction was statistically superior to conventional density compensation (p < 0.001). In vivo study validated the increased signal-to-noise ratio, which is over four times higher than with density compensation. Image sharpness index was improved using the regularized reconstruction implemented. The rFOV strategy permits near real-time iterative reconstruction to improve the image quality of PROPELLER images. Substantial improvements in image quality metrics were validated in the experiments. 
The concept of rFOV reconstruction may potentially be applied to other kinds of iterative reconstructions for shortened reconstruction duration.

  14. Markov prior-based block-matching algorithm for superdimension reconstruction of porous media

    NASA Astrophysics Data System (ADS)

    Li, Yang; He, Xiaohai; Teng, Qizhi; Feng, Junxi; Wu, Xiaohong

    2018-04-01

    A superdimension reconstruction algorithm is used for the reconstruction of three-dimensional (3D) structures of a porous medium based on a single two-dimensional image. The algorithm borrows the concepts of "blocks," "learning," and "dictionary" from learning-based superresolution reconstruction and applies them to the 3D reconstruction of a porous medium. In the neighborhood-matching process of the conventional superdimension reconstruction algorithm, the Euclidean distance is used as a criterion, although it may not really reflect the structural correlation between adjacent blocks in an actual situation. Hence, in this study, regular items are adopted as prior knowledge in the reconstruction process, and a Markov prior-based block-matching algorithm for superdimension reconstruction is developed for more accurate reconstruction. The algorithm simultaneously takes into consideration the probabilistic relationship between the already reconstructed blocks in three different perpendicular directions (x, y, and z) and the block to be reconstructed, and the maximum value of the probability product of the blocks to be reconstructed (as found in the dictionary for the three directions) is adopted as the basis for the final block selection. Using this approach, the problem of an imprecise spatial structure caused by a point simulation can be overcome. The problem of artifacts in the reconstructed structure is also addressed through the addition of hard data and by neighborhood matching. To verify the improved reconstruction accuracy of the proposed method, the statistical and morphological features of the results from the proposed method and traditional superdimension reconstruction method are compared with those of the target system. The proposed superdimension reconstruction algorithm is confirmed to enable a more accurate reconstruction of the target system while also eliminating artifacts.
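
    The final block-selection rule, maximizing the product of the three directional transition probabilities, reduces to an argmax over dictionary candidates; the probabilities below are invented for illustration:

```python
import numpy as np

def select_block(p_x, p_y, p_z):
    """Pick the dictionary block maximizing the product of the Markov
    transition probabilities to the already-reconstructed neighbors in the
    x, y, and z directions (toy probabilities, one entry per candidate)."""
    score = np.asarray(p_x) * np.asarray(p_y) * np.asarray(p_z)
    return int(np.argmax(score))

# each candidate block's compatibility with the neighbor in each direction
p_x = [0.2, 0.7, 0.5]
p_y = [0.3, 0.6, 0.9]
p_z = [0.9, 0.5, 0.4]
best = select_block(p_x, p_y, p_z)   # candidate 1 wins: 0.7*0.6*0.5 = 0.21
```

    In the full algorithm these probabilities come from the learned dictionary; a pure Euclidean-distance matcher would instead rank candidates by block difference alone.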

  15. Israeli mothers' meaning reconstruction in the aftermath of homicide.

    PubMed

    Mahat-Shamir, Michal; Leichtentritt, Ronit D

    2016-01-01

    This study is the first to our knowledge to provide an in-depth account of the meanings reconstructed by bereaved Israeli mothers of homicide victims. Homicide survivors tend to receive little or no support from society; this is especially true in Israel, where homicide victims are a neglected population whose voice is socially muted. Constructivist theories have informed understanding of grief, emphasizing the role of meaning reconstruction in adaptation to bereavement, as well as the role of social support in the process of meaning reconstruction. We derived 3 prototypes of meaning from interviews of 12 bereaved mothers: the existential paradox; a bifurcated worldview; and oppression, mortification, and humiliation. Most informants used all 3 prototypes in the process of reconstructing meaning, describing changes in the perception of themselves, the world, and society. However, change was also accompanied by continuity, because participants did not abandon their former worldview while adopting a new one. The findings suggest that meaning reconstruction in the aftermath of homicide is a unique, multifaceted, and contradictory process. Implications for practice are outlined. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Automation of 3D reconstruction of neural tissue from large volume of conventional serial section transmission electron micrographs.

    PubMed

    Mishchenko, Yuriy

    2009-01-30

    We describe an approach for automating the process of reconstruction of neural tissue from serial section transmission electron micrographs. Such reconstructions require 3D segmentation of individual neuronal processes (axons and dendrites) performed in densely packed neuropil. We first detect neuronal cell profiles in each image in a stack of serial micrographs with a multi-scale ridge detector. Short breaks in detected boundaries are interpolated using anisotropic contour completion formulated in a fuzzy-logic framework. Detected profiles from adjacent sections are linked together based on cues such as shape similarity and image texture. The 3D segmentation thus obtained is validated by human operators in a computer-guided proofreading process. Our approach makes possible reconstructions of neural tissue at a final rate of about 5 microm3 per man-hour, as determined primarily by the speed of proofreading. To date we have applied this approach to reconstruct a few blocks of neural tissue from different regions of rat brain totaling over 1000 microm3, and used these to evaluate reconstruction speed, quality, error rates, and the presence of ambiguous locations in neuropil ssTEM imaging data.
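
    The shape-similarity cue used when linking profiles across adjacent sections can be stood in for by intersection-over-union of profile masks; a toy sketch (the actual system also uses image texture, and these pixel sets are invented):

```python
def iou(a, b):
    """Intersection-over-union of two profile pixel sets: a simple
    shape-similarity score between candidate profiles on adjacent sections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# profiles as sets of (row, col) pixels on two adjacent sections
prof_k = {(0, 0), (0, 1), (1, 0), (1, 1)}
cands = [{(5, 5)},                       # far away: no overlap
         {(0, 1), (1, 1), (1, 2)},       # partial overlap
         {(0, 0), (0, 1), (1, 0)}]       # strong overlap
link = max(range(len(cands)), key=lambda i: iou(prof_k, cands[i]))
```

    A linking stage would threshold such scores and hand ambiguous cases to the computer-guided proofreading step.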

  17. Analysis of an Optimized MLOS Tomographic Reconstruction Algorithm and Comparison to the MART Reconstruction Algorithm

    NASA Astrophysics Data System (ADS)

    La Foy, Roderick; Vlachos, Pavlos

    2011-11-01

    An optimally designed MLOS tomographic reconstruction algorithm for use in 3D PIV and PTV applications is analyzed. Using a set of optimized reconstruction parameters, the reconstructions produced by the MLOS algorithm are shown to be comparable to reconstructions produced by the MART algorithm for a range of camera geometries, camera numbers, and particle seeding densities. The resultant velocity field error calculated using PIV and PTV algorithms is further minimized by applying both pre and post processing to the reconstructed data sets.
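
    For reference, one MART sweep applies a multiplicative correction to the voxel intensities per line of sight; the weight matrix and recorded intensities below are toy values, not a PIV camera geometry:

```python
import numpy as np

def mart_sweep(E, W, I, mu=1.0):
    """One MART sweep: for each ray j, multiplicatively correct the voxel
    intensities E so the projection W[j] @ E approaches the measurement I[j].
    W is the (rays x voxels) weight matrix; mu is the relaxation factor."""
    for j in range(W.shape[0]):
        proj = W[j] @ E
        if proj > 0:
            E = E * (I[j] / proj) ** (mu * W[j])
    return E

# toy system: two rays through three voxels
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
I = np.array([2.0, 3.0])
E = np.ones(3)                  # MLOS-style positive initialization
for _ in range(50):
    E = mart_sweep(E, W, I)
res = np.abs(W @ E - I).max()   # projections converge toward measurements
```

    An MLOS step would serve as the initial `E` (zeroing voxels not seen by all cameras), which is what makes it a cheap standalone approximation to the MART result.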

  18. High resolution astrophysical observations using speckle imaging

    NASA Astrophysics Data System (ADS)

    Noyes, R. W.; Nisenson, P.; Papaliolios, C.; Stachnik, R. V.

    1986-04-01

    This report describes progress under a contract to develop a complete astronomical speckle image reconstruction facility and to apply that facility to the solution of astronomical problems. During the course of the contract we have developed the procedures, algorithms, theory and hardware required to perform that function and have made and interpreted astronomical observations of substantial significance. A principal result of the program was development of a photon-counting camera of innovative design, the PAPA detector. Development of this device was, in our view, essential to making the speckle process into a useful astronomical tool, since the principal impediment to that circumstance in the past was the necessity for application of photon noise compensation procedures which were difficult if not impossible to calibrate. The photon camera made this procedure unnecessary and permitted precision image recovery. The result of this effort and the associated algorithm development was an active program of astronomical observation which included investigations into young stellar objects, supergiant structure and measurements of the helium abundance of the early universe. We have also continued research on recovery of high angular resolution images of the solar surface working with scientists at the Sacramento Peak Observatory in this area.

  19. Making Advanced Scientific Algorithms and Big Scientific Data Management More Accessible

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkatakrishnan, S. V.; Mohan, K. Aditya; Beattie, Keith

    2016-02-14

    Synchrotrons such as the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory are known as user facilities. They are sources of extremely bright X-ray beams, and scientists come from all over the world to perform experiments that require these beams. As the complexity of experiments has increased, and the size and rates of data sets have exploded, managing, analyzing and presenting the data collected at synchrotrons has been an increasing challenge. The ALS has partnered with high performance computing, fast networking, and applied mathematics groups to create a "super-facility", giving users simultaneous access to the experimental, computational, and algorithmic resources needed to overcome this challenge. This combination forms an efficient closed loop, where data, despite its high rate and volume, is transferred and processed, in many cases immediately and automatically, on appropriate compute resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beam-time. In this paper, we present work done on advanced tomographic reconstruction algorithms to support users of the 3D micron-scale imaging instrument (Beamline 8.3.2, hard X-ray micro-tomography).


  20. Quantitative proton imaging from multiple physics processes: a proof of concept

    NASA Astrophysics Data System (ADS)

    Bopp, C.; Rescigno, R.; Rousseau, M.; Brasse, D.

    2015-07-01

    Proton imaging is developed in order to improve the accuracy of charged particle therapy treatment planning. It makes it possible to directly map the relative stopping powers of the materials using the information on the energy loss of the protons. In order to reach a satisfactory spatial resolution in the reconstructed images, the position and direction of each particle is recorded upstream and downstream from the patient. As a consequence of individual proton detection, information on the transmission rate and scattering of the protons is available. Image reconstruction processes are proposed to make use of this information. A proton tomographic acquisition of an anthropomorphic head phantom was simulated. The transmission rate of the particles was used to reconstruct a map of the macroscopic cross section for nuclear interactions of the materials. A two-step iterative reconstruction process was implemented to reconstruct a map of the inverse scattering length of the materials using the scattering of the protons. Results indicate that, while the reconstruction processes should be optimized, it is possible to extract quantitative information from the transmission rate and scattering of the protons. This suggests that proton imaging could provide additional knowledge on the materials that may be of use to further improve treatment planning.
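
    The transmission-rate reconstruction rests on the exponential attenuation law: taking logarithms turns T_j = exp(-sum_i L_ji * Sigma_i) into a linear system in the macroscopic cross sections, sketched here with a hypothetical two-voxel geometry:

```python
import numpy as np

def sigma_from_transmission(L, T):
    """Recover per-voxel macroscopic nuclear cross sections Sigma from
    proton transmission rates T: -ln(T_j) = sum_i L[j, i] * Sigma_i,
    where L[j, i] is the path length of ray j in voxel i (least squares)."""
    b = -np.log(np.asarray(T))
    sigma, *_ = np.linalg.lstsq(np.asarray(L, float), b, rcond=None)
    return sigma

# toy geometry: three rays through two voxels (path lengths in cm)
L = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
true_sigma = np.array([0.05, 0.10])     # cm^-1, invented values
T = np.exp(-(L @ true_sigma))           # simulated transmission rates
sigma = sigma_from_transmission(L, T)   # recovers true_sigma
```

    The record's scattering-based map requires the separate two-step iterative process described there; this sketch covers only the transmission channel.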

  1. Deep learning methods to guide CT image reconstruction and reduce metal artifacts

    NASA Astrophysics Data System (ADS)

    Gjesteby, Lars; Yang, Qingsong; Xi, Yan; Zhou, Ye; Zhang, Junping; Wang, Ge

    2017-03-01

    The rapidly-rising field of machine learning, including deep learning, has inspired applications across many disciplines. In medical imaging, deep learning has been primarily used for image processing and analysis. In this paper, we integrate a convolutional neural network (CNN) into the computed tomography (CT) image reconstruction process. Our first task is to monitor the quality of CT images during iterative reconstruction and decide when to stop the process according to an intelligent numerical observer instead of using a traditional stopping rule, such as a fixed error threshold or a maximum number of iterations. After training on ground truth images, the CNN was successful in guiding an iterative reconstruction process to yield high-quality images. Our second task is to improve a sinogram to correct for artifacts caused by metal objects. A large number of interpolation and normalization-based schemes were introduced for metal artifact reduction (MAR) over the past four decades. The NMAR algorithm is considered a state-of-the-art method, although residual errors often remain in the reconstructed images, especially in cases of multiple metal objects. Here we merge NMAR with deep learning in the projection domain to achieve additional correction in critical image regions. Our results indicate that deep learning can be a viable tool to address CT reconstruction challenges.
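
    The first task amounts to wrapping iterative reconstruction in an observer-controlled stopping loop. The sketch below uses a plain callable in place of the trained CNN observer; the interface is hypothetical:

```python
def reconstruct_with_observer(step, score, x0, max_iter=100, patience=3):
    """Observer-guided iterative reconstruction: `step` advances the
    reconstruction by one iteration, `score` stands in for the CNN quality
    observer; stop once the score stops improving for `patience` iterations."""
    x, best, stale = x0, float('-inf'), 0
    for _ in range(max_iter):
        x = step(x)
        s = score(x)
        if s > best:
            best, stale = s, 0
        else:
            stale += 1
            if stale >= patience:
                break               # intelligent stopping, not a fixed count
    return x

# toy demo: each "iteration" halves the distance to a known target image value
x_final = reconstruct_with_observer(lambda v: v + 0.5 * (1.0 - v),
                                    lambda v: -abs(v - 1.0),
                                    0.0)
```

    A fixed-error-threshold rule, by contrast, would replace the `score` plateau test with `if err < tol: break`, which is exactly what the CNN observer is meant to improve on.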

  2. Quantitative proton imaging from multiple physics processes: a proof of concept.

    PubMed

    Bopp, C; Rescigno, R; Rousseau, M; Brasse, D

    2015-07-07

    Proton imaging is developed in order to improve the accuracy of charged particle therapy treatment planning. It makes it possible to directly map the relative stopping powers of the materials using the information on the energy loss of the protons. In order to reach a satisfactory spatial resolution in the reconstructed images, the position and direction of each particle is recorded upstream and downstream from the patient. As a consequence of individual proton detection, information on the transmission rate and scattering of the protons is available. Image reconstruction processes are proposed to make use of this information. A proton tomographic acquisition of an anthropomorphic head phantom was simulated. The transmission rate of the particles was used to reconstruct a map of the macroscopic cross section for nuclear interactions of the materials. A two-step iterative reconstruction process was implemented to reconstruct a map of the inverse scattering length of the materials using the scattering of the protons. Results indicate that, while the reconstruction processes should be optimized, it is possible to extract quantitative information from the transmission rate and scattering of the protons. This suggests that proton imaging could provide additional knowledge on the materials that may be of use to further improve treatment planning.

  3. Assessment Processes to Increase the Burden of Existing Buildings Using BIM

    NASA Astrophysics Data System (ADS)

    Szeląg, Romuald

    2017-10-01

    Reconstruction of buildings is often associated with the need to adapt them to increased loads. When access to the archived project documentation is restricted, technical means are needed to obtain the structural parameters within a fairly short time. BIM, now widespread in the design process, can also be used effectively to identify existing facilities before strengthening work or adaptation to increased load requirements. Data obtained from surveys and macroscopic examination are then used in numerical processing aimed at developing a numerical model that reflects the actual parameters of the existing structure, which allows a better look at the object and at the future strengthening process. This article identifies possibilities for using BIM to identify buildings and engineering structures and indicates the data that need to be obtained during preliminary work. The model-based solutions introduced enable multi-criteria analysis in choosing the most favorable solutions in terms of cost or time during construction. Building a numerical model of the object also allows an authorized person to verify the inventoried solutions at every step, and enables tracking of any deviations found relative to the parameters established at the initial stage. In the event of significant deviations, the completed calculations can be rapidly revised and alternative solutions presented. Software using BIM technology is increasingly common, and knowledge of such solutions will in a short time become the standard for most buildings and engineering structures. The use of modern solutions based on the described processes is discussed on the example of an industrial facility where new equipment had to be installed and the structure adapted to its technical parameters.

  4. SPIDER image processing for single-particle reconstruction of biological macromolecules from electron micrographs

    PubMed Central

    Shaikh, Tanvir R; Gao, Haixiao; Baxter, William T; Asturias, Francisco J; Boisset, Nicolas; Leith, Ardean; Frank, Joachim

    2009-01-01

    This protocol describes the reconstruction of biological molecules from the electron micrographs of single particles. Computation here is performed using the image-processing software SPIDER and can be managed using a graphical user interface, termed the SPIDER Reconstruction Engine. Two approaches are described to obtain an initial reconstruction: random-conical tilt and common lines. Once an existing model is available, reference-based alignment can be used, a procedure that can be iterated. Also described is supervised classification, a method to look for homogeneous subsets when multiple known conformations of the molecule may coexist. PMID:19180078

  5. Lander Trajectory Reconstruction computer program

    NASA Technical Reports Server (NTRS)

    Adams, G. L.; Bradt, A. J.; Ferguson, J. B.; Schnelker, H. J.

    1971-01-01

    The Lander Trajectory Reconstruction (LTR) computer program is a tool for analysis of the planetary entry trajectory and atmosphere reconstruction process for a lander or probe. The program can be divided into two parts: (1) the data generator and (2) the reconstructor. The data generator provides the real environment in which the lander or probe is presumed to find itself. The reconstructor reconstructs the entry trajectory and atmosphere using sensor data generated by the data generator and a Kalman-Schmidt consider filter. A wide variety of vehicle and environmental parameters may be either solved-for or considered in the filter process.
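
    The reconstructor's core is a Kalman-type measurement update; a minimal sketch of the standard update with toy numbers (the Schmidt "consider" bookkeeping for parameters that are considered rather than solved for is omitted):

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: state x and covariance P are
    corrected by measurement z with observation matrix H and noise R."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)                 # state correction
    P = (np.eye(len(x)) - K @ H) @ P        # covariance correction
    return x, P

# toy: a 1-D altitude state observed directly, values invented
x = np.array([100.0]); P = np.array([[25.0]])
H = np.array([[1.0]]); R = np.array([[25.0]])
x, P = kalman_update(x, P, np.array([90.0]), H, R)
# equal prior and measurement variance: estimate moves halfway, to 95
```

    In the LTR setting, `z` would be the sensor data produced by the data generator and `x` the entry-trajectory and atmosphere state being reconstructed.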

  6. Liquid Argon TPC Signal Formation, Signal Processing and Hit Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baller, Bruce

    2017-03-11

    This document describes the early stage of the reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions requires knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise.
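
    Wire-signal deconvolution of this kind can be sketched as frequency-domain division by the electronics response, with a small regularizer standing in for the experiments' full noise filtering; the response shape and sizes are illustrative:

```python
import numpy as np

def deconvolve(signal, response, eps=1e-3):
    """Frequency-domain deconvolution of a wire signal by the electronics
    response. The eps term is a crude regularizer in place of the low-pass
    noise filter a real deconvolution kernel would include."""
    S = np.fft.rfft(signal)
    Rf = np.fft.rfft(response, n=len(signal))
    return np.fft.irfft(S * np.conj(Rf) / (np.abs(Rf) ** 2 + eps),
                        n=len(signal))

# toy: an impulse of ionization charge convolved with a decaying response
resp = np.exp(-np.arange(64) / 5.0)
charge = np.zeros(64); charge[10] = 1.0
wire = np.fft.irfft(np.fft.rfft(charge) * np.fft.rfft(resp, n=64), n=64)
recovered = deconvolve(wire, resp)   # sharp hit restored near sample 10
```

    Hit reconstruction then operates on the deconvolved, standardized waveform rather than on the raw wire signal.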

  7. Defining event reconstruction of digital crime scenes.

    PubMed

    Carrier, Brian D; Spafford, Eugene H

    2004-11-01

    Event reconstruction plays a critical role in solving physical crimes by explaining why a piece of physical evidence has certain characteristics. With digital crimes, the current focus has been on the recognition and identification of digital evidence using an object's characteristics, but not on the identification of the events that caused the characteristics. This paper examines digital event reconstruction and proposes a process model and procedure that can be used for a digital crime scene. The model has been designed so that it can apply to physical crime scenes, can support the unique aspects of a digital crime scene, and can be implemented in software to automate part of the process. We also examine the differences between physical event reconstruction and digital event reconstruction.

  8. [The characteristics of computer simulation of traffic accidents].

    PubMed

    Zou, Dong-Hua; Liu, Ning-Guo; Chen, Jian-Guo; Jin, Xian-Long; Zhang, Xiao-Yun; Zhang, Jian-Hua; Chen, Yi-Jiu

    2008-12-01

    To reconstruct the collision process of a traffic accident and the injury mode of the victim by computer simulation in the forensic assessment of traffic accidents. Forty actual accidents were reconstructed with simulation software on a high-performance computer, based on analysis of the trace evidence at the scene, the damage to the vehicles, and the injuries of the victims, with 2 cases discussed in detail. The reconstruction correlated very well with these parameters in 28 cases, well in 9 cases, and suboptimally in 3 cases. Accurate reconstruction of the accident is helpful for assessing the injury mechanism of the victims. Reconstruction of the collision process of a traffic accident and the injury mechanism of the victim by computer simulation is useful in traffic accident assessment.

  9. 75 FR 28626 - Subcommittee on Procedures Review, Advisory Board on Radiation and Worker Health (ABRWH...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ...''), OTIB-0051 (``Effect of Threshold Energy and Angular Response of NTA Film on Missed Neutron Dose at the... Reconstruction During Residual Radioactivity Periods at Atomic Weapons Employer Facilities''), and TBD 6000 (``Site Profile for Atomic Weapons Employers that Worked Uranium and Thorium Metals''); and a continuation...

  10. Center for Nondestructive Evaluation - Center for Nondestructive Evaluation

    Science.gov Websites

    Facilities are available for the full range of inspection methods, housed in a 52,000 sq. ft. facility with over $5M in equipment. Earlier work (through 1990) included development of NDE methods for application to DOE energy and weapons programs, including methods for enhanced frequency bandwidth and improved flaw reconstruction, and novel methods for poling.

  11. 40 CFR 60.489a - List of chemicals produced by affected facilities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 7 2014-07-01 2014-07-01 false List of chemicals produced by affected... Equipment Leaks of VOC in the Synthetic Organic Chemicals Manufacturing Industry for Which Construction, Reconstruction, or Modification Commenced After November 7, 2006 § 60.489a List of chemicals produced by affected...

  12. Analysis of 4D Modeling for Use by the Naval Facilities Engineering Command

    DTIC Science & Technology

    2004-08-01

    Today, 4D modeling is being used to build Space Mountain at the new Hong Kong Disneyland theme park. Additionally, the technology is being used for the reconstruction of the 26-year-old Space Mountain at the Disneyland in Anaheim.

  13. When a School Burns, Cool Heads and Quick Action Keep Education on Track.

    ERIC Educational Resources Information Center

    Parry, Robert; Burris, Carol

    1988-01-01

    A fire destroyed an elementary school in the East Rockaway (New York) school system. A substitute facility, furniture, and textbooks were secured and classes opened, missing only four school days. Future precautions include insurance to cover actual reconstruction costs, smoke detectors, and a computerized inventory system. (MLF)

  14. 40 CFR 60.52b - Standards for municipal waste combustor metals, acid gases, organics, and nitrogen oxides.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SOURCES Standards of Performance for Large Municipal Waste Combustors for Which Construction is Commenced... section. (i) For affected facilities that commenced construction, modification, or reconstruction after September 20, 1994, and on or before December 19, 2005, the emission limit is 24 milligrams per dry standard...

  15. Oxygen Assessments Ensure Safer Medical Devices

    NASA Technical Reports Server (NTRS)

    2013-01-01

    A team at White Sands Test Facility developed a test method to evaluate fire hazards in oxygen-enriched environments. Wendell Hull and Associates, located in Las Cruces, New Mexico, entered a Space Act Agreement with NASA and now provides services including fire and explosion investigations, oxygen testing and training, and accident reconstruction and forensic engineering.

  16. Estimation of point source fugitive emission rates from a single sensor time series: a conditionally-sampled Gaussian plume reconstruction

    EPA Science Inventory

    This paper presents a technique for determining the trace gas emission rate from a point source. The technique was tested using data from controlled methane release experiments and from measurement downwind of a natural gas production facility in Wyoming. Concentration measuremen...

  17. Remote inspection with multi-copters, radiological sensors and SLAM techniques

    NASA Astrophysics Data System (ADS)

    Carvalho, Henrique; Vale, Alberto; Marques, Rúben; Ventura, Rodrigo; Brouwer, Yoeri; Gonçalves, Bruno

    2018-01-01

    Activated material can be found in different scenarios, such as nuclear reactor facilities or medical facilities (e.g., in positron emission tomography, commonly known as PET scanning). In addition, there are unexpected scenarios resulting from possible accidents, or where dangerous material is hidden for terrorist attacks using nuclear weapons. Thus, a technological solution for fast and reliable remote inspection is important. The multi-copter is a common type of Unmanned Aerial Vehicle (UAV) that is able to perform a first radiological inspection in the described scenarios. The paper proposes a solution with a multi-copter equipped with on-board sensors to perform a 3D reconstruction and a radiological mapping of the scenario; the sensors used are a depth camera and a Geiger-Müller counter. The inspection is performed in two steps: i) a 3D reconstruction of the environment, and ii) radiation activity inference to localise and quantify sources of radiation. Experimental results were achieved with real 3D data and simulated radiation activity. Experimental tests with real sources of radiation are planned in the next iteration of the work.

  18. A review of GPU-based medical image reconstruction.

    PubMed

    Després, Philippe; Jia, Xun

    2017-10-01

    Tomographic image reconstruction is a computationally demanding task, even more so when advanced models are used to describe a more complete and accurate picture of the image formation process. Such advanced modeling and reconstruction algorithms can lead to better images, often with less dose, but at the price of long calculation times that are hardly compatible with clinical workflows. Fortunately, reconstruction tasks can often be executed advantageously on Graphics Processing Units (GPUs), which are exploited as massively parallel computational engines. This review paper focuses on recent developments made in GPU-based medical image reconstruction, from a CT, PET, SPECT, MRI and US perspective. Strategies and approaches to get the most out of GPUs in image reconstruction are presented as well as innovative applications arising from an increased computing capacity. The future of GPU-based image reconstruction is also envisioned, based on current trends in high-performance computing. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. [Application of Fourier transform profilometry in 3D-surface reconstruction].

    PubMed

    Shi, Bi'er; Lu, Kuan; Wang, Yingting; Li, Zhen'an; Bai, Jing

    2011-08-01

    With the improvement of system frames and reconstruction methods in fluorescence molecular tomography (FMT), the FMT technique has been widely used as an important experimental tool in biomedical research. It is necessary to obtain the 3D surface profile of the experimental object as a boundary constraint for FMT reconstruction algorithms. We proposed a new 3D surface reconstruction method based on Fourier transform profilometry (FTP) under blue-purple light illumination. The slice images were reconstructed using appropriate image processing, frequency-spectrum analysis, and filtering. Experimental results showed that the method properly reconstructed the 3D surface of objects and achieved mm-level accuracy. Compared to other methods, this one is simple and fast. Beyond reconstruction quality, the proposed method can help monitor the behavior of the object during the experiment to ensure correspondence throughout the imaging process. Furthermore, the method uses the blue-purple band as its light source to avoid interference with fluorescence imaging.

  20. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  1. Fast data reconstructed method of Fourier transform imaging spectrometer based on multi-core CPU

    NASA Astrophysics Data System (ADS)

    Yu, Chunchao; Du, Debiao; Xia, Zongze; Song, Li; Zheng, Weijian; Yan, Min; Lei, Zhenggang

    2017-10-01

    An imaging spectrometer can capture a two-dimensional spatial image and a one-dimensional spectrum at the same time, which makes it highly useful in color and spectral measurements, true-color image synthesis, military reconnaissance, and so on. In order to realize fast reconstruction processing of Fourier transform imaging spectrometer data, the paper designed an optimized reconstruction algorithm using OpenMP parallel computing technology, which was applied to the optimization process for the HyperSpectral Imager of the `HJ-1' Chinese satellite. The results show that the method based on multi-core parallel computing can fully exploit the multi-core CPU hardware resources and significantly enhance the efficiency of spectrum reconstruction processing. If the technology is applied in parallel on workstations with more cores, it will be possible to complete real-time Fourier transform imaging spectrometer data processing with a single computer.
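The row-parallel reconstruction strategy can be sketched in Python with a thread pool standing in for OpenMP worker threads; the toy interferogram cube and the magnitude-spectrum step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reconstruct_row(interferogram_row):
    """The spectrum is the magnitude of the FFT of one pixel's interferogram."""
    return np.abs(np.fft.rfft(interferogram_row))

def reconstruct_cube(cube, workers=4):
    # Rows are independent, so they can be partitioned across workers,
    # analogous to OpenMP partitioning loop iterations across cores.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.array(list(pool.map(reconstruct_row, cube)))

rng = np.random.default_rng(0)
cube = rng.normal(size=(64, 256))    # 64 pixels x 256 OPD samples (toy data)
spectra = reconstruct_cube(cube)     # shape (64, 129)
```

Because every pixel's spectrum is computed independently, the speedup scales with the number of cores until memory bandwidth dominates, which is the effect the abstract reports.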

  2. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications

    PubMed Central

    Sechopoulos, Ioannis

    2013-01-01

    Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms, computer aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part will review the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  3. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  4. Facile multi-dimensional profiling of chemical gradients at the millimetre scale.

    PubMed

    Chen, Chih-Lin; Hsieh, Kai-Ta; Hsu, Ching-Fong; Urban, Pawel L

    2016-01-07

    A vast number of conventional physicochemical methods are suitable for the analysis of homogeneous samples. However, in various cases samples exhibit intrinsic heterogeneity. Tomography allows one to record approximate distributions of chemical species in three-dimensional space. Here we develop a simple optical tomography system that scans non-homogeneous samples at different wavelengths. It takes advantage of inexpensive open-source electronics and simple algorithms. The analysed samples are illuminated by a miniature LCD/LED screen which emits light at three wavelengths (598, 547 and 455 nm, corresponding to the R, G, and B channels, respectively). For each wavelength, the sample vial is rotated by ∼180° and videoed at 30 frames per second. The RGB values of pixels in the obtained digital snapshots are subsequently collated and processed to produce sinograms. Following the inverse Radon transform, approximate quasi-three-dimensional images are reconstructed for each wavelength. Sample components with distinct visible light absorption spectra (myoglobin, methylene blue) can be resolved. The system was used to follow dynamic changes in non-homogeneous samples in real time, to visualize binary mixtures, to reconstruct reaction-diffusion fronts formed during the reduction of 2,6-dichlorophenolindophenol by ascorbic acid, and to visualize the distribution of fungal mycelium grown in a semi-solid medium.
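The sinogram-and-reconstruction pipeline can be sketched as follows. This is a hedged illustration only: it uses unfiltered back-projection rather than a full inverse Radon transform, and the image size, projection angles, and absorbing patch are assumed toy values.

```python
import numpy as np
from scipy.ndimage import rotate

def sinogram(image, angles):
    """One projection per angle: rotate the sample, sum along columns."""
    return np.array([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])

def back_project(sino, angles, size):
    """Unfiltered back-projection: smear each projection back and unrotate."""
    recon = np.zeros((size, size))
    for proj, a in zip(sino, angles):
        recon += rotate(np.tile(proj, (size, 1)), -a, reshape=False, order=1)
    return recon / len(angles)

img = np.zeros((64, 64))
img[24:32, 40:48] = 1.0                       # off-centre absorbing patch
angles = np.linspace(0.0, 180.0, 36, endpoint=False)
sino = sinogram(img, angles)
recon = back_project(sino, angles, 64)
peak = np.unravel_index(np.argmax(recon), recon.shape)  # lands in the patch
```

Unfiltered back-projection blurs edges (the classic 1/r artifact); applying a ramp filter to each projection before smearing would turn this into filtered back-projection, the usual discrete inverse Radon transform.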

  5. Development of monolithic pixel detector with SOI technology for the ILC vertex detector

    NASA Astrophysics Data System (ADS)

    Yamada, M.; Ono, S.; Tsuboyama, T.; Arai, Y.; Haba, J.; Ikegami, Y.; Kurachi, I.; Togawa, M.; Mori, T.; Aoyagi, W.; Endo, S.; Hara, K.; Honda, S.; Sekigawa, D.

    2018-01-01

    We have been developing a monolithic pixel sensor for the International Linear Collider (ILC) vertex detector with the 0.2 μm FD-SOI CMOS process by LAPIS Semiconductor Co., Ltd. We aim to achieve a 3 μm single-point resolution required for the ILC with a 20×20 μm2 pixel. Beam bunch crossing at the ILC occurs every 554 ns in 1-msec-long bunch trains with an interval of 200 ms. Each pixel must record the charge and time stamp of a hit to identify a collision bunch for event reconstruction. Necessary functions include the amplifier, comparator, shift register, analog memory and time stamp implementation in each pixel, and column ADC and Zero-suppression logic on the chip. We tested the first prototype sensor, SOFIST ver.1, with a 120 GeV proton beam at the Fermilab Test Beam Facility in January 2017. SOFIST ver.1 has a charge sensitive amplifier and two analog memories in each pixel, and an 8-bit Wilkinson-type ADC is implemented for each column on the chip. We measured the residual of the hit position to the reconstructed track. The standard deviation of the residual distribution fitted by a Gaussian is better than 3 μm.

  6. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1993-12-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk. Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year; therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  7. Commercial milk distribution profiles and production locations. Hanford Environmental Dose Reconstruction Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.

    1994-04-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.

  8. GPU-based prompt gamma ray imaging from boron neutron capture therapy.

    PubMed

    Yoon, Do-Kun; Jung, Joo-Young; Jo Hong, Key; Sil Lee, Keum; Suk Suh, Tae

    2015-01-01

    The purpose of this research is to perform the fast reconstruction of a prompt gamma ray image using a graphics processing unit (GPU) computation from boron neutron capture therapy (BNCT) simulations. To evaluate the accuracy of the reconstructed image, a phantom including four boron uptake regions (BURs) was used in the simulation. After the Monte Carlo simulation of the BNCT, the modified ordered subset expectation maximization reconstruction algorithm using the GPU computation was used to reconstruct the images with fewer projections. The computation times for image reconstruction were compared between the GPU and the central processing unit (CPU). Also, the accuracy of the reconstructed image was evaluated by a receiver operating characteristic (ROC) curve analysis. The image reconstruction time using the GPU was 196 times faster than the conventional reconstruction time using the CPU. For the four BURs, the area under curve values from the ROC curve were 0.6726 (A-region), 0.6890 (B-region), 0.7384 (C-region), and 0.8009 (D-region). The tomographic image using the prompt gamma ray event from the BNCT simulation was acquired using the GPU computation in order to perform a fast reconstruction during treatment. The authors verified the feasibility of the prompt gamma ray image reconstruction using the GPU computation for BNCT simulations.
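The ordered-subset EM update described above can be sketched on the CPU with numpy; the random system matrix stands in for the BNCT projection geometry, and all sizes, subset counts, and iteration counts are illustrative assumptions.

```python
import numpy as np

def osem(A, y, n_subsets=4, n_iters=50):
    """Ordered-subset EM: one multiplicative EM update per projection subset."""
    x = np.ones(A.shape[1])                        # flat initial image
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iters):
        for s in subsets:
            As = A[s]
            ratio = y[s] / (As @ x + 1e-12)        # measured / forward-projected
            x *= (As.T @ ratio) / (As.T.sum(axis=1) + 1e-12)
    return x

rng = np.random.default_rng(1)
A = rng.uniform(size=(120, 16))                    # projections x voxels (toy)
x_true = rng.uniform(0.5, 2.0, size=16)
y = A @ x_true                                     # noiseless measurements
x_hat = osem(A, y)
rel_resid = np.linalg.norm(A @ x_hat - y) / np.linalg.norm(y)
```

Using fewer subsets recovers plain MLEM; more subsets accelerate convergence per pass, which is what makes OSEM attractive when each update is further parallelized on a GPU as in the paper.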

  9. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Qing; Wang, Jiang; Yu, Haitao

    Mathematical models provide a mathematical description of neuron activity, which can better understand and quantify neural computations and corresponding biophysical mechanisms evoked by stimulus. In this paper, based on the output spike train evoked by the acupuncture mechanical stimulus, we present two different levels of models to describe the input-output system to achieve the reconstruction of neuronal input. The reconstruction process is divided into two steps: First, considering the neuronal spiking event as a Gamma stochastic process. The scale parameter and the shape parameter of Gamma process are, respectively, defined as two spiking characteristics, which are estimated by a state-space method. Then, leaky integrate-and-fire (LIF) model is used to mimic the response system and the estimated spiking characteristics are transformed into two temporal input parameters of LIF model, through two conversion formulas. We test this reconstruction method by three different groups of simulation data. All three groups of estimates reconstruct input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus conditions, estimated input parameters have an obvious difference. The higher the frequency of the acupuncture stimulus is, the higher the accuracy of reconstruction is.

  10. Redundancy Analysis of Capacitance Data of a Coplanar Electrode Array for Fast and Stable Imaging Processing

    PubMed Central

    Wen, Yintang; Zhang, Zhenda; Zhang, Yuyan; Sun, Dongtao

    2017-01-01

    A coplanar electrode array sensor is established for imaging defects in composite-material adhesive layers. The sensor is based on the capacitive edge effect, which makes the capacitance data considerably weak and susceptible to environmental noise. The inverse problem of coplanar array electrical capacitance tomography (C-ECT) is ill-conditioned, so a small error in the capacitance data can seriously degrade the quality of reconstructed images. In order to achieve a stable image reconstruction process, a redundancy analysis method for capacitance data is proposed, based on contribution rate and anti-interference capability. According to the redundancy analysis, the capacitance data are divided into valid and invalid data. When the image is reconstructed from valid data only, the sensitivity matrix needs to be changed accordingly; singular value decomposition (SVD) is used to evaluate the effectiveness of the resulting sensitivity map. Finally, the two-dimensional (2D) and three-dimensional (3D) images are reconstructed by the Tikhonov regularization method. Compared with images reconstructed from the raw capacitance data, the stability of the image reconstruction process is improved while the quality of the reconstructed images is not degraded. As a result, much invalid data need not be collected, and the data acquisition time can be reduced. PMID:29295537
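The Tikhonov step can be sketched as a regularized least-squares solve; the random sensitivity matrix and the regularization weight below are illustrative assumptions, not the paper's C-ECT model.

```python
import numpy as np

def tikhonov(S, c, lam):
    """Solve min ||S g - c||^2 + lam ||g||^2 via the normal equations."""
    return np.linalg.solve(S.T @ S + lam * np.eye(S.shape[1]), S.T @ c)

rng = np.random.default_rng(0)
S = rng.normal(size=(66, 32))                 # electrode pairs x pixels (toy)
g_true = rng.normal(size=32)                  # permittivity distribution
c = S @ g_true + 0.01 * rng.normal(size=66)   # noisy capacitance data
g_hat = tikhonov(S, c, lam=1e-2)
rel_err = np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true)
```

The regularization term lam damps the small singular values of S that otherwise amplify measurement noise, which is exactly why the paper pairs Tikhonov reconstruction with an SVD check of the sensitivity map.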

  11. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    NASA Astrophysics Data System (ADS)

    Qin, Qing; Wang, Jiang; Yu, Haitao; Deng, Bin; Chan, Wai-lok

    2016-06-01

    Mathematical models provide a mathematical description of neuron activity, which can better understand and quantify neural computations and corresponding biophysical mechanisms evoked by stimulus. In this paper, based on the output spike train evoked by the acupuncture mechanical stimulus, we present two different levels of models to describe the input-output system to achieve the reconstruction of neuronal input. The reconstruction process is divided into two steps: First, considering the neuronal spiking event as a Gamma stochastic process. The scale parameter and the shape parameter of Gamma process are, respectively, defined as two spiking characteristics, which are estimated by a state-space method. Then, leaky integrate-and-fire (LIF) model is used to mimic the response system and the estimated spiking characteristics are transformed into two temporal input parameters of LIF model, through two conversion formulas. We test this reconstruction method by three different groups of simulation data. All three groups of estimates reconstruct input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus conditions, estimated input parameters have an obvious difference. The higher the frequency of the acupuncture stimulus is, the higher the accuracy of reconstruction is.
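The LIF response model referred to above can be sketched in a few lines; the membrane constants, threshold, and constant input drive are assumed values, not the parameters estimated in the paper.

```python
import numpy as np

def lif_spike_times(i_input, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: integrate the input, fire at threshold, reset."""
    v, spikes = 0.0, []
    for k, i_k in enumerate(i_input):
        v += dt / tau * (-v + i_k)      # leaky membrane integration
        if v >= v_th:                   # threshold crossing -> spike
            spikes.append(k * dt)
            v = v_reset
    return spikes

current = np.full(2000, 1.5)            # constant suprathreshold drive
spikes = lif_spike_times(current)       # regular spiking, period 11 time units
```

Fitting the input parameters of such a model to an observed spike train is the inverse problem the paper addresses: the spike statistics (here, a perfectly regular train) constrain the unobserved input.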

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomopy is a Python toolbox to perform x-ray data processing, image reconstruction and data exchange tasks at synchrotron facilities. The dependencies of the software are currently as follows. Python-related: the Python standard library (http://docs.python.org/2/library/), numpy (http://www.numpy.org/), scipy (http://scipy.org/), matplotlib (http://matplotlip.org/), sphinx (http://sphinx-doc.org), pil (http://www.pythonware.com/products/pil/), pyhdf (http://pysclint.sourceforge.net/pyhdf/), h5py (http://www.h5py.org), pywt (http://www.pybytes.com/pywavelets/), file.py (https://pyspec.svn.sourceforge.net/svnroot/pyspec/trunk/pyspec/ccd/files.py). C/C++-related: gridrec (author unknown; C code written back in 1997 that uses the standard C library), fftw (http://www.fftw.org/), tomoRecon (multi-threaded C++ version of gridrec; author: Mark Rivers, APS; http://cars9.uchicago.edu/software/epics/tomoRecon.html), epics (http://www.aps.anl.gov/epics/)

  13. The Learning Reconstruction of Particle System and Linear Momentum Conservation in Introductory Physics Course

    NASA Astrophysics Data System (ADS)

    Karim, S.; Saepuzaman, D.; Sriyansyah, S. P.

    2016-08-01

    This study was initiated by the low achievement of prospective teachers in understanding concepts in an introductory physics course. A problem was identified: students could not develop the thinking skills required for building physics concepts. Therefore, this study reconstructs the learning process with an emphasis on physics concept building. The outcome is a set of physics lesson plans for the concepts of particle systems and linear momentum conservation. A descriptive analysis method is used to investigate the process of learning reconstruction carried out by students, in which the students' conceptual understanding is evaluated using essay tests on the concepts of particle systems and linear momentum conservation. The results show that the learning reconstruction successfully supported the students' understanding of the physics concepts.

  14. Maximising information recovery from rank-order codes

    NASA Astrophysics Data System (ADS)

    Sen, B.; Furber, S.

    2007-04-01

    The central nervous system encodes information in sequences of asynchronously generated voltage spikes, but the precise details of this encoding are not well understood. Thorpe proposed rank-order codes as an explanation of the observed speed of information processing in the human visual system. The work described in this paper is inspired by the performance of SpikeNET, a biologically inspired neural architecture using rank-order codes for information processing, and is based on the retinal model developed by VanRullen and Thorpe. This model mimics retinal information processing by passing an input image through a bank of Difference of Gaussian (DoG) filters and then encoding the resulting coefficients in rank-order. To test the effectiveness of this encoding in capturing the information content of an image, the rank-order representation is decoded to reconstruct an image that can be compared with the original. The reconstruction uses a look-up table to infer the filter coefficients from their rank in the encoded image. Since the DoG filters are approximately orthogonal functions, they are treated as their own inverses in the reconstruction process. We obtained a quantitative measure of the perceptually important information retained in the reconstructed image relative to the original using a slightly modified version of an objective metric proposed by Petrovic. It is observed that around 75% of the perceptually important information is retained in the reconstruction. In the present work we reconstruct the input using a pseudo-inverse of the DoG filter-bank with the aim of improving the reconstruction and thereby extracting more information from the rank-order encoded stimulus. We observe that there is an increase of 10 - 15% in the information retrieved from a reconstructed stimulus as a result of inverting the filter-bank.
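The rank-order encode/decode cycle can be sketched as follows. Note one deliberate simplification: the decoder's look-up table is built from the same coefficients it decodes, so recovery is exact here, whereas in the model the table is an average over many stimuli. The filter sizes and the 1-D test signal are assumptions.

```python
import numpy as np

def dog(sigma1, sigma2, size=15):
    """1-D Difference-of-Gaussians kernel."""
    x = np.arange(size) - size // 2
    g = lambda s: np.exp(-x ** 2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))
    return g(sigma1) - g(sigma2)

signal = np.sin(np.linspace(0, 4 * np.pi, 128))
bank = [dog(s, 2 * s) for s in (1.0, 2.0, 4.0)]       # 3-scale DoG filter bank
coeffs = np.concatenate([np.convolve(signal, f, mode="same") for f in bank])

order = np.argsort(-np.abs(coeffs))   # the rank-order code: indices only
lut = np.abs(coeffs)[order]           # decoder's look-up table (exact here)
decoded = np.zeros_like(coeffs)
decoded[order] = np.sign(coeffs[order]) * lut
```

Only the ordering is transmitted; the decoder infers magnitudes from rank, which is the lossy step whose information cost the paper quantifies before improving the reconstruction with a pseudo-inverse of the filter bank.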

  15. A comparison of manual neuronal reconstruction from biocytin histology or 2-photon imaging: morphometry and computer modeling

    PubMed Central

    Blackman, Arne V.; Grabuschnig, Stefan; Legenstein, Robert; Sjöström, P. Jesper

    2014-01-01

    Accurate 3D reconstruction of neurons is vital for applications linking anatomy and physiology. Reconstructions are typically created using Neurolucida after biocytin histology (BH). An alternative inexpensive and fast method is to use freeware such as Neuromantic to reconstruct from fluorescence imaging (FI) stacks acquired using 2-photon laser-scanning microscopy during physiological recording. We compare these two methods with respect to morphometry, cell classification, and multicompartmental modeling in the NEURON simulation environment. Quantitative morphological analysis of the same cells reconstructed using both methods reveals that whilst biocytin reconstructions facilitate tracing of more distal collaterals, both methods are comparable in representing the overall morphology: automated clustering of reconstructions from both methods successfully separates neocortical basket cells from pyramidal cells but not BH from FI reconstructions. BH reconstructions suffer more from tissue shrinkage and compression artifacts than FI reconstructions do. FI reconstructions, on the other hand, consistently have larger process diameters. Consequently, significant differences in NEURON modeling of excitatory post-synaptic potential (EPSP) forward propagation are seen between the two methods, with FI reconstructions exhibiting smaller depolarizations. Simulated action potential backpropagation (bAP), however, is indistinguishable between reconstructions obtained with the two methods. In our hands, BH reconstructions are necessary for NEURON modeling and detailed morphological tracing, and thus remain state of the art, although they are more labor intensive, more expensive, and suffer from a higher failure rate due to the occasional poor outcome of histological processing. 
However, for a subset of anatomical applications such as cell type identification, FI reconstructions are superior, because of indistinguishable classification performance with greater ease of use, essentially 100% success rate, and lower cost. PMID:25071470

  16. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum †

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from only thermal information is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
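The core of the CCA step, learning paired subspaces from co-registered thermal/visible training vectors and mapping a new thermal vector through them, can be sketched in plain NumPy. This is not the authors' code; the regularization, the pseudo-inverse back-projection, and the toy vectors are assumptions for illustration:

```python
import numpy as np

def fit_cca(X, Y, k, reg=1e-8):
    """Minimal CCA: projection bases A, B and canonical correlations s."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n

    def inv_sqrt(C):  # symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Wx, Wy = inv_sqrt(Cxx), inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy)
    return Wx @ U[:, :k], Wy @ Vt[:k].T, s[:k]

def reconstruct(X, A, B, s, x_mean, y_mean):
    """Map thermal-domain vectors into the visible domain via canonical space."""
    Xc = (X - x_mean) @ A          # thermal -> canonical coordinates
    Yc = Xc * s                    # scale by canonical correlations
    return Yc @ np.linalg.pinv(B) + y_mean
```

With training pairs that are actually related, the learned mapping recovers visible-domain vectors from thermal-domain inputs; the paper applies this once to whole images and once to patches.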

  17. Intermediate view reconstruction using adaptive disparity search algorithm for real-time 3D processing

    NASA Astrophysics Data System (ADS)

    Bae, Kyung-hoon; Park, Changhan; Kim, Eun-soo

    2008-03-01

    In this paper, intermediate view reconstruction (IVR) using an adaptive disparity search algorithm (ADSA) is proposed for real-time 3-dimensional (3D) processing. The proposed algorithm can reduce the processing time of disparity estimation by selecting an adaptive disparity search range. It can also increase the quality of the 3D imaging: by adaptively predicting the mutual correlation between a stereo image pair, the bandwidth of the stereo input pair can be compressed to the level of a conventional 2D image, and a predicted image can be effectively reconstructed using a reference image and disparity vectors. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm improves the PSNR of a reconstructed image by about 4.8 dB compared with conventional algorithms, and reduces the synthesizing time of a reconstructed image to about 7.02 s.
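The adaptive-range idea can be sketched with simple SAD block matching; this is a hypothetical minimal version, not the paper's algorithm (the `prev`/`margin` parameters stand in for the adaptive search-range selection):

```python
import numpy as np

def disparity_sad(left, right, block=3, max_d=8, prev=None, margin=2):
    """Block-matching disparity via sum of absolute differences (SAD).
    If a previous disparity map is given, search only within +/- margin
    of it -- the adaptive search-range idea."""
    h, w = left.shape
    r = block // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(r, h - r):
        for x in range(r + max_d, w - r):
            lo, hi = 0, max_d
            if prev is not None:  # adaptive range around previous estimate
                lo = max(0, prev[y, x] - margin)
                hi = min(max_d, prev[y, x] + margin)
            patch = left[y-r:y+r+1, x-r:x+r+1]
            costs = [np.abs(patch - right[y-r:y+r+1, x-d-r:x-d+r+1]).sum()
                     for d in range(lo, hi + 1)]
            disp[y, x] = lo + int(np.argmin(costs))
    return disp
```

The second pass over a frame evaluates only `2*margin + 1` candidates per pixel instead of `max_d + 1`, which is where the processing-time reduction comes from.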

  18. Event Reconstruction for Many-core Architectures using Java

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Norman A.; /SLAC

    Although Moore's Law remains technically valid, the performance enhancements in computing which traditionally resulted from increased CPU speeds ended years ago. Chip manufacturers have chosen to increase the number of core CPUs per chip instead of increasing clock speed. Unfortunately, these extra CPUs do not automatically result in improvements in simulation or reconstruction times. To take advantage of this extra computing power requires changing how software is written. Event reconstruction is globally serial, in the sense that raw data has to be unpacked first, channels have to be clustered to produce hits before those hits are identified as belonging to a track or shower, tracks have to be found and fit before they are vertexed, etc. However, many of the individual procedures along the reconstruction chain are intrinsically independent and are perfect candidates for optimization using multi-core architecture. Threading is perhaps the simplest approach to parallelizing a program and Java includes a powerful threading facility built into the language. We have developed a fast and flexible reconstruction package (org.lcsim) written in Java that has been used for numerous physics and detector optimization studies. In this paper we present the results of our studies on optimizing the performance of this toolkit using multiple threads on many-core architectures.
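The per-event independence the author exploits maps directly onto a thread pool. A sketch of the pattern (in Python here, for consistency with the other examples in this digest; the org.lcsim toolkit itself is Java, and the stand-in processing chain is illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct_event(raw):
    """Stand-in reconstruction chain: unpack -> cluster -> fit.
    Each event is independent of every other event."""
    hits = [x * 2 for x in raw]   # pretend "unpack and cluster"
    return sum(hits)              # pretend "track fit" summary

def reconstruct_all(events, workers=4):
    """Run the serial per-event chain over many events in parallel."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct_event, events))
```

The chain inside `reconstruct_event` stays serial, as the abstract notes it must; parallelism comes from running many events (or independent sub-steps) concurrently.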

  19. Reverse engineering physical models employing a sensor integration between 3D stereo detection and contact digitization

    NASA Astrophysics Data System (ADS)

    Chen, Liang-Chia; Lin, Grier C. I.

    1997-12-01

    A vision-driven automatic digitization process for free-form surface reconstruction has been developed, with a coordinate measurement machine (CMM) equipped with a touch-triggered probe and a CCD camera, for reverse engineering physical models. The process integrates 3D stereo detection, data filtering, Delaunay triangulation, and adaptive surface digitization into a single surface-reconstruction process. By using this innovative approach, surface reconstruction can be implemented automatically and accurately. Least-squares B-spline surface models with controlled digitization accuracy can be generated for further application in product design and manufacturing processes. One industrial application indicates that this approach is feasible and that the processing time required in the reverse engineering process can be reduced by more than 85%.

  20. Total focusing method with correlation processing of antenna array signals

    NASA Astrophysics Data System (ADS)

    Kozhemyak, O. A.; Bortalevich, S. I.; Loginov, E. L.; Shinyakov, Y. A.; Sukhorukov, M. P.

    2018-03-01

    The article proposes a method of preliminary correlation processing of a complete set of antenna array signals used in the image reconstruction algorithm. The results of experimental studies of 3D reconstruction of various reflectors with and without correlation processing are presented in the article. The software ‘IDealSystem3D’ by IDeal-Technologies was used for the experiments. Copper wires of different diameters located in a water bath were used as reflectors. The use of correlation processing makes it possible to obtain a more accurate reconstruction of the image of the reflectors and to increase the signal-to-noise ratio. The experimental results were processed using an original program. This program allows varying the parameters of the antenna array and the sampling frequency.
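Correlation processing of a pair of array channels can be illustrated with a basic cross-correlation delay estimate; this is a generic sketch of the principle, not the article's algorithm:

```python
import numpy as np

def delay_by_xcorr(a, b):
    """Estimate the integer sample delay of signal b relative to signal a
    via the peak of their full cross-correlation."""
    c = np.correlate(b, a, mode="full")
    return int(np.argmax(c)) - (len(a) - 1)
```

Aligning channels by their estimated delays before summation is one standard way correlation processing raises the signal-to-noise ratio of the focused image.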

  1. Local Surface Reconstruction from MER images using Stereo Workstation

    NASA Astrophysics Data System (ADS)

    Shin, Dongjoe; Muller, Jan-Peter

    2010-05-01

    The authors present a semi-automatic workflow that reconstructs the 3D shape of the martian surface from local stereo images delivered by PanCam or NavCam on systems such as the NASA Mars Exploration Rover (MER) Mission and in the future the ESA-NASA ExoMars rover PanCam. The process is initiated with manually selected tiepoints on a stereo workstation which is then followed by a tiepoint refinement, stereo-matching using region growing and Levenberg-Marquardt Algorithm (LMA)-based bundle adjustment processing. The stereo workstation, which is being developed by UCL in collaboration with colleagues at the Jet Propulsion Laboratory (JPL) within the EU FP7 ProVisG project, includes a set of practical GUI-based tools that enable an operator to define a visually correct tiepoint via a stereo display. To achieve platform and graphic hardware independence, the stereo application has been implemented using JPL's JADIS graphic library which is written in JAVA, and the remaining processing blocks used in the reconstruction workflow have also been developed as a JAVA package to increase code re-usability, portability and compatibility. Although initial tiepoints from the stereo workstation are reasonably acceptable as true correspondences, it is often required to employ an optional validity check and/or quality enhancing process. To meet this requirement, the workflow has been designed to include a tiepoint refinement process based on the Adaptive Least Square Correlation (ALSC) matching algorithm so that the initial tiepoints can be further enhanced to sub-pixel precision or rejected if they fail to pass the ALSC matching threshold. Apart from the accuracy of reconstruction, the other criterion to assess the quality of reconstruction is the density (or completeness) of reconstruction, which the refinement process alone does not provide.
Thus, we re-implemented a stereo region growing process, which is a core matching algorithm within the UCL-HRSC reconstruction workflow. This algorithm's performance is reasonable even for close-range imagery so long as the stereo pair does not have too large a baseline displacement. For post-processing, a Bundle Adjustment (BA) is used to optimise the initial calibration parameters, which bootstrap the reconstruction results. Amongst many options for the non-linear optimisation, the LMA has been adopted due to its stability, so that the BA searches for the best calibration parameters whilst iteratively minimising the re-projection errors of the initial reconstruction points. For the evaluation of the proposed method, its result is compared with the reconstruction from a disparity map provided by JPL using their operational processing system. Visual and quantitative comparisons will be presented as well as updated camera parameters. As part of future work, we will investigate a method for expediting the processing speed of the stereo region growing process and look into the possibility of extending the use of the stereo workstation to orbital image processing. Such an interactive stereo workstation can also be used to digitize points and line features as well as assess the accuracy of stereo processed results produced from other stereo matching algorithms available from within the consortium and elsewhere. It can also provide "ground truth" when suitably refined for stereo matching algorithms, as well as provide visual cues as to why these matching algorithms sometimes fail, so as to mitigate this in the future. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 218814 "PRoVisG".
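The accept/shift/reject logic of the tiepoint refinement step can be approximated with plain normalized cross-correlation. This is a simplified stand-in for ALSC (which additionally solves for affine distortion and reaches sub-pixel precision); the window, search range, and threshold values are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation score between two equal-sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def refine_tiepoint(left, right, x, y, win=2, search=2, thresh=0.9):
    """Shift an initial tiepoint (x, y) to the best-scoring offset within
    +/- search pixels; reject (return None) if the best score is below
    the matching threshold."""
    patch = left[y-win:y+win+1, x-win:x+win+1]
    best, best_xy = -1.0, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = right[y+dy-win:y+dy+win+1, x+dx-win:x+dx+win+1]
            score = ncc(patch, cand)
            if score > best:
                best, best_xy = score, (x + dx, y + dy)
    return best_xy if best >= thresh else None
```

As in the workflow described above, a manually picked tiepoint is either snapped to the locally best match or discarded when no candidate passes the threshold.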

  2. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.

    PubMed

    Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T

    2017-01-01

    Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
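The replicated-reconstruction-object idea, giving each worker its own private accumulator and reducing at the end instead of locking a shared volume, can be sketched as follows. This is a toy stand-in for the pattern, not Trace itself:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def backproject_chunk(shape, rays):
    """Accumulate one worker's rays into its own private replica;
    no synchronization with other workers is needed."""
    local = np.zeros(shape)
    for idx, val in rays:
        local[idx] += val
    return local

def parallel_backproject(shape, rays, workers=4):
    """Split the rays across workers, then reduce the replicas by summation."""
    chunks = [rays[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        replicas = list(pool.map(backproject_chunk, [shape] * workers, chunks))
    return sum(replicas)
```

Trading extra memory (one replica per worker) for lock-free updates is the standard reason this structure scales well across cores and nodes.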

  3. Comparison between various patch wise strategies for reconstruction of ultra-spectral cubes captured with a compressive sensing system

    NASA Astrophysics Data System (ADS)

    Oiknine, Yaniv; August, Isaac Y.; Revah, Liat; Stern, Adrian

    2016-05-01

    Recently we introduced a Compressive Sensing Miniature Ultra-Spectral Imaging (CS-MUSI) system. The system is based on a single Liquid Crystal (LC) cell and a parallel sensor array, where the liquid crystal cell performs spectral encoding. Within the framework of compressive sensing, the CS-MUSI system is able to reconstruct ultra-spectral cubes from only ~10% of the samples required by a conventional system. Despite the compression, the technique is computationally very demanding, because reconstruction of ultra-spectral images requires processing huge data cubes of Gigavoxel size. Fortunately, the computational effort can be alleviated by using separable operations. An additional way to reduce the reconstruction effort is to perform the reconstructions on patches. In this work, we consider processing on various patch shapes. We present an experimental comparison between various patch shapes chosen to process the ultra-spectral data captured with the CS-MUSI system. The patches may be one dimensional (1D), for which the reconstruction is carried out spatially pixel-wise, or two dimensional (2D), working on spatial rows/columns of the ultra-spectral cube, as well as three dimensional (3D).
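The 1D/2D/3D patch strategies amount to different ways of slicing the (x, y, λ) cube into independent reconstruction problems; a minimal sketch (the function name and mode labels are illustrative, not from the paper):

```python
import numpy as np

def extract_patches(cube, mode):
    """Split an (x, y, lambda) cube into independent reconstruction problems."""
    X, Y, L = cube.shape
    if mode == "1d":            # one spectrum per spatial pixel
        return cube.reshape(X * Y, L)
    if mode == "2d":            # one spatial row (with all bands) at a time
        return [cube[i] for i in range(X)]
    if mode == "3d":            # the whole cube as a single block
        return [cube]
    raise ValueError(mode)
```

Smaller patches mean cheaper individual solves (the 1D case is embarrassingly parallel), while larger patches let the solver exploit spatial structure; the paper's experiments compare exactly this trade-off.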

  4. Quantitative fractography by digital image processing: NIH Image macro tools for stereo pair analysis and 3-D reconstruction.

    PubMed

    Hein, L R

    2001-10-01

    A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on time processing, scanning techniques and programming concepts are also discussed.
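One of the listed operations, anaglyph representation of a stereo pair, is simple enough to sketch directly: the left image feeds the red channel and the right image the green and blue channels (a generic sketch of the technique, not the macro code):

```python
import numpy as np

def anaglyph(left, right):
    """Red-cyan anaglyph from a grayscale stereo pair (values in [0, 1]).
    Viewed through red-cyan glasses, each eye sees one image of the pair."""
    return np.dstack([left, right, right])
```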

  5. The yellowed archives of yellowcake.

    PubMed Central

    Silver, K

    1996-01-01

    Extensive historical documentation of exposures and releases at government-owned energy facilities is a unique and valuable resource for analyzing and communicating health risks. Facilities at all stages of the atomic fuel cycle were the subject of numerous industrial hygiene, occupational health, and environmental assessments during the Cold War period. Uranium mines and mills on the Colorado Plateau were investigated as early as the 1940s. One such facility was the mill in Monticello, Utah, which began operation as a vanadium extraction plant in 1943 and was later adapted to recover uranium from carnotite ores. The mill ceased operation in 1960. The site was added to the federal Superfund list in 1986. ATSDR held public availability sessions in 1993 as part of its public health assessment process, at which several former mill workers voiced health concerns. An extensive literature search yielded several industrial hygiene evaluations of the Monticello mill and health studies that included Monticello workers, only two of which had been published in the peer-reviewed literature. In combination with the broader scientific literature, these historical reports provide a partial basis for responding to mill workers' contemporary health concerns. The strengths and limitations of the available exposure data for analytical epidemiologic studies and dose reconstruction are discussed. As an interim measure, the available historical documentation may be especially helpful in communicating about health risks with workers and communities in ways that acknowledge the historical context of their experience. PMID:8606907

  6. Flexibility and utility of pre-processing methods in converting STXM setups for ptychography - Final Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fromm, Catherine

    2015-08-20

    Ptychography is an advanced diffraction based imaging technique that can achieve resolution of 5nm and below. It is done by scanning a sample through a beam of focused x-rays using discrete yet overlapping scan steps. Scattering data is collected on a CCD camera, and the phase of the scattered light is reconstructed with sophisticated iterative algorithms. Because the experimental setup is similar, ptychography setups can be created by retrofitting existing STXM beam lines with new hardware. The other challenge comes in the reconstruction of the collected scattering images. Scattering data must be adjusted and packaged with experimental parameters to calibrate the reconstruction software. The necessary pre-processing of data prior to reconstruction is unique to each beamline setup, and even the optical alignments used on that particular day. Pre-processing software must be developed to be flexible and efficient in order to allow experimenters appropriate control and freedom in the analysis of their hard-won data. This paper will describe the implementation of pre-processing software which successfully connects data collection steps to reconstruction steps, letting the user accomplish accurate and reliable ptychography.

  7. Investigating the impact of spatial priors on the performance of model-based IVUS elastography

    PubMed Central

    Richards, M S; Doyley, M M

    2012-01-01

    This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue, information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore additional information is needed to provide useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded those computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648

  8. GPU-accelerated Kernel Regression Reconstruction for Freehand 3D Ultrasound Imaging.

    PubMed

    Wen, Tiexiang; Li, Ling; Zhu, Qingsong; Qin, Wenjian; Gu, Jia; Yang, Feng; Xie, Yaoqin

    2017-07-01

    The volume reconstruction method plays an important role in improving reconstructed volumetric image quality for freehand three-dimensional (3D) ultrasound imaging. By utilizing the capability of a programmable graphics processing unit (GPU), we can achieve real-time incremental volume reconstruction at a speed of 25-50 frames per second (fps). After incremental reconstruction and visualization, hole-filling is performed on the GPU to fill remaining empty voxels. However, traditional pixel nearest neighbor-based hole-filling fails to reconstruct volumes with high image quality. On the contrary, kernel regression provides an accurate volume reconstruction method for 3D ultrasound imaging, but at the cost of heavy computational complexity. In this paper, a GPU-based fast kernel regression method is proposed for high-quality volume reconstruction after the incremental reconstruction of freehand ultrasound. The experimental results show that improved image quality for speckle reduction and detail preservation can be obtained with the parameter setting of kernel window size of [Formula: see text] and kernel bandwidth of 1.0. The computational performance of the proposed GPU-based method can be over 200 times faster than that on a central processing unit (CPU), and the volume with a size of 50 million voxels in our experiment can be reconstructed within 10 seconds.
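Zeroth-order (Nadaraya-Watson) kernel regression for hole-filling amounts to a Gaussian-weighted average of nearby filled voxels. The following is a slow CPU reference version of the idea, not the paper's GPU kernel; the bandwidth and radius defaults are illustrative:

```python
import numpy as np

def fill_holes_nw(vol, mask, bandwidth=1.0, radius=2):
    """Fill empty voxels (mask == False) with a Gaussian-weighted average
    of filled voxels (mask == True) within the given radius."""
    out = vol.copy()
    filled = np.argwhere(mask)
    for idx in np.argwhere(~mask):
        d2 = ((filled - idx) ** 2).sum(1)        # squared distances
        near = d2 <= radius ** 2
        if not near.any():
            continue                             # no neighbours: leave empty
        w = np.exp(-d2[near] / (2 * bandwidth ** 2))
        out[tuple(idx)] = (w * vol[tuple(filled[near].T)]).sum() / w.sum()
    return out
```

Unlike nearest-neighbor filling, every hole value blends several weighted neighbours, which is what gives the method its smoothing (speckle-reducing) behaviour; the per-voxel independence of the loop is also what makes it a natural GPU workload.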

  9. Progress toward the development and testing of source reconstruction methods for NIF neutron imaging.

    PubMed

    Loomis, E N; Grim, G P; Wilde, C; Wilson, D C; Morgan, G; Wilke, M; Tregillis, I; Merrill, F; Clark, D; Finch, J; Fittinghoff, D; Bower, D

    2010-10-01

    Development of analysis techniques for neutron imaging at the National Ignition Facility is an important and difficult task for the detailed understanding of high-neutron yield inertial confinement fusion implosions. Once developed, these methods must provide accurate images of the hot and cold fuels so that information about the implosion, such as symmetry and areal density, can be extracted. One method under development involves the numerical inversion of the pinhole image using knowledge of neutron transport through the pinhole aperture from Monte Carlo simulations. In this article we present results of source reconstructions based on simulated images that test the method's effectiveness with regard to pinhole misalignment.
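Numerical inversion with a simulation-derived transfer matrix can be illustrated with a Tikhonov-regularized solve; a generic sketch only, since the actual reconstruction uses Monte Carlo-derived neutron transport and more sophisticated regularization:

```python
import numpy as np

def invert_image(T, y, lam=1e-3):
    """Recover a source x from a recorded image y, given a forward
    transfer matrix T (image = T @ source), via Tikhonov regularization:
    x = argmin ||T x - y||^2 + lam ||x||^2."""
    n = T.shape[1]
    return np.linalg.solve(T.T @ T + lam * np.eye(n), T.T @ y)
```

The regularization term is what keeps the inversion stable when T is ill-conditioned, e.g. when the pinhole is misaligned and rows of T become nearly dependent.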

  10. The cosmic ray muon tomography facility based on large scale MRPC detectors

    NASA Astrophysics Data System (ADS)

    Wang, Xuewu; Zeng, Ming; Zeng, Zhi; Wang, Yi; Zhao, Ziran; Yue, Xiaoguang; Luo, Zhifei; Yi, Hengguan; Yu, Baihui; Cheng, Jianping

    2015-06-01

    Cosmic ray muon tomography is a novel technology to detect high-Z material. A prototype of TUMUTY with 73.6 cm×73.6 cm large scale position sensitive MRPC detectors has been developed and is introduced in this paper. Three test kits have been tested and images were reconstructed using the MAP algorithm. The reconstruction results show that the prototype works well and that objects with complex structure and small size (20 mm) can be imaged, while high-Z material is distinguishable from low-Z material. This prototype provides a good platform for our further studies of the physical characteristics and the performance of cosmic ray muon tomography.

  11. IRVE-3 Post-Flight Reconstruction

    NASA Technical Reports Server (NTRS)

    Olds, Aaron D.; Beck, Roger; Bose, David; White, Joseph; Edquist, Karl; Hollis, Brian; Lindell, Michael; Cheatwood, F. N.; Gsell, Valerie; Bowden, Ernest

    2013-01-01

    The Inflatable Re-entry Vehicle Experiment 3 (IRVE-3) was conducted from the NASA Wallops Flight Facility on July 23, 2012. Launched on a Black Brant XI sounding rocket, the IRVE-3 research vehicle achieved an apogee of 469 km, deployed and inflated a Hypersonic Inflatable Aerodynamic Decelerator (HIAD), re-entered the Earth's atmosphere at Mach 10 and achieved a peak deceleration of 20 g's before descending to splashdown roughly 20 minutes after launch. This paper presents the filtering methodology and results associated with the development of the Best Estimated Trajectory of the IRVE-3 flight test. The reconstructed trajectory is compared against project requirements and pre-flight predictions of entry state, aerodynamics, HIAD flexibility, and attitude control system performance.

  12. On three-dimensional reconstruction of a neutron/x-ray source from very few two-dimensional projections

    DOE PAGES

    Volegov, P. L.; Danly, C. R.; Merrill, F. E.; ...

    2015-11-24

    The neutron imaging system at the National Ignition Facility is an important diagnostic tool for measuring the two-dimensional size and shape of the source of neutrons produced in the burning deuterium-tritium plasma during the stagnation phase of inertial confinement fusion implosions. Few two-dimensional projections of neutron images are available to reconstruct the three-dimensional neutron source. In our paper, we present a technique that has been developed for the 3D reconstruction of neutron and x-ray sources from a minimal number of 2D projections. Here, we present the detailed algorithms used for this characterization and the results of reconstructed sources from experimental data collected at Omega.

  13. GPU-based prompt gamma ray imaging from boron neutron capture therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Do-Kun; Jung, Joo-Young; Suk Suh, Tae, E-mail: suhsanta@catholic.ac.kr

    Purpose: The purpose of this research is to perform the fast reconstruction of a prompt gamma ray image using a graphics processing unit (GPU) computation from boron neutron capture therapy (BNCT) simulations. Methods: To evaluate the accuracy of the reconstructed image, a phantom including four boron uptake regions (BURs) was used in the simulation. After the Monte Carlo simulation of the BNCT, the modified ordered subset expectation maximization reconstruction algorithm using the GPU computation was used to reconstruct the images with fewer projections. The computation times for image reconstruction were compared between the GPU and the central processing unit (CPU). Also, the accuracy of the reconstructed image was evaluated by a receiver operating characteristic (ROC) curve analysis. Results: The image reconstruction time using the GPU was 196 times faster than the conventional reconstruction time using the CPU. For the four BURs, the area under curve values from the ROC curve were 0.6726 (A-region), 0.6890 (B-region), 0.7384 (C-region), and 0.8009 (D-region). Conclusions: The tomographic image using the prompt gamma ray event from the BNCT simulation was acquired using the GPU computation in order to perform a fast reconstruction during treatment. The authors verified the feasibility of the prompt gamma ray image reconstruction using the GPU computation for BNCT simulations.
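The ordered-subset expectation maximization (OSEM) update the authors modify has a compact reference form for a nonnegative linear model y ≈ A x; a minimal CPU sketch of the standard algorithm, not the paper's GPU implementation:

```python
import numpy as np

def osem(A, y, subsets, n_iter=10):
    """Ordered-subset EM for Poisson data y with system matrix A (A >= 0).
    Each sub-iteration uses only one subset of the projection rows, which
    is what accelerates convergence relative to plain MLEM."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for S in subsets:                         # one subset of rows
            As = A[S]
            ratio = y[S] / np.maximum(As @ x, 1e-12)
            x *= (As.T @ ratio) / np.maximum(As.sum(0), 1e-12)
    return x
```

Each voxel's multiplicative update is independent of the others, which is why the algorithm parallelizes so well on a GPU.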

  14. TU-FG-BRB-07: GPU-Based Prompt Gamma Ray Imaging From Boron Neutron Capture Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S; Suh, T; Yoon, D

    Purpose: The purpose of this research is to perform the fast reconstruction of a prompt gamma ray image using a graphics processing unit (GPU) computation from boron neutron capture therapy (BNCT) simulations. Methods: To evaluate the accuracy of the reconstructed image, a phantom including four boron uptake regions (BURs) was used in the simulation. After the Monte Carlo simulation of the BNCT, the modified ordered subset expectation maximization reconstruction algorithm using the GPU computation was used to reconstruct the images with fewer projections. The computation times for image reconstruction were compared between the GPU and the central processing unit (CPU). Also, the accuracy of the reconstructed image was evaluated by a receiver operating characteristic (ROC) curve analysis. Results: The image reconstruction time using the GPU was 196 times faster than the conventional reconstruction time using the CPU. For the four BURs, the area under curve values from the ROC curve were 0.6726 (A-region), 0.6890 (B-region), 0.7384 (C-region), and 0.8009 (D-region). Conclusion: The tomographic image using the prompt gamma ray event from the BNCT simulation was acquired using the GPU computation in order to perform a fast reconstruction during treatment. The authors verified the feasibility of the prompt gamma ray reconstruction using the GPU computation for BNCT simulations.

  15. 40 CFR 60.40Da - Applicability and designation of affected facility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... than 73 megawatts (MW) (250 million British thermal units per hour (MMBtu/hr)) heat input of fossil... capable of combusting more than 73 MW (250 MMBtu/h) heat input of fossil fuel (either alone or in... reconstruction after February 28, 2005. (c) Any change to an existing fossil-fuel-fired steam generating unit to...

  16. 40 CFR 60.40Da - Applicability and designation of affected facility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... than 73 megawatts (MW) (250 million British thermal units per hour (MMBtu/hr)) heat input of fossil... capable of combusting more than 73 MW (250 MMBtu/h) heat input of fossil fuel (either alone or in... reconstruction after February 28, 2005. (c) Any change to an existing fossil-fuel-fired steam generating unit to...

  17. 40 CFR 60.40Da - Applicability and designation of affected facility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... than 73 megawatts (MW) (250 million British thermal units per hour (MMBtu/hr)) heat input of fossil... capable of combusting more than 73 MW (250 MMBtu/h) heat input of fossil fuel (either alone or in... reconstruction after February 28, 2005. (c) Any change to an existing fossil-fuel-fired steam generating unit to...

  18. Assessing the damage at Mt. Coffee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J.C.; Macauley, L.D.

    1995-12-31

    The Mt. Coffee Hydroelectric Project was damaged during the Liberian civil unrest in the early 1990s. A team of engineers performed a damage assessment of the project with the hope that funding could be obtained to reconstruct the project. The damage done to the plant had far greater impacts on the country than merely the cost to rebuild the facility.

  19. 40 CFR 63.5110 - What special definitions are used in this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... equipment used to apply an organic coating to the surface of metal coil. A coil coating line includes a web... emission limitation (including any operating limit) or work practice standard; (2) Fails to meet any term... before July 18, 2000, and it has not subsequently undergone reconstruction as defined in § 63.2. Facility...

  20. 40 CFR Table 1 to Subpart Ggg of... - States That Have an Approved and Effective State Plan a

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DESIGNATED FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991... this Federal plan if it commenced construction before May 30, 1991 and has not been modified or...

  1. 40 CFR Table 1 to Subpart Ggg of... - States That Have an Approved and Effective State Plan a

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DESIGNATED FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991... this Federal plan if it commenced construction before May 30, 1991 and has not been modified or...

  2. 40 CFR Table 3 to Subpart Ggg of... - Generic Compliance Schedule and Increments of Progress a

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DESIGNATED FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991... rate report showing NMOC emissions ≥ 50 Mg/yr.b Increment 3—Begin on-site construction 24 months after...

  3. 40 CFR Table 3 to Subpart Ggg of... - Generic Compliance Schedule and Increments of Progress a

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DESIGNATED FACILITIES AND POLLUTANTS Federal Plan Requirements for Municipal Solid Waste Landfills That Commenced Construction Prior to May 30, 1991 and Have Not Been Modified or Reconstructed Since May 30, 1991... rate report showing NMOC emissions ≥ 50 Mg/yr.b Increment 3—Begin on-site construction 24 months after...

  4. The Early Stage of Neutron Tomography for Cultural Heritage Study in Thailand

    NASA Astrophysics Data System (ADS)

    Khaweerat, S.; Ratanatongchai, W.; Wonglee, S.; Schillinger, B.

    In parallel with the upgrade of the neutron imaging facility at TRR-1/M1 since 2015, practice with image processing software has led to the implementation of neutron tomography (NT). The current setup provides a thermal neutron flux of 1.08×10⁶ cm⁻²·s⁻¹ at the exposure position. In general, the sample was fixed on a plate on top of a rotary stage controlled by LabVIEW 2009 Version 9.0.1; the incremental step can be adjusted from 0.45 to 7.2 degrees. A 16-bit CCD camera fitted with a Nikkor 50 mm f/1.2 lens was used to record light from a ⁶LiF/ZnS (green) neutron converter screen. The exposure time for each shot was 60 seconds, resulting in an acquisition time of approximately three hours for a complete rotation of the sample. Afterwards, the batch of two-dimensional neutron images of the sample was read into the reconstruction and visualization software, Octopus Reconstruction 8.8 and Octopus Visualization 2.0, respectively. The results revealed that system alignment is critical: a heavy sample must remain stable at every rotation angle, and a previous alignment showed instability of the supporting plane while tilting the sample, indicating that the sample stage should be replaced. Even though NT is a lengthy process that involves processing large amounts of data, it offers an opportunity to understand the features of an object in more detail than neutron radiography. Digital NT also allows inner features that appear superposed in radiography to be separated by cross-sectioning the 3D data set of an object without destruction. As a result, NT is a significant tool for revealing hidden information in the inner structure of cultural heritage objects, providing great benefits for archaeological study, conservation and authenticity investigation.
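
    The time budget of such a step-and-shoot scan follows directly from the step size and per-shot exposure. The sketch below assumes, for illustration only, a 180° rotation range at a 1° step; the abstract states only the adjustable 0.45-7.2° step range, the 60 s exposure, and the roughly three-hour total.

```python
# Back-of-envelope acquisition budget for a step-and-shoot neutron
# tomography scan.  The 180-degree range and 1-degree step below are
# illustrative assumptions; only the 60 s exposure and ~3 h total
# come from the abstract.

def n_projections(rotation_deg: float, step_deg: float) -> int:
    """Number of projection images for a given rotation range and step."""
    return round(rotation_deg / step_deg)

def scan_time_hours(rotation_deg: float, step_deg: float,
                    exposure_s: float, overhead_s: float = 0.0) -> float:
    """Total acquisition time; per-shot readout overhead is lumped in."""
    n = n_projections(rotation_deg, step_deg)
    return n * (exposure_s + overhead_s) / 3600.0

print(n_projections(180, 1.0))          # 180 shots
print(scan_time_hours(180, 1.0, 60.0))  # 3.0 hours, consistent with the abstract
```

    Halving the step to 0.45° would double the projection count and the scan time, which is why the adjustable stage matters in practice.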

  5. Reconstructing apology: David Cameron's Bloody Sunday apology in the press.

    PubMed

    McNeill, Andrew; Lyons, Evanthia; Pehrson, Samuel

    2014-12-01

    While there is an acknowledgement in apology research that political apologies are highly mediated, the process of mediation itself has lacked scrutiny. This article suggests that the idea of reconstruction helps to understand how apologies are mediated and evaluated. David Cameron's apology for Bloody Sunday is examined to see how he constructs four aspects of apology: social actors, consequences, categorization, and reasons. The reconstruction of those aspects by British, Unionist, and Nationalist press along with reconstructions made by soldiers in an online forum are considered. Data analysis was informed by thematic analysis and discourse analysis which helped to explore key aspects of reconstruction and how elements of Cameron's apology are altered in subsequent mediated forms of the apology. These mediated reconstructions of the apology allowed their authors to evaluate the apology in different ways. Thus, in this article, it is suggested that the evaluation of the apology by different groups is preceded by a reconstruction of it in accordance with rhetorical goals. This illuminates the process of mediation and helps to understand divergent responses to political apologies. © 2013 The British Psychological Society.

  6. Traceability, reproducibility and wiki-exploration for “à-la-carte” reconstructions of genome-scale metabolic models

    PubMed Central

    Got, Jeanne; Cortés, María Paz; Maass, Alejandro

    2018-01-01

    Genome-scale metabolic models have become the tool of choice for the global analysis of microorganism metabolism, and their reconstruction has attained high standards of quality and reliability. Improvements in this area have been accompanied by the development of some major platforms and databases, and an explosion of individual bioinformatics methods. Consequently, many recent models result from “à la carte” pipelines, combining the use of platforms, individual tools and biological expertise to enhance the quality of the reconstruction. Although very useful, introducing heterogeneous tools that hardly interact with each other causes a loss of traceability and reproducibility in the reconstruction process. This represents a real obstacle, especially for less studied species whose metabolic reconstruction can benefit greatly from comparison with good-quality models of related organisms. This work proposes an adaptable workspace, AuReMe, for sustainable reconstruction or improvement of genome-scale metabolic models involving personalized pipelines. At each step, relevant information about the modifications brought to the model by a method is stored. This ensures that the process is reproducible and documented regardless of the combination of tools used. Additionally, the workspace establishes a way to browse metabolic models and their metadata through the automatic generation of ad hoc local wikis dedicated to monitoring and facilitating the reconstruction process. AuReMe supports exploration and semantic queries based on RDF databases. We illustrate how this workspace allowed handling, in an integrated way, the metabolic reconstructions of non-model organisms such as an extremophile bacterium and a eukaryotic alga. Among relevant applications, the latter reconstruction led to putative evolutionary insights into a metabolic pathway. PMID:29791443

  7. Superfast high-resolution absolute 3D recovery of a stabilized flapping flight process.

    PubMed

    Li, Beiwen; Zhang, Song

    2017-10-30

    Scientific research into a stabilized flapping flight process (e.g. hovering) has been of great interest to a variety of fields including biology, aerodynamics, and bio-inspired robotics. Unlike current passive photogrammetry-based methods, the digital fringe projection (DFP) technique can perform dense, superfast (e.g. kHz) 3D topological reconstructions by projecting defocused binary patterns, yet it remains a challenge to measure a flapping flight process in the presence of rapidly flapping wings. This paper presents a novel absolute 3D reconstruction method for a stabilized flapping flight process. Essentially, the slow-motion parts (e.g. the body) and the fast-motion parts (e.g. the wings) are segmented and separately reconstructed with phase-shifting techniques and the Fourier transform, respectively. The topological relations between the wings and the body are utilized to ensure absolute 3D reconstruction. Experiments demonstrate the success of our computational framework by testing a flapping-wing robot at different flapping speeds.

  8. Meaning Reconstruction Process After Suicide: Life-Story of a Japanese Woman Who Lost Her Son to Suicide.

    PubMed

    Kawashima, Daisuke; Kawano, Kenji

    2017-09-01

    Although Japan has a high suicide rate, there is insufficient research on the experiences of suicide-bereaved individuals. We investigated the qualitative aspects of the meaning reconstruction process after a loss to suicide. We conducted a life-story interview using open-ended questions with one middle-aged Japanese woman who lost her son to suicide. We used a narrative approach to transcribe and code the participant's narratives for analysis. The analysis revealed three meaning groups that structured the participant's reactions to the suicide: making sense of her son's death and life, relationships with other people, and reconstruction of a bond with the deceased. The belief that death is not an eternal split and that there is a connection between the living and the deceased reduced the pain felt by our participant. Furthermore, the narratives worked as scaffolds in the meaning reconstruction process. We discuss our results in the light of cross-cultural differences in the grieving process.

  9. Pre-operative CT angiography and three-dimensional image post processing for deep inferior epigastric perforator flap breast reconstructive surgery.

    PubMed

    Lam, D L; Mitsumori, L M; Neligan, P C; Warren, B H; Shuman, W P; Dubinsky, T J

    2012-12-01

    Autologous breast reconstruction with deep inferior epigastric artery (DIEA) perforator flaps has become the mainstay of breast reconstructive surgery. CT angiography and three-dimensional image post-processing can depict the number, size, course and location of the DIEA perforating arteries for pre-operative selection of the best artery to use for the tissue flap. Knowing the location of, and selecting, the optimal perforating artery shortens operative times and decreases patient morbidity.

  10. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
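
    The core of wire signal deconvolution is dividing out the detector/electronics response in the frequency domain. The sketch below is a minimal, noise-free illustration: the exponential "response" kernel is a made-up stand-in, not the actual ArgoNeuT/MicroBooNE response, and production code additionally applies a low-pass filter to suppress noise amplification.

```python
import numpy as np

# Noise-free sketch of signal deconvolution by FFT division.  The
# exponential "electronics response" kernel is illustrative only; real
# pipelines use a measured response and a filtered (regularized) inverse.

n = 256

# True ionization signal: two unipolar pulses on an empty baseline.
true = np.zeros(n)
true[60:64] = [1.0, 3.0, 2.0, 0.5]
true[150:152] = [2.0, 1.0]

# Response kernel: normalized exponential tail (circular-convolution model).
kernel = np.exp(-np.arange(n) / 10.0)
kernel /= kernel.sum()

# "Measured" waveform = signal convolved with the response.
measured = np.real(np.fft.ifft(np.fft.fft(true) * np.fft.fft(kernel)))

# Deconvolution: divide spectra (this kernel's FFT has no zeros).
recovered = np.real(np.fft.ifft(np.fft.fft(measured) / np.fft.fft(kernel)))

# Recovery is exact up to floating-point round-off in this noise-free case.
print(np.max(np.abs(recovered - true)))
```

    With noisy data the plain spectral division blows up at high frequencies, which is why the actual chain pairs deconvolution with filtering and coherent-noise removal.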

  11. Digital reconstruction of Young's fringes using Fresnel transformation

    NASA Astrophysics Data System (ADS)

    Kulenovic, Rudi; Song, Yaozu; Renninger, P.; Groll, Manfred

    1997-11-01

    This paper deals with the digital numerical reconstruction of Young's fringes from laser speckle photography by means of the Fresnel transformation. The physical model of the optical reconstruction of a specklegram is a near-field Fresnel diffraction phenomenon, which can be described mathematically by the Fresnel transformation; the interference pattern can therefore be calculated directly on a microcomputer. If, in addition, a CCD camera is used for specklegram recording, the measurement and evaluation procedure can be carried out entirely digitally. Compared with conventional laser speckle photography, no holographic plates, no wet development process and no optical specklegram reconstruction are needed. These advantages promise wide future use in scientific and engineering applications. The basic principle of the numerical reconstruction is described, the effects of experimental parameters on Young's fringes are analyzed, and representative results are presented.
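
    Numerically, the discrete Fresnel transform can be evaluated with a single FFT by factoring the quadratic phase. A one-dimensional sketch under illustrative parameters (a Gaussian aperture at 633 nm; none of these values come from the paper) might look as follows:

```python
import numpy as np

# 1-D single-FFT Fresnel transform:
#   u1(x1) = dx * sum_j u0(x_j) * exp(i*pi*(x1 - x_j)^2 / (lambda*z)).
# Expanding the square turns the sum into chirp * FFT(chirp * u0).
# Constant prefactors of the Fresnel integral are omitted.

def fresnel_1d(u0, dx, wavelength, z):
    n = u0.size  # assumed even, so fftshift/ifftshift pair up exactly
    x = (np.arange(n) - n // 2) * dx                       # input coordinates
    chirp = np.exp(1j * np.pi * x**2 / (wavelength * z))
    spectrum = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(u0 * chirp)))
    x1 = wavelength * z * np.fft.fftshift(np.fft.fftfreq(n, dx))  # output coords
    return dx * np.exp(1j * np.pi * x1**2 / (wavelength * z)) * spectrum, x1

# Propagate a Gaussian aperture 0.5 m at 633 nm (illustrative values).
n, dx, wl, z = 256, 10e-6, 633e-9, 0.5
x = (np.arange(n) - n // 2) * dx
u0 = np.exp(-(x / (20 * dx))**2)
u1, x1 = fresnel_1d(u0, dx, wl, z)
```

    The FFT form reproduces the direct quadratic-phase sum exactly, which is what makes whole-specklegram evaluation feasible on a small computer.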

  12. Reconstruction of biofilm images: combining local and global structural parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resat, Haluk; Renslow, Ryan S.; Beyenal, Haluk

    2014-10-20

    Digitized images can be used for quantitative comparison of biofilms grown under different conditions. Using biofilm image reconstruction, it was previously found that biofilms with a completely different look can have nearly identical structural parameters and that the most commonly utilized global structural parameters were not sufficient to uniquely define these biofilms. Here, additional local and global parameters are introduced to show that these parameters considerably increase the reliability of the image reconstruction process. Assessment using human evaluators indicated that the correct identification rate of the reconstructed images increased from 50% to 72% with the introduction of the new parameters into the reconstruction procedure. An expanded set of parameters especially improved the identification of biofilm structures with internal orientational features and of structures in which colony sizes and spatial locations varied. Hence, the newly introduced structural parameter sets helped to better classify the biofilms by incorporating finer local structural details into the reconstruction process.

  13. Prehistoric cooking versus accurate palaeotemperature records in shell midden constituents.

    PubMed

    Müller, Peter; Staudigel, Philip T; Murray, Sean T; Vernet, Robert; Barusseau, Jean-Paul; Westphal, Hildegard; Swart, Peter K

    2017-06-15

    The reconstruction of pre-depositional cooking treatments used by prehistoric coastal populations for processing aquatic faunal resources is often difficult in archaeological shell midden assemblages. Besides limiting our knowledge of various social, cultural, economic and technological aspects of shell midden formation, unknown pre-depositional cooking techniques can also introduce large errors in palaeoclimate reconstructions as they can considerably alter the geochemical proxy signatures in calcareous skeletal structures such as bivalve shells or fish otoliths. Based on experimental and archaeological data, we show that carbonate clumped-isotope thermometry can be used to detect and reconstruct prehistoric processing methods in skeletal aragonite from archaeological shell midden assemblages. Given the temperature-dependent re-equilibration of clumped isotopes in aragonitic carbonates, this allows specific processing, cooking or trash dispersal strategies such as boiling, roasting, or burning to be differentiated. Besides permitting the detailed reconstruction of cultural or technological aspects of shell midden formation, this also allows erroneous palaeoclimate reconstructions to be avoided as all aragonitic shells subjected to prehistoric cooking methods show a clear alteration of their initial oxygen isotopic composition.

  14. Image processing pipeline for synchrotron-radiation-based tomographic microscopy.

    PubMed

    Hintermüller, C; Marone, F; Isenegger, A; Stampanoni, M

    2010-07-01

    With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 × 1024 to 2048 × 2048 pixels and are acquired in 5-15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and to fine-tune the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality to be added for new scan protocols, such as an extended field of view, or for new physical signals such as phase-contrast or dark-field imaging.

  15. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings are present in the oral cavity, metal-induced streak artefacts are unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures; therefore, images with weak artefacts were reconstructed using the projection data of an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed in sequence by successive iterative restoration, with the projection data generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. Ordered subset-expectation maximization and the small region of interest reduced the processing time without apparent detriment, and the general-purpose graphics processing unit delivered high performance. In summary, a statistical reconstruction method was applied for streak artefact reduction; the alternative algorithms were effective, and both software and hardware tools, namely ordered subset-expectation maximization, the small region of interest and the general-purpose graphics processing unit, achieved fast artefact correction.
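
    The maximum likelihood-expectation maximization (MLEM) update at the heart of such statistical reconstruction is compact. The sketch below runs it on a tiny made-up system matrix rather than real CT geometry; ordered subsets (OSEM) would simply apply the same update to subsets of the rows in turn.

```python
import numpy as np

# Toy MLEM iteration:  x <- x * [A^T (y / (A x))] / (A^T 1)
# A is a small made-up nonnegative "projection" matrix, not a real
# CT system model.  Multiplicative updates keep x nonnegative.

A = np.array([[2.0, 1.0, 0.5],
              [1.0, 3.0, 1.0],
              [0.5, 1.0, 2.0]])
x_true = np.array([1.0, 2.0, 0.5])
y = A @ x_true                      # noiseless measured projections

x = np.ones(3)                      # strictly positive starting image
sens = A.T @ np.ones(3)             # sensitivity image A^T 1
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / sens

print(x)  # approaches x_true for consistent, noise-free data
```

    With noisy projections the iteration is stopped early or regularized, since late MLEM iterations start fitting the noise.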

  16. The Relationship of Obesity to Increasing Health-Care Burden in the Setting of Orthopaedic Polytrauma.

    PubMed

    Licht, Heather; Murray, Mark; Vassaur, John; Jupiter, Daniel C; Regner, Justin L; Chaput, Christopher D

    2015-11-18

    With the rise of obesity in the American population, there has been a proportionate increase of obesity in the trauma population. The purpose of this study was to use a computed tomography-based measurement of adiposity to determine whether obesity is associated with an increased burden to the health-care system in patients with orthopaedic polytrauma. A prospective comprehensive trauma database at a level-I trauma center was utilized to identify 301 patients with polytrauma who had orthopaedic injuries and intensive care unit admission from 2006 to 2011. Routine thoracoabdominal computed tomographic scans allowed for measurement of the truncal adiposity volume. The truncal three-dimensional reconstruction body mass index was calculated from the computed tomography-based volumes with a previously validated algorithm. A truncal three-dimensional reconstruction body mass index of <30 kg/m² denoted non-obese patients and ≥30 kg/m² denoted obese patients. The need for an orthopaedic surgical procedure, in-hospital mortality, length of stay, hospital charges, and discharge disposition were compared between the two groups. Of the 301 patients, 21.6% were classified as obese (truncal three-dimensional reconstruction body mass index of ≥30 kg/m²). A higher truncal three-dimensional reconstruction body mass index was associated with a longer hospital length of stay (p = 0.02), more days spent in the intensive care unit (p = 0.03), more frequent discharge to a long-term care facility (p < 0.0002), a higher rate of orthopaedic surgical intervention (p < 0.01), and increased total hospital charges (p < 0.001). Computed tomographic scans, routinely obtained at the time of admission, can be utilized to calculate truncal adiposity and to investigate the impact of obesity on patients with polytrauma. Obese patients were found to have higher total hospital charges, longer hospital stays, more frequent discharge to a continuing-care facility, and a higher rate of orthopaedic surgical intervention. Copyright © 2015 by The Journal of Bone and Joint Surgery, Incorporated.

  17. [What has been brought to residents and communities by the nuclear power plant accident? Special and serious disaster relief procedure modification after the 2011 Tohoku earthquake and tsunami in Fukushima].

    PubMed

    Ishikawa, Kazunobu

    2011-01-01

    After the catastrophic 2011 Tohoku earthquake and tsunami which struck cities and towns on the Japanese Pacific coast, Fukushima has been the focus of special and serious disaster relief procedures modification regarding nuclear power plant accidents. To date, the Japanese government has repeatedly issued evacuation orders to more than 100,000 residents. Huge numbers of refugees are still uncertain if they can return home and re-cultivate their farm land. Ambiguous public announcements concerning the radiation risks seem to have aggravated feelings of insecurity, fear and the desire to escape, both at home and abroad. This disaster has seriously undermined trust internationally and locally in Fukushima. Harmful rumors added further difficulties. In response to this disaster, local government, medical institutions, care facilities, police, emergency services and the self-defense forces continue to put their utmost effort into reconstruction. This seismic disaster has reminded us that supplies of water, electricity, gas, gasoline and telephone/communication facilities are essential prerequisites for reconstruction and daily life. Disaster and radiation medical association teams actively participated in the rescue efforts, and a number of organized medical teams cared for about 15,000 refugees in 100 shelters. We also visited home-bound patients, who were unable to evacuate from the 20-30 km inner evacuation area. In this relief role, we need to consider the following; (1) professionals, both healthcare and nuclear engineers, must always be prepared for unexpected circumstances, (2) the daily organic cooperation of individuals and units is closely linked to readiness against sudden risks, and (3) appropriate accountability is essential to assuage the fears of residents and refugees. A sincere learning process may benefit those innocent refugees who may be forced to abandon their homes permanently.

  18. A 3D Reconstruction Strategy of Vehicle Outline Based on Single-Pass Single-Polarization CSAR Data.

    PubMed

    Chen, Leping; An, Daoxiang; Huang, Xiaotao; Zhou, Zhimin

    2017-11-01

    In the last few years, interest in circular synthetic aperture radar (CSAR) acquisitions has arisen as a consequence of the potential for 3D reconstruction over a 360° azimuth angle variation. In real-world scenarios, full 3D reconstruction of arbitrary targets requires multi-pass data, which makes the processing complex, costly, and time-consuming. In this paper, we propose a processing strategy for 3D vehicle reconstruction that avoids multi-pass data by introducing a priori information about the vehicle's shape. Moreover, the proposed strategy needs only single-pass, single-polarization CSAR data, which makes the processing much more economical and efficient. First, an analysis of the distribution of attributed scattering centers from a vehicle facet model is presented; the results show that a smooth and continuous basic outline of the vehicle can be extracted from the peak curve of a noncoherently processed image. Second, the 3D location of the vehicle roofline is inferred from layover with empirical insets of the basic outline. Finally, the basic outline and roofline are used to estimate the vehicle's 3D information and constitute its 3D outline. Results from simulated and measured data prove the correctness and effectiveness of the proposed strategy.

  19. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
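
    The symmetric Pearson VII profile interpolates between a Lorentzian (m = 1) and, in the m → ∞ limit, a Gaussian, which is why it can outperform either on a measured rocking curve. In the common parameterization sketched below, w is the half-width at half-maximum for every shape parameter m; the numerical values are illustrative, not fitted results from the paper.

```python
import numpy as np

# Symmetric Pearson type VII profile.  The (2^(1/m) - 1) factor makes w
# the half-width at half-maximum for any shape parameter m: m = 1 is a
# Lorentzian, m -> infinity approaches a Gaussian.  Illustrative values
# only; the paper fits measured analyser rocking curves.

def pearson_vii(x, amp, x0, w, m):
    return amp * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

theta = np.linspace(-50.0, 50.0, 1001)   # offsets from the peak (arb. units)
curve = pearson_vii(theta, amp=1.0, x0=0.0, w=8.0, m=1.6)

print(pearson_vii(0.0, 1.0, 0.0, 8.0, 1.6))  # 1.0 at the peak
print(pearson_vii(8.0, 1.0, 0.0, 8.0, 1.6))  # 0.5 at one half-width
```

    Fitting amp, x0, w and m to a measured rocking curve (e.g. by nonlinear least squares) then yields the analytical derivatives needed for phase retrieval.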

  20. Supersonic Flight Dynamics Test: Trajectory, Atmosphere, and Aerodynamics Reconstruction

    NASA Technical Reports Server (NTRS)

    Kutty, Prasad; Karlgaard, Christopher D.; Blood, Eric M.; O'Farrell, Clara; Ginn, Jason M.; Shoenenberger, Mark; Dutta, Soumyo

    2015-01-01

    The Supersonic Flight Dynamics Test is a full-scale flight test of a Supersonic Inflatable Aerodynamic Decelerator, which is part of the Low Density Supersonic Decelerator technology development project. The purpose of the project is to develop and mature aerodynamic decelerator technologies for landing large mass payloads on the surface of Mars. The technologies include a Supersonic Inflatable Aerodynamic Decelerator and Supersonic Parachutes. The first Supersonic Flight Dynamics Test occurred on June 28th, 2014 at the Pacific Missile Range Facility. This test was used to validate the test architecture for future missions. The flight was a success and, in addition, was able to acquire data on the aerodynamic performance of the supersonic inflatable decelerator. This paper describes the instrumentation, analysis techniques, and acquired flight test data utilized to reconstruct the vehicle trajectory, atmosphere, and aerodynamics. The results of the reconstruction show significantly higher lofting of the trajectory, which can partially be explained by off-nominal booster motor performance. The reconstructed vehicle force and moment coefficients fall well within pre-flight predictions. A parameter identification analysis indicates that the vehicle displayed greater aerodynamic static stability than seen in pre-flight computational predictions and ballistic range tests.

  1. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    PubMed

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, so high-performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors, in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage is that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.

  2. National perspective on in-hospital emergency units in Iraq

    PubMed Central

    Lafta, Riyadh K.; Al-Nuaimi, Maha A.

    2013-01-01

    Background: Hospitals play a crucial role in providing communities with essential medical care during disasters, and the emergency department is the most vital component of a hospital's inpatient services. In Iraq, at present, the many casualties impose a heavy workload and create a need for structural assessment, equipment updating and evaluation of processes. Objective: To examine the current pragmatic functioning of the existing set-up of in-hospital emergency departments within some general hospitals in Baghdad and Mosul, in order to establish a mechanism for future evaluation of the health services in our community. Methods: A cross-sectional study was employed to evaluate the structure, process and function of six major hospitals with emergency units: four in Baghdad and two in Mosul. Results: The six surveyed emergency units are distinct units within general hospitals that collectively serve one quarter of the total population. More than one third of these units feature observation-unit beds, laboratory services, imaging facilities, pharmacies with safe storage, and an ambulatory entrance. An operating room was found in only one hospital's reception and waiting area. A consultation/track area, cubicles for infection control, and discrete tutorial rooms were not available. Patient assessment was performed, although without adequate privacy. An emergency specialist, family medicine specialist or interested general practitioner exists in one-third of the surveyed units. Psychiatrists, physiotherapists, occupational therapists, and social work links are not available. The shortage of medication, urgent vaccines and vital facilities is an obvious problem. Conclusions: The level and standards of care in our emergency units are underdeveloped. The inconsistent processes and inappropriate environments need to be reconstructed, and the lack of drugs, commodities, communication infrastructure, audit and training all require effective build-up. PMID:25003053

  3. Fast direct fourier reconstruction of radial and PROPELLER MRI data using the chirp transform algorithm on graphics hardware.

    PubMed

    Feng, Yanqiu; Song, Yanli; Wang, Cong; Xin, Xuegang; Feng, Qianjin; Chen, Wufan

    2013-10-01

    To develop and test a new algorithm for fast direct Fourier transform (DrFT) reconstruction of MR data on non-Cartesian trajectories composed of lines with equally spaced points. The DrFT, which is normally used as a reference in evaluating the accuracy of other reconstruction methods, can reconstruct images directly from non-Cartesian MR data without interpolation. However, DrFT reconstruction involves substantially intensive computation, which makes it impractical for routine clinical applications. In this article, the chirp transform algorithm was introduced to accelerate the DrFT reconstruction of radial and Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction (PROPELLER) MRI data located on trajectories composed of lines with equally spaced points. The performance of the proposed chirp transform algorithm-DrFT was evaluated using simulation and in vivo MRI data. After implementing the algorithm on a graphics processing unit, the proposed chirp transform algorithm-DrFT achieved an acceleration of approximately one order of magnitude, and the speed-up factor was further increased to approximately three orders of magnitude compared with the traditional single-thread DrFT reconstruction. Implementing the chirp transform algorithm-DrFT on the graphics processing unit can efficiently calculate the DrFT reconstruction of radial and PROPELLER MRI data. Copyright © 2012 Wiley Periodicals, Inc.
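
    The chirp transform (Bluestein) algorithm re-expresses a DFT of arbitrary length as chirp modulation, convolution with a chirp, and a final chirp modulation, all computable with power-of-two FFTs. The sketch below is the generic 1-D algorithm, not the paper's GPU implementation or its non-Cartesian sampling geometry.

```python
import numpy as np

# Bluestein's chirp transform: from nk = (n^2 + k^2 - (k - n)^2) / 2,
# X_k = w_k * sum_n (x_n w_n) * conj(w_{k-n})  with  w_k = e^{-i pi k^2 / N},
# i.e. a convolution that power-of-two FFTs evaluate in O(N log N) for any N.

def chirp_dft(x):
    n = len(x)
    k = np.arange(n)
    w = np.exp(-1j * np.pi * k * k / n)      # chirp e^{-i pi k^2 / n}
    m = 1 << (2 * n - 1).bit_length()        # power-of-two FFT length >= 2n - 1
    a = np.zeros(m, dtype=complex)
    a[:n] = x * w                            # pre-modulated input
    b = np.zeros(m, dtype=complex)
    b[:n] = np.conj(w)                       # chirp filter: lags 0 .. n-1
    b[m - n + 1:] = np.conj(w[1:])[::-1]     # wrapped negative lags -(n-1) .. -1
    conv = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b))
    return w * conv[:n]                      # post-modulation

x = np.cos(0.7 * np.arange(12)) + 1j * np.sin(0.3 * np.arange(12))
print(np.allclose(chirp_dft(x), np.fft.fft(x)))  # True; works for any length
```

    For DrFT reconstruction, the same trick lets each equally spaced line of k-space samples be transformed to arbitrary output spacings without interpolation.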

  4. The high-rate data challenge: computing for the CBM experiment

    NASA Astrophysics Data System (ADS)

    Friese, V.; CBM Collaboration

    2017-10-01

    The Compressed Baryonic Matter experiment (CBM) is a next-generation heavy-ion experiment to be operated at the FAIR facility, currently under construction in Darmstadt, Germany. A key feature of CBM is its very high interaction rate, exceeding those of contemporary nuclear collision experiments by several orders of magnitude. Such interaction rates forbid a conventional, hardware-triggered readout; instead, experiment data will stream freely from self-triggered front-end electronics. In order to reduce the huge raw data volume to a recordable rate, data will be selected exclusively on CPU, which necessitates partial event reconstruction in real time. Consequently, the traditional segregation of online and offline software vanishes; an integrated online and offline data processing concept is called for. In this paper, we report on concepts and developments in computing for CBM as well as on the status of preparations for its first physics run.

  5. Development and evaluation of a Hadamard transform imaging spectrometer and a Hadamard transform thermal imager

    NASA Technical Reports Server (NTRS)

    Harwit, M.; Swift, R.; Wattson, R.; Decker, J.; Paganetti, R.

    1976-01-01

    A spectrometric imager and a thermal imager, which achieve multiplexing by the use of binary optical encoding masks, were developed. The masks are based on orthogonal, pseudorandom digital codes derived from Hadamard matrices. Spatial and/or spectral data are obtained in the form of a Hadamard transform of the spatial and/or spectral scene; computer algorithms are then used to decode the data and reconstruct images of the original scene. The hardware, algorithms, and processing/display facility are described. A number of spatial and spatial/spectral images are presented. A signal-to-noise improvement due to signal multiplexing was also demonstrated. An analysis of the results indicates both the situations for which the multiplex advantage may be gained and the limitations of the technique. A number of potential applications of the spectrometric imager are discussed.
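
    The encode/decode cycle the abstract describes can be sketched in a few lines: each measurement sums the scene elements selected by one mask row, and the scene is recovered by applying the transpose of the mask matrix. (Real instruments use 0/1 S-matrix masks derived from Hadamard matrices, since a physical mask cannot transmit −1; this sketch keeps the ±1 entries for simplicity.)

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Simulate multiplexed measurement of an n-element "scene" x:
n = 16
rng = np.random.default_rng(0)
x = rng.random(n)        # unknown spatial/spectral intensities
H = hadamard(n)
y = H @ x                # one multiplexed measurement per mask row
x_rec = (H.T @ y) / n    # decode: Hadamard matrices satisfy H @ H.T = n * I
```

    The multiplex (Fellgett) advantage comes from each detector reading combining many scene elements, so detector noise is spread across the whole reconstruction rather than concentrated in single elements.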

  6. Microsurgical scalp reconstruction after a mountain lion attack.

    PubMed

    Hazani, Ron; Buntic, Rudolf F; Brooks, Darrell

    2008-09-01

    Mountain lion attacks on humans are rare and potentially fatal. Although some victims escape with only minor injuries, permanent disfigurement and disability are common among survivors of these assaults. Since 1986, a steady number of mountain lion attacks has been noted in California. We report a recent attack by a cougar on a couple hiking in California's Prairie Creek Redwoods State Park. The victim sustained a significant scalp injury that led to a life-threatening soft-tissue infection. We present an analysis of the injury pattern as it relates to the bite marks, the resulting degloving injury, and the surgical reconstruction. We also offer a current survey of the pathogens often found in cat and mountain lion bite wounds and the appropriate antibiotic treatment. Given the infrequency with which clinicians encounter mountain lion injuries, we recommend that, after initial management and exclusion of life-threatening injuries, patients be transferred to a tertiary care facility capable of managing the various reconstructive challenges such as the one presented in this case.

  7. Correction of data truncation artifacts in differential phase contrast (DPC) tomosynthesis imaging

    NASA Astrophysics Data System (ADS)

    Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong

    2015-10-01

    The use of grating-based Talbot-Lau interferometry permits differential phase contrast (DPC) imaging with a conventional medical x-ray source and detector. However, due to the limited area of the gratings, the limited area of the detector, or both, data truncation artifacts are often observed in tomographic DPC acquisitions and reconstructions, such as tomosynthesis (limited-angle tomography). When data are truncated in conventional x-ray absorption tomosynthesis imaging, a variety of methods have been developed to mitigate the truncation artifacts. However, the same strategies do not yield satisfactory results in DPC tomosynthesis reconstruction. In this work, several new methods are proposed to mitigate data truncation artifacts in a DPC tomosynthesis system. The proposed methods have been validated using experimental data from a mammography accreditation phantom, a bovine udder, and several human cadaver breast specimens acquired on a bench-top DPC imaging system at our facility.

  8. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

    The objective of this study is to determine whether a safeguards approach can be developed for “black box” processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be prohibited access to the entire facility. Whether a black box process or facility is safeguardable depends upon the details of the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed upon the design information that can be provided to the IAEA. This analysis identified the necessary conditions for safeguardability of black box processes and facilities.

  9. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

    The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper will document the latest improvements of the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.

  10. Cryo-EM Structure Determination Using Segmented Helical Image Reconstruction.

    PubMed

    Fromm, S A; Sachse, C

    2016-01-01

    Treating helices as single-particle-like segments followed by helical image reconstruction has become the method of choice for high-resolution structure determination of well-ordered helical viruses as well as flexible filaments. In this review, we illustrate how the combination of the latest hardware developments with optimized image processing routines has led to a series of near-atomic resolution structures of helical assemblies. Originally, the treatment of helices as a sequence of segments followed by Fourier-Bessel reconstruction revealed the potential to determine near-atomic resolution structures from helical specimens. In the meantime, real-space image processing of helices in a stack of single particles was developed, enabling the structure determination of specimens that resisted classical Fourier helical reconstruction and facilitating high-resolution structure determination. Despite the progress in real-space analysis, the combination of Fourier and real-space processing is still commonly used to better estimate the symmetry parameters, as the imposition of the correct helical symmetry is essential for high-resolution structure determination. Recent hardware advancement through the introduction of direct electron detectors has significantly enhanced image quality and, together with improved image processing procedures, has made segmented helical reconstruction a very productive cryo-EM structure determination method. © 2016 Elsevier Inc. All rights reserved.

  11. Fetal brain volumetry through MRI volumetric reconstruction and segmentation

    PubMed Central

    Estroff, Judy A.; Barnewolt, Carol E.; Connolly, Susan A.; Warfield, Simon K.

    2013-01-01

    Purpose Fetal MRI volumetry is a useful technique, but it is limited by a dependency upon motion-free scans, tedious manual segmentation, and spatial inaccuracy due to thick-slice scans. An image processing pipeline that addresses these limitations was developed and tested. Materials and methods The principal sequences acquired in fetal MRI clinical practice are multiple orthogonal single-shot fast spin echo scans. State-of-the-art image processing techniques were used for inter-slice motion correction and super-resolution reconstruction of high-resolution volumetric images from these scans. The reconstructed volume images were processed with intensity non-uniformity correction, and the fetal brain was extracted using supervised automated segmentation. Results Reconstruction, segmentation, and volumetry of the fetal brains for a cohort of twenty-five clinically acquired fetal MRI scans were performed. Performance metrics for volume reconstruction, segmentation, and volumetry were determined by comparing to manual tracings in five randomly chosen cases. Finally, analysis of the fetal brain and parenchymal volumes was performed based on the gestational age of the fetuses. Conclusion The image processing pipeline developed in this study enables volume rendering and accurate fetal brain volumetry by addressing the limitations of current volumetry techniques, which include dependency on motion-free scans, manual segmentation, and inaccurate thick-slice interpolation. PMID:20625848

  12. Speckle Image Reconstruction.

    DTIC Science & Technology

    1985-04-01

    from observations using the University of Arizona 2.3 meter telescope, the Kitt Peak National Observatory 4 meter telescope and the Multiple Mirror...Telescope. Kitt Peak National Observatory, a division of the National Optical Astronomy Observatories, is operated by the Association of Universities for...Research in Astronomy, Inc., under contract to the National Science Foundation. The Multiple Mirror Telescope is a joint facility of the University

  13. Exploration, Sampling, And Reconstruction of Free Energy Surfaces with Gaussian Process Regression.

    PubMed

    Mones, Letif; Bernstein, Noam; Csányi, Gábor

    2016-10-11

    Practical free energy reconstruction algorithms involve three separate tasks: biasing, measuring some observable, and finally reconstructing the free energy surface from those measurements. In more than one dimension, adaptive schemes make it possible to explore only relatively low lying regions of the landscape by progressively building up the bias toward the negative of the free energy surface so that free energy barriers are eliminated. Most schemes use the final bias as their best estimate of the free energy surface. We show that large gains in computational efficiency, as measured by the reduction of time to solution, can be obtained by separating the bias used for dynamics from the final free energy reconstruction itself. We find that biasing with metadynamics, measuring a free energy gradient estimator, and reconstructing using Gaussian process regression can give an order of magnitude reduction in computational cost.
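
    As a minimal sketch of the final reconstruction step, Gaussian process regression with an RBF kernel recovers a smooth surface from noisy point samples via the standard posterior mean. This illustrates the generic GPR machinery only; the paper's estimator additionally exploits free energy gradient measurements, which this sketch omits.

```python
import numpy as np

def gp_reconstruct(x_train, y_train, x_test, length=0.2, sigma_f=1.0, noise=0.05):
    """Posterior mean of GP regression with an RBF kernel:

        mean = k(X*, X) @ (k(X, X) + noise^2 I)^{-1} y

    (Illustrative sketch of GPR smoothing, not the paper's gradient-based
    free energy estimator.)
    """
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sigma_f**2 * np.exp(-0.5 * (d / length) ** 2)
    K = k(x_train, x_train) + noise**2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)   # (K + noise^2 I)^{-1} y
    return k(x_test, x_train) @ alpha
```

    The separation the authors advocate corresponds to feeding measurements gathered under one bias (e.g. metadynamics) into a regression like this, rather than reading the free energy off the bias itself.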

  14. Reconstructing the interaction between dark energy and dark matter using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Guo, Zong-Kuan; Cai, Rong-Gen

    2015-06-01

    We present a nonparametric approach to reconstruct the interaction between dark energy and dark matter directly from SNIa Union 2.1 data using Gaussian processes, a fully Bayesian approach for smoothing data. In this method, once the equation of state (w) of dark energy is specified, the interaction can be reconstructed as a function of redshift. For the decaying vacuum energy case with w = -1, the reconstructed interaction is consistent with the standard ΛCDM model; namely, there is no evidence for an interaction. This also holds for the constant-w cases from -0.9 to -1.1 and for the Chevallier-Polarski-Linder (CPL) parametrization. If the equation of state deviates appreciably from -1, the reconstructed interaction exists at the 95% confidence level. This reflects the degeneracy between the interaction and the equation of state of dark energy when they are constrained by the observational data.

  15. Optimization of Stereo Matching in 3D Reconstruction Based on Binocular Vision

    NASA Astrophysics Data System (ADS)

    Gai, Qiyang

    2018-01-01

    Stereo matching is one of the key steps of 3D reconstruction based on binocular vision. In order to improve the convergence speed and accuracy of 3D reconstruction based on binocular vision, this paper combines the epipolar constraint with an ant colony algorithm. The epipolar line constraint is used to reduce the search range, and an ant colony algorithm then optimizes the stereo matching feature search function within that reduced range. By establishing an ant colony optimization model of the stereo matching process, a globally optimized stereo matching solution for binocular 3D reconstruction is obtained. The simulation results show that, by combining the advantages of the epipolar constraint and the ant colony algorithm, the stereo matching range is simplified and the convergence speed and accuracy of the matching process are improved.
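
    The role of the epipolar constraint can be sketched as follows: for rectified images, the 2-D correspondence search collapses to a 1-D search along the same image row. In this sketch a plain sum-of-absolute-differences scan stands in for the paper's ant colony search over that reduced range:

```python
import numpy as np

def match_along_epipolar(left, right, row, col, half=2, max_disp=10):
    """Find the disparity of a left-image patch by scanning the right
    image along the same row (the epipolar constraint for rectified
    pairs). SAD replaces the ant-colony search used in the paper."""
    patch = left[row - half:row + half + 1, col - half:col + half + 1]
    best_d, best_cost = 0, np.inf
    for d in range(0, max_disp + 1):
        c = col - d                       # candidate column in right image
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        cost = np.abs(patch - cand).sum() # sum of absolute differences
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

    Reducing the search to one scanline is precisely what shrinks the state space the ant colony must explore, which is where the convergence speed-up comes from.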

  16. Kinematic reconstruction in cardiovascular imaging.

    PubMed

    Bastarrika, G; Huebra Rodríguez, I J González de la; Calvo-Imirizaldu, M; Suárez Vega, V M; Alonso-Burgos, A

    2018-05-17

    Advances in clinical applications of computed tomography have been accompanied by improvements in advanced post-processing tools. In addition to multiplanar reconstructions, curved planar reconstructions, maximum intensity projections, and volumetric reconstructions, kinematic reconstruction has very recently been developed. This new technique, based on mathematical models that simulate the propagation of light beams through a volume of data, makes it possible to obtain very realistic three-dimensional images. This article illustrates examples of kinematic reconstructions and compares them with classical volumetric reconstructions in patients with cardiovascular disease, in a way that makes it easy to establish the differences between the two types of reconstruction. Kinematic reconstruction is a new method for representing three-dimensional images that facilitates the explanation and comprehension of the findings. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  17. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Reconstruction of vector physical fields by optical tomography

    NASA Astrophysics Data System (ADS)

    Kulchin, Yurii N.; Vitrik, O. B.; Kamenev, O. T.; Kirichenko, O. V.; Petrov, Yu S.

    1995-10-01

    Reconstruction of vector physical fields by optical tomography, with the aid of a system of fibre-optic measuring lines, is considered. The reported experimental results are used to reconstruct the distribution of the square of the gradient of transverse displacements of a flat membrane.

  18. 42 CFR 82.33 - How will NIOSH inform the public of changes to the scientific elements underlying the dose...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... scientific elements underlying the dose reconstruction process? 82.33 Section 82.33 Public Health PUBLIC... RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Updating the Scientific Elements Underlying Dose Reconstructions...

  19. 42 CFR 82.33 - How will NIOSH inform the public of changes to the scientific elements underlying the dose...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... scientific elements underlying the dose reconstruction process? 82.33 Section 82.33 Public Health PUBLIC... RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Updating the Scientific Elements Underlying Dose Reconstructions...

  20. 42 CFR 82.33 - How will NIOSH inform the public of changes to the scientific elements underlying the dose...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... scientific elements underlying the dose reconstruction process? 82.33 Section 82.33 Public Health PUBLIC... RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Updating the Scientific Elements Underlying Dose Reconstructions...

  1. 42 CFR 82.33 - How will NIOSH inform the public of changes to the scientific elements underlying the dose...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... scientific elements underlying the dose reconstruction process? 82.33 Section 82.33 Public Health PUBLIC... RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS COMPENSATION PROGRAM ACT OF 2000 Updating the Scientific Elements Underlying Dose Reconstructions...

  2. 42 CFR 82.31 - How can the public recommend changes to scientific elements underlying the dose reconstruction...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS... elements underlying the dose reconstruction process, based on relevant new research findings and...

  3. The pLISA project in ASTERICS

    NASA Astrophysics Data System (ADS)

    De Bonis, Giulia; Bozza, Cristiano

    2017-03-01

    In the framework of Horizon 2020, the European Commission approved the ASTERICS initiative (ASTronomy ESFRI and Research Infrastructure CluSter) to collect knowledge and experiences from astronomy, astrophysics and particle physics and foster synergies among existing research infrastructures and scientific communities, hence paving the way for future ones. ASTERICS aims at producing a common set of tools and strategies to be applied in Astronomy ESFRI facilities. In particular, it will target the so-called multi-messenger approach to combine information from optical and radio telescopes, photon counters and neutrino telescopes. pLISA is a software tool under development in ASTERICS to help and promote machine learning as a unified approach to multivariate analysis of astrophysical data and signals. The library will offer a collection of classification parameters, estimators, classes and methods to be linked and used in reconstruction programs (and possibly also extended), to characterize events in terms of particle identification and energy. The pLISA library aims at offering the software infrastructure for applications developed inside different experiments and has been designed with an effort to extrapolate general, physics-related estimators from the specific features of the data model related to each particular experiment. pLISA is oriented towards parallel computing architectures, with awareness of the opportunity of using GPUs as accelerators demanding specifically optimized algorithms and to reduce the costs of processing hardware requested for the reconstruction tasks. Indeed, a fast (ideally, real-time) reconstruction can open the way for the development or improvement of alert systems, typically required by multi-messenger search programmes among the different experimental facilities involved in ASTERICS.

  4. Studies of Missing Energy Decays at Belle II

    NASA Astrophysics Data System (ADS)

    Guan, Yinghui

    The Belle II experiment at the SuperKEKB collider is a major upgrade of the KEK “B factory” facility in Tsukuba, Japan. The machine is designed for an instantaneous luminosity of 8 × 10³⁵ cm⁻²s⁻¹, and the experiment is expected to accumulate a data sample of about 50 ab⁻¹. With this amount of data, decays sensitive to physics beyond the Standard Model can be studied with unprecedented precision. One promising set of modes consists of physics processes with missing energy, such as B⁺ → τ⁺ν, B → D(∗)τν, and B → K(∗)νν̄ decays. The B → K(∗)νν̄ decay provides one of the cleanest experimental probes of the flavour-changing neutral current process b → sνν̄, which is sensitive to physics beyond the Standard Model. However, the missing energy of the neutrinos in the final state makes the measurement challenging and requires full reconstruction of the spectator B meson in e⁺e⁻ → Υ(4S) → BB̄ events. This report discusses the expected sensitivities of Belle II for these rare decays.

  5. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  6. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  7. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 3 2011-07-01 2011-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  8. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... emissions from food processing facilities without any accompanying analyses demonstrating that these... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities...

  9. Monte Carlo-based fluorescence molecular tomography reconstruction method accelerated by a cluster of graphic processing units.

    PubMed

    Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming

    2011-02-01

    High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and different Green's functions representing the flux distribution in media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes using the calculated Green's functions. Phantom experiments have shown that only 10 min are required to obtain reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the advantages of high accuracy and suitability for 3-D heterogeneous media with refractive-index-unmatched boundaries from the MC simulation, the GPU cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.

  10. Feature-constrained surface reconstruction approach for point cloud data acquired with 3D laser scanner

    NASA Astrophysics Data System (ADS)

    Wang, Yongbo; Sheng, Yehua; Lu, Guonian; Tian, Peng; Zhang, Kai

    2008-04-01

    Surface reconstruction is an important task in the fields of 3D GIS, computer-aided design and computer graphics (CAD & CG), virtual simulation, and so on. Building on available incremental surface reconstruction methods, a feature-constrained surface reconstruction approach for point clouds is presented. First, features are extracted from the point cloud using curvature extremes and a minimum spanning tree. By projecting local sample points onto fitted tangent planes and using the extracted features to guide and constrain local triangulation and surface propagation, the topological relationships among sample points are obtained. For the constructed models, a process of consistent normal adjustment and regularization adjusts the normal of each face so that a correct surface model is achieved. Experiments show that the presented approach inherits the convenient implementation and high efficiency of traditional incremental surface reconstruction while avoiding improper propagation of normals across sharp edges, which greatly improves the applicability of incremental reconstruction. Moreover, an appropriate k-neighborhood helps to recognize insufficiently sampled areas and boundary parts, so the presented approach can reconstruct both open and closed surfaces without additional intervention.

  11. Automated Finite State Workflow for Distributed Data Production

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
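
    The finite-state-checking-with-retries pattern can be sketched as a tiny state machine. The state names and retry policy below are illustrative assumptions, not STAR's actual framework:

```python
def run_with_retries(job, max_retries=3):
    """Tiny finite-state sketch: a job moves SUBMITTED -> RUNNING -> DONE,
    and any failure sends it back to SUBMITTED until retries are
    exhausted. (State names and policy are assumed for illustration.)"""
    state, attempts = "SUBMITTED", 0
    while state != "DONE":
        if state == "SUBMITTED":
            if attempts > max_retries:
                return "FAILED"          # give up after max_retries resubmits
            attempts += 1
            state = "RUNNING"
        elif state == "RUNNING":
            state = "DONE" if job() else "SUBMITTED"
    return "DONE"
```

    Checking a job's state explicitly, instead of assuming a submission succeeded, is what lets transient infrastructure failures be retried automatically rather than becoming lost data.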

  12. Parallel programming of gradient-based iterative image reconstruction schemes for optical tomography.

    PubMed

    Hielscher, Andreas H; Bartel, Sebastian

    2004-02-01

    Optical tomography (OT) is a fast developing novel imaging modality that uses near-infrared (NIR) light to obtain cross-sectional views of optical properties inside the human body. A major challenge remains the time-consuming, computationally intensive image reconstruction problem that converts NIR transmission measurements into cross-sectional images. To increase the speed of iterative image reconstruction schemes that are commonly applied for OT, we have developed and implemented several parallel algorithms on a cluster of workstations. Static process distribution as well as dynamic load balancing schemes suitable for heterogeneous clusters and varying machine performances are introduced and tested. The resulting algorithms are shown to accelerate the reconstruction process to various degrees, substantially reducing the computation times for clinically relevant problems.
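
    A dynamic load-balancing scheme of the kind described — idle workers pull the next task, so faster machines naturally process more of them — can be sketched with a shared queue. This is a toy threads-based illustration of the scheduling idea, not the authors' cluster code:

```python
from concurrent.futures import ThreadPoolExecutor
import queue

def run_dynamic(tasks, n_workers, work_fn):
    """Dynamic load balancing: workers pull tasks from a shared queue as
    they become free, so no static chunk assignment is needed and faster
    workers automatically take on more work (sketch of the idea applied
    across heterogeneous cluster nodes in the paper)."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)

    def worker():
        out = []
        while True:
            try:
                t = q.get_nowait()   # pull the next task, if any remain
            except queue.Empty:
                return out
            out.append(work_fn(t))

    results = []
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        for f in [ex.submit(worker) for _ in range(n_workers)]:
            results.extend(f.result())
    return results
```

    Static distribution, by contrast, would split the task list into fixed chunks up front, leaving fast workers idle while slow ones finish their share.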

  13. Retrospective reconstruction of Iodine-131 distribution through the analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Ohno, Takeshi; Mao, Wei

    2017-09-01

    The iodine-131 distribution released by the Fukushima Dai-ichi Nuclear Power Plant accident was reconstructed through iodine-129 measurements. Iodine was extracted from nearly 1,000 surface soil samples by the pyrohydrolysis method, then mixed with carrier, purified, and finally collected as silver iodide. Each silver iodide sample was pressed into a cathode holder and set at the ion source of the MALT facility, The University of Tokyo. The isotopic ratio 129I/127I was measured by means of Accelerator Mass Spectrometry. From the 129I data obtained, a 131I deposition map was constructed. The map shows various fine structures that could be estimated neither from the simulation nor from the 137Cs distribution.

  14. Accelerating Advanced MRI Reconstructions on GPUs

    PubMed Central

    Stone, S.S.; Haldar, J.P.; Tsao, S.C.; Hwu, W.-m.W.; Sutton, B.P.; Liang, Z.-P.

    2008-01-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA’s Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%. PMID:21796230

  15. A survey of GPU-based acceleration techniques in MRI reconstructions

    PubMed Central

    Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou

    2018-01-01

    Image reconstruction in clinical magnetic resonance imaging (MRI) applications has become increasingly complicated, yet diagnosis and treatment require very fast computational procedures. Modern graphics processing unit (GPU) platforms have made high-performance parallel computation available, and attractive to common consumers, for massively parallel reconstruction problems at commodity prices. GPUs have also become more and more important for reconstruction computations, especially as deep learning begins to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and to provide a summary reference for researchers in the MRI community. PMID:29675361

  16. A survey of GPU-based acceleration techniques in MRI reconstructions.

    PubMed

    Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou; Liang, Dong

    2018-03-01

    Image reconstruction in clinical magnetic resonance imaging (MRI) applications has become increasingly complicated, yet diagnosis and treatment require very fast computational procedures. Modern graphics processing unit (GPU) platforms have made high-performance parallel computation available, and attractive to common consumers, for massively parallel reconstruction problems at commodity prices. GPUs have also become more and more important for reconstruction computations, especially as deep learning begins to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and to provide a summary reference for researchers in the MRI community.

  17. Accelerating Advanced MRI Reconstructions on GPUs.

    PubMed

    Stone, S S; Haldar, J P; Tsao, S C; Hwu, W-M W; Sutton, B P; Liang, Z-P

    2008-10-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA's Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%.

  18. UK-based prospective cohort study to anglicise and validate the FACE-Q Skin Cancer Module in patients with facial skin cancer undergoing surgical reconstruction: the PROMISCR (Patient-Reported Outcome Measure in Skin Cancer Reconstruction) study.

    PubMed

    Dobbs, Thomas; Hutchings, Hayley A; Whitaker, Iain S

    2017-09-24

    Skin cancer is the most common malignancy worldwide, often occurring on the face, where the cosmetic outcome of treatment is paramount. A number of skin cancer-specific patient-reported outcome measures (PROMs) exist; however, none adequately considers the difference in type of reconstruction from a patient's point of view. It is the aim of this study to 'anglicise' (to UK English) a recently developed US PROM for facial skin cancer (the FACE-Q Skin Cancer Module) and to validate this UK version of the PROM. The validation will also involve an assessment of the items for relevance to facial reconstruction patients. This will either validate this new measure for use in the clinical care and research of various facial reconstructive options, or provide evidence that a more specific PROM is required. This is a prospective validation study of the FACE-Q Skin Cancer Module in a UK facial skin cancer population with a specific focus on the difference between types of reconstruction. The face and content validity of the FACE-Q questionnaire will initially be assessed by a review process involving patients, skin cancer specialists and methodologists. An assessment will be made of whether the questions are relevant and whether any are missing. Initial validation will then be carried out by recruiting a cohort of 100 study participants with skin cancer of the face preoperatively. All eligible patients will be invited to complete the questionnaire preoperatively and postoperatively. Psychometric analysis will be performed to test validity, reliability and responsiveness to change. Subgroup analysis will be performed on patients undergoing different forms of reconstruction postexcision of their skin cancer. This study has been approved by the West Midlands, Edgbaston Research Ethics Committee (Ref 16/WM/0445). All personal data collected will be anonymised and patient-specific data will only be reported in terms of group demographics.
Identifiable data collected will include the patient name and date of birth. Other collected personal data will include their diagnosis, treatment performed, method of reconstruction and complications. A unique identifier will be applied to each patient so that pretreatment and post-treatment questionnaire results can be compared. All data acquisition and storage will be in accordance with the Data Protection Act 1998. Following completion of the study, all records will be stored in the Abertawe Bro Morgannwg University (ABMU) Health Board archive facility. Only qualified personnel working on the project will have access to the data. The outputs from this work will be published as widely as possible in peer-reviewed journals, and it is our aim to make this open access. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. 40 CFR 60.706 - Reconstruction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Volatile Organic Compound Emissions From Synthetic Organic Chemical Manufacturing Industry (SOCMI) Reactor Processes § 60.706 Reconstruction. (a) For...

  20. 40 CFR 60.706 - Reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Volatile Organic Compound Emissions From Synthetic Organic Chemical Manufacturing Industry (SOCMI) Reactor Processes § 60.706 Reconstruction. (a) For...

  1. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  2. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  3. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  4. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  5. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  6. Syntactic Reconstruction and Reanalysis, Semantic Dead Ends, and Prefrontal Cortex

    ERIC Educational Resources Information Center

    Christensen, Ken Ramshoj

    2010-01-01

    The left inferior frontal gyrus (LIFG) has been found to be crucially involved in syntactic processing of various kinds. This study investigates the cortical effects of two types of syntactic processes: (i) Reconstruction in ellipsis (recovery of left-out material given by context, "More people have been to Paris than" [...] "to…

  7. A simple measurement method of molecular relaxation in a gas by reconstructing acoustic velocity dispersion

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Zhang, Xiangqun; Li, Caiyun

    2018-01-01

    Recently, a decomposition method of acoustic relaxation absorption spectra was used to capture the entire molecular multimode relaxation process of gas. In this method, the acoustic attenuation and phase velocity were measured jointly based on the relaxation absorption spectra. However, fast and accurate measurements of the acoustic attenuation remain challenging. In this paper, we present a method of capturing the molecular relaxation process by only measuring acoustic velocity, without the necessity of obtaining acoustic absorption. The method is based on the fact that the frequency-dependent velocity dispersion of a multi-relaxation process in a gas is the serial connection of the dispersions of interior single-relaxation processes. Thus, one can capture the relaxation times and relaxation strengths of N decomposed single-relaxation dispersions to reconstruct the entire multi-relaxation dispersion using the measurements of acoustic velocity at 2N + 1 frequencies. The reconstructed dispersion spectra are in good agreement with experimental data for various gases and mixtures. The simulations also demonstrate the robustness of our reconstructive method.
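The serial-connection idea lends itself to a compact numerical sketch. The snippet below assumes a textbook single-relaxation dispersion form, c²(f) = c₀² + Δ·(ωτ)²/(1 + (ωτ)²); the paper's exact expressions, and every parameter value here, are illustrative assumptions rather than figures from the study.

```python
import numpy as np

def single_relaxation_term(f, strength, tau):
    """Dispersive contribution of one relaxation process to the squared
    velocity: strength * (w*tau)^2 / (1 + (w*tau)^2), with w = 2*pi*f.
    (Assumed textbook form; the paper's exact expression may differ.)"""
    x = (2.0 * np.pi * f * tau) ** 2
    return strength * x / (1.0 + x)

def multi_relaxation_velocity(f, c0, strengths, taus):
    """'Serial connection': the multi-relaxation dispersion is the sum of
    the interior single-relaxation dispersions on top of c0."""
    c2 = c0 ** 2 + sum(single_relaxation_term(f, s, t)
                       for s, t in zip(strengths, taus))
    return np.sqrt(c2)

# Two relaxation processes (N = 2), so 2N + 1 = 5 velocity samples suffice
# to determine c0 plus the (strength, tau) pair of each process by fitting.
freqs = np.logspace(2, 6, 5)                            # 5 sample frequencies [Hz]
c = multi_relaxation_velocity(freqs, c0=340.0,
                              strengths=[500.0, 1500.0],  # m^2/s^2 (assumed)
                              taus=[1e-3, 1e-5])          # s (assumed)
```

Fitting the five sampled velocities with a least-squares routine would recover the relaxation times and strengths; in practice this requires the relaxation times to be reasonably well separated.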

  8. Current strategies with 1-stage prosthetic breast reconstruction

    PubMed Central

    2015-01-01

    Background: One-stage prosthetic breast reconstruction is gaining traction as a preferred method of breast reconstruction in select patients who undergo mastectomy for cancer or prevention. Methods: Critical elements of the procedure, including patient selection, technique, surgical judgment, and postoperative care, were reviewed. Results: Outcome series reveal that in properly selected patients, direct-to-implant (DTI) reconstruction has similarly low rates of complications and high rates of patient satisfaction compared to traditional 2-stage reconstruction. Conclusions: One-stage prosthetic breast reconstruction may be the procedure of choice in select patients undergoing mastectomy. Advantages include the potential for the entire reconstructive process to be completed in one surgery, the quick return to normal activities, and the lack of donor site morbidity. PMID:26005643

  9. A protocol for generating a high-quality genome-scale metabolic reconstruction.

    PubMed

    Thiele, Ines; Palsson, Bernhard Ø

    2010-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the last 10 years. These reconstructions represent structured knowledge bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates a myriad of computational biological studies, including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge bases. Here we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction, as well as the common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process.

  10. A protocol for generating a high-quality genome-scale metabolic reconstruction

    PubMed Central

    Thiele, Ines; Palsson, Bernhard Ø.

    2011-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the past 10 years. These reconstructions represent structured knowledge-bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates myriad computational biological studies including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics, and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge-bases. Here, we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction as well as common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process. PMID:20057383
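The "conversion of a reconstruction into a mathematical format" typically means assembling a stoichiometric matrix S and analysing flux vectors v that satisfy the steady-state mass balance S·v = 0. The toy network below (three invented reactions forming a cycle; the names are illustrative, not taken from the protocol) shows the mechanics:

```python
# Hypothetical toy network: three reactions forming a cycle A -> B -> C -> A.
reactions = {
    "R1": {"A": -1, "B": 1},   # A -> B
    "R2": {"B": -1, "C": 1},   # B -> C
    "R3": {"C": -1, "A": 1},   # C -> A
}

metabolites = sorted({m for stoich in reactions.values() for m in stoich})
rxn_ids = sorted(reactions)

# Stoichiometric matrix S: rows = metabolites, columns = reactions.
S = [[reactions[r].get(m, 0) for r in rxn_ids] for m in metabolites]

# Steady-state mass balance S @ v = 0 for a candidate flux vector v.
v = [1, 1, 1]                  # equal flux around the cycle
residual = [sum(S[i][j] * v[j] for j in range(len(v))) for i in range(len(S))]
```

A genome-scale reconstruction builds the same matrix with thousands of reactions, and constraint-based methods such as flux balance analysis then optimise over the flux vectors in its null space.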

  11. Access to breast reconstruction after mastectomy and patient perspectives on reconstruction decision making.

    PubMed

    Morrow, Monica; Li, Yun; Alderman, Amy K; Jagsi, Reshma; Hamilton, Ann S; Graff, John J; Hawley, Sarah T; Katz, Steven J

    2014-10-01

    Most women undergoing mastectomy for breast cancer do not undergo breast reconstruction. To examine correlates of breast reconstruction after mastectomy and to determine if a significant unmet need for reconstruction exists. We used Surveillance, Epidemiology, and End Results registries from Los Angeles, California, and Detroit, Michigan, for rapid case ascertainment to identify a sample of women aged 20 to 79 years diagnosed as having ductal carcinoma in situ or stages I to III invasive breast cancer. Black and Latina women were oversampled to ensure adequate representation of racial/ethnic minorities. Eligible participants were able to complete a survey in English or Spanish. Of 3252 women sent the initial survey a median of 9 months after diagnosis, 2290 completed it. Those who remained disease free were surveyed 4 years later to determine the frequency of immediate and delayed reconstruction and patient attitudes toward the procedure; 1536 completed the follow-up survey. The 485 who remained disease free at follow-up underwent analysis. Disease-free survival of breast cancer. Breast reconstruction at any time after mastectomy and patient satisfaction with different aspects of the reconstruction decision-making process. Response rates in the initial and follow-up surveys were 73.1% and 67.7%, respectively (overall, 49.4%). Of 485 patients reporting mastectomy at the initial survey and remaining disease free, 24.8% underwent immediate and 16.8% underwent delayed reconstruction (total, 41.6%). Factors significantly associated with not undergoing reconstruction were black race (adjusted odds ratio [AOR], 2.16 [95% CI, 1.11-4.20]; P = .004), lower educational level (AOR, 4.49 [95% CI, 2.31-8.72]; P < .001), increased age (AOR in 10-year increments, 2.53 [95% CI, 1.77-3.61]; P < .001), major comorbidity (AOR, 2.27 [95% CI, 1.01-5.11]; P = .048), and chemotherapy (AOR, 1.82 [95% CI, 0.99-3.31]; P = .05). 
Only 13.3% of women were dissatisfied with the reconstruction decision-making process, but dissatisfaction was higher among nonwhite patients in the sample (AOR, 2.87 [95% CI, 1.27-6.51]; P = .03). The most common patient-reported reasons for not having reconstruction were the desire to avoid additional surgery (48.5%) and the belief that it was not important (33.8%), but 36.3% expressed fear of implants. Reasons for avoiding reconstruction and systems barriers to care varied by race; barriers were more common among nonwhite participants. Residual demand for reconstruction at 4 years was low, with only 30 of 263 who did not undergo reconstruction still considering the procedure. Reconstruction rates largely reflect patient demand; most patients are satisfied with the decision-making process about reconstruction. Specific approaches are needed to address lingering patient-level and system factors with a negative effect on reconstruction among minority women.

  12. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.
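As a minimal illustration of grammar-driven reconstruction (the rules below are invented for this sketch and are not the paper's grammar), a set of production rules can expand a start symbol into a structured model hierarchy:

```python
# Illustrative toy shape grammar; symbols without a rule are terminals.
rules = {
    "Building":  ["Facade", "Roof"],
    "Facade":    ["Wall", "WindowRow"],
    "WindowRow": ["Window", "Window", "Window"],
}

def derive(symbol):
    """Recursively apply production rules to build the model hierarchy;
    in a real pipeline each terminal would be fitted to a 3D segment."""
    if symbol not in rules:
        return symbol
    return {symbol: [derive(s) for s in rules[symbol]]}

model_tree = derive("Building")
```

In the paper's setting, the segmented laser-scan shapes select which grammar and rules apply, and the derivation tree then drives assembly of the 3D model.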

  13. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    NASA Astrophysics Data System (ADS)

    Kadrmas, Dan J.; Frey, Eric C.; Karimi, Seemeen S.; Tsui, Benjamin M. W.

    1998-04-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with tracer, and also using experimentally acquired data with tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within the realm of clinically realistic times (under 10 minutes for image reconstruction).
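The two speed-ups can be sketched in a few lines: a scatter estimate evaluated on a down-sampled grid (coarse-grid modelling) and refreshed only on some iterations (intermittent RBSC). The scatter "model" below is a trivial stand-in, and the grid size, scale factor and iteration counts are assumptions for illustration, not values from the paper.

```python
import numpy as np

def coarse_scatter_estimate(activity, factor=4):
    """Evaluate the (stand-in) scatter model on a down-sampled grid, then
    up-sample: scatter is dominated by low-frequency information, so the
    coarse grid loses little accuracy while costing far less."""
    n = activity.shape[0]
    coarse = activity.reshape(n // factor, factor,
                              n // factor, factor).mean(axis=(1, 3))
    scatter_coarse = 0.1 * coarse            # stand-in for the physical model
    return np.repeat(np.repeat(scatter_coarse, factor, axis=0), factor, axis=1)

def osem_like_loop(activity, n_iter=6, scatter_every=2):
    """Intermittent RBSC: refresh the scatter term only every few iterations."""
    scatter = np.zeros_like(activity)
    for it in range(n_iter):
        if it % scatter_every == 0:
            scatter = coarse_scatter_estimate(activity)
        # ... forward-project (activity + scatter), compare with measured
        # projections, back-project the ratio, update the image (omitted) ...
    return scatter

scatter = osem_like_loop(np.ones((16, 16)))
```

Both tricks leave the compensation fully 3D; they only reduce how often, and at what resolution, the expensive scatter response is evaluated.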

  14. Single-image-based Modelling Architecture from a Historical Photograph

    NASA Astrophysics Data System (ADS)

    Dzwierzynska, Jolanta

    2017-10-01

    Historical photographs have proved very useful for providing a dimensional and geometrical analysis of buildings, as well as for generating a 3D reconstruction of the whole structure. The paper addresses the problem of analysing a single historical photograph and modelling an architectural object from it. In particular, it focuses on reconstructing the original look of the New-Town synagogue from a single historic photograph, when the camera calibration is completely unknown. Because the photograph faithfully followed the geometric rules of perspective, it was possible to develop and apply a method that obtains a correct 3D reconstruction of the building. The modelling process consisted of a series of familiar steps: feature extraction, determination of the base elements of perspective, dimensional analysis and 3D reconstruction. Simple formulas were proposed to estimate the location of characteristic points of the building in a 3D Cartesian coordinate system on the basis of their location in a 2D Cartesian coordinate system. The reconstruction process proceeded well, although slight corrections were necessary. It was possible to reconstruct the shape of the building in general, and two of its facades in detail. The reconstruction of the other two facades requires additional information or an additional picture. The success of the presented reconstruction method depends on the geometrical content of the photograph as well as the quality of the picture, which ensures the legibility of building edges. The presented method of reconstruction is a combination of the descriptive method of reconstruction and computer aid; therefore, it seems to be universal. It can prove useful for single-image-based modelling of architecture.
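Once the perspective construction yields the camera geometry, locating a 3D point from its 2D image often reduces to intersecting a viewing ray with a known plane. The sketch below is not the paper's formula set: the pinhole intrinsics, camera height and ground plane are hypothetical, and image y is taken to increase downwards.

```python
import numpy as np

# Hypothetical pinhole camera (all values are illustrative assumptions).
f_px = 800.0            # focal length in pixels
cx, cy = 320.0, 240.0   # principal point (image centre)

def ground_point_from_pixel(u, v, cam_height=1.5):
    """Back-project pixel (u, v) into a viewing ray and intersect it with
    the ground plane y = cam_height (camera at origin, y down, z forward)."""
    ray = np.array([(u - cx) / f_px, (v - cy) / f_px, 1.0])
    if ray[1] <= 0:
        raise ValueError("pixel is on or above the horizon")
    t = cam_height / ray[1]        # scale so the ray reaches the plane
    return t * ray                 # 3D point [x, y, z] in metres

p = ground_point_from_pixel(320.0, 440.0)   # a pixel below the image centre
```

The same ray-plane idea extends to facade planes once their orientation has been recovered from the vanishing points of the photograph.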

  15. Towards Better Calibration of Modern Palynological Data against Climate: A Case Study in Osaka Bay, Japan

    NASA Astrophysics Data System (ADS)

    Kitaba, I.; Nakagawa, T.; McClymont, E.; Dettman, D. L.; Yamada, K.; Takemura, K.; Hyodo, M.

    2014-12-01

    Many of the difficulties in fossil-pollen-based paleoclimate reconstruction in coastal regions derive from the complex sedimentary processes of the near-shore environment. In order to examine this problem, we carried out pollen analysis of surface sediments collected from 35 sites in Osaka Bay, Japan. Using the biomisation method, the surrounding vegetation was accurately reconstructed at all sites. Applying the modern analogue technique to the same data, however, led to reconstructed temperatures that were lower by ca. 5 deg. C and precipitation amounts higher by ca. 5000 mm than the current sea-level climate of the region. The range of reconstructed values was larger than the reconstruction error associated with the method. Principal component analysis shows that the surface pollen variation in Osaka Bay reflects sedimentary processes. This significant error in the quantitative climatic reconstruction using pollen data is attributed to the fact that the pollen assemblage is not determined solely by climate but reflects non-climatic influences. The accuracy and precision of climatic reconstruction can be improved significantly by expanding counts of minor taxa. Given this result, we re-examined the reconstructed climate using the Osaka Bay palynological record reported in Kitaba et al. (2013). The new method did not significantly alter the overall variation in the reconstructed climate, and thus we conclude that the reconstruction was generally reliable. However, some intervals were strongly affected by depositional environmental change. In these, a climate signal can be extracted by excluding the patterns that arise from coastal sedimentation.
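The modern analogue technique itself is simple to state: find the modern surface samples whose pollen spectra are most similar to the fossil spectrum (commonly by squared chord distance) and average their present-day climates. The data below are invented to show the mechanics and are not taken from the Osaka Bay dataset.

```python
import numpy as np

def squared_chord_distance(p, q):
    """Standard dissimilarity for pollen percentage data (proportions)."""
    return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def modern_analogue_estimate(fossil, modern_spectra, modern_climate, k=3):
    """Average the climate of the k most similar modern pollen samples."""
    d = np.array([squared_chord_distance(fossil, m) for m in modern_spectra])
    nearest = np.argsort(d)[:k]
    return modern_climate[nearest].mean()

# Hypothetical data: 4 modern samples with 3 pollen taxa (proportions).
modern = np.array([[0.6, 0.3, 0.1],
                   [0.5, 0.4, 0.1],
                   [0.1, 0.2, 0.7],
                   [0.2, 0.2, 0.6]])
temps = np.array([14.0, 13.0, 5.0, 6.0])   # mean annual temperature, deg C
fossil = np.array([0.55, 0.35, 0.10])
t_est = modern_analogue_estimate(fossil, modern, temps, k=2)
```

The bias the study reports arises upstream of this step: when sedimentary sorting distorts the pollen proportions, the nearest modern analogues come from the wrong climates.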

  16. Recent advances in the reconstruction of cranio-maxillofacial defects using computer-aided design/computer-aided manufacturing.

    PubMed

    Oh, Ji-Hyeon

    2018-12-01

    With the development of computer-aided design/computer-aided manufacturing (CAD/CAM) technology, it has become possible to reconstruct cranio-maxillofacial defects with more accurate preoperative planning, precise patient-specific implants (PSIs), and shorter operation times. The manufacturing processes include subtractive manufacturing and additive manufacturing and should be selected in consideration of the material type, available technology, post-processing, accuracy, lead time, properties, and surface quality. Materials such as titanium, polyethylene, polyetheretherketone (PEEK), hydroxyapatite (HA), poly-DL-lactic acid (PDLLA), polylactide-co-glycolide acid (PLGA), and calcium phosphate are used. Design methods for the reconstruction of cranio-maxillofacial defects include the use of a preoperative model printed from preoperative data, printing a cutting guide or template after virtual surgery, printing a post-virtual-surgery model from data reconstructed using a mirror image, and manufacturing PSIs directly from PSI data obtained after mirror-image reconstruction. By selecting the appropriate design method, manufacturing process, and implant material for each case, it is possible to achieve a more accurate surgical procedure, a reduced operation time, the prevention of various complications that can occur with the traditional method, and more predictable results.

  17. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    PubMed

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10⁻⁷. 
Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. An ultrafast, reliable and scalable 4D CBCT∕CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment.

  18. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment

    PubMed Central

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-01-01

    Purpose: Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. Methods: In this work, we accelerated the Feldkamp–Davis–Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Results: Speedup of reconstruction time is found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10⁻⁷. 
Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed successfully with identical results even when half of the nodes were manually terminated in the middle of the process. Conclusions: An ultrafast, reliable and scalable 4D CBCT/CT reconstruction method was developed using the MapReduce framework. Unlike other parallel computing approaches, the parallelization and speedup required little modification of the original reconstruction code. MapReduce provides an efficient and fault tolerant means of solving large-scale computing problems in a cloud computing environment. PMID:22149842
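The Map/Reduce decomposition described above can be mimicked in a few lines. The filtering and back-projection below are trivial stand-ins (the real pipeline applies the FDK ramp filter and cone-beam geometry, with Hadoop distributing the map tasks); only the structure, subsets handled by Map and partial volumes summed by Reduce, is the point of the sketch.

```python
import numpy as np

def map_backproject(projection_subset, vol_shape=(8, 8, 8)):
    """Map task: filter and back-project one subset of projections into a
    partial volume. Stand-ins replace the FDK filter and geometry."""
    partial = np.zeros(vol_shape)
    for proj in projection_subset:
        filtered = proj - proj.mean()          # stand-in for ramp filtering
        partial += np.abs(filtered).mean()     # stand-in for back-projection
    return partial

def reduce_aggregate(partials):
    """Reduce task: sum the partial back-projections into the full volume."""
    return np.sum(partials, axis=0)

projections = [np.arange(64.0).reshape(8, 8) * (i + 1) for i in range(20)]
subsets = [projections[i::4] for i in range(4)]          # 4 map tasks
volume = reduce_aggregate([map_backproject(s) for s in subsets])
```

Because back-projection is linear, the result is independent of how the projections are partitioned across map tasks, which is what makes FDK reconstruction embarrassingly parallel under MapReduce.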

  19. Development of a GNSS water vapour tomography system using algebraic reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Bender, Michael; Dick, Galina; Ge, Maorong; Deng, Zhiguo; Wickert, Jens; Kahle, Hans-Gert; Raabe, Armin; Tetzlaff, Gerd

    2011-05-01

    A GNSS water vapour tomography system developed to reconstruct spatially resolved humidity fields in the troposphere is described. The tomography system was designed to process the slant path delays of about 270 German GNSS stations in near real-time with a temporal resolution of 30 min, a horizontal resolution of 40 km and a vertical resolution of 500 m or better. After a short introduction to GPS slant delay processing, the framework of the GNSS tomography is described in detail. Different implementations of the iterative algebraic reconstruction techniques (ART) used to invert the linear inverse problem are discussed. It was found that the multiplicative techniques (MART) provide the best results with the least processing time; i.e., a tomographic reconstruction of about 26,000 slant delays on an 8280-cell grid can be obtained in less than 10 min. Different iterative reconstruction techniques are compared with respect to their convergence behaviour and some numerical parameters. The inversion can be considerably stabilized by using additional non-GNSS observations and implementing various constraints. Different strategies for initialising the tomography and utilizing extra information are discussed. Finally, an example of a reconstructed field of the wet refractivity is presented and compared to the corresponding distribution of the integrated water vapour, an analysis of a numerical weather model (COSMO-DE) and some radiosonde profiles.
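A minimal MART iteration is easy to sketch. This is one common variant of the multiplicative update (normalisation and relaxation choices differ across implementations), and the system matrix below is a two-cell toy rather than a GNSS slant-delay geometry:

```python
import numpy as np

def mart(A, y, n_iter=50, lam=1.0):
    """Multiplicative ART: for each observation i, update every unknown j by
    x_j <- x_j * (y_i / (A_i . x)) ** (lam * A_ij / max(A_i)).
    Updates are multiplicative, so x stays positive throughout."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            pred = A[i] @ x
            if pred <= 0.0 or y[i] <= 0.0:
                continue                      # skip degenerate rays
            x *= (y[i] / pred) ** (lam * A[i] / A[i].max())
    return x

# Toy 2-cell "tomography": recover the cell values from exact ray sums.
A = np.array([[1.0, 1.0],     # ray crossing both cells
              [1.0, 0.0]])    # ray crossing only the first cell
x_true = np.array([2.0, 3.0])
x_rec = mart(A, A @ x_true)
```

The positivity of the multiplicative update is one reason MART suits refractivity fields, and sweeping the rays one at a time keeps the per-step cost proportional to the number of crossed cells.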

  20. 3D prostate histology image reconstruction: Quantifying the impact of tissue deformation and histology section location

    PubMed Central

    Gibson, Eli; Gaed, Mena; Gómez, José A.; Moussa, Madeleine; Pautler, Stephen; Chin, Joseph L.; Crukley, Cathie; Bauman, Glenn S.; Fenster, Aaron; Ward, Aaron D.

    2013-01-01

    Background: Guidelines for localizing prostate cancer on imaging are ideally informed by registered post-prostatectomy histology. 3D histology reconstruction methods can support this by reintroducing 3D spatial information lost during histology processing. The need to register small, high-grade foci drives a need for high accuracy. Accurate 3D reconstruction method design is impacted by the answers to the following central questions of this work. (1) How does prostate tissue deform during histology processing? (2) What spatial misalignment of the tissue sections is induced by microtome cutting? (3) How does the choice of reconstruction model affect histology reconstruction accuracy? Materials and Methods: Histology, paraffin block face and magnetic resonance images were acquired for 18 whole mid-gland tissue slices from six prostates. 7-15 homologous landmarks were identified on each image. Tissue deformation due to histology processing was characterized using the target registration error (TRE) after landmark-based registration under four deformation models (rigid, similarity, affine and thin-plate-spline [TPS]). The misalignment of histology sections from the front faces of tissue slices was quantified using manually identified landmarks. The impact of reconstruction models on the TRE after landmark-based reconstruction was measured under eight reconstruction models comprising one of four deformation models with and without constraining histology images to the tissue slice front faces. Results: Isotropic scaling improved the mean TRE by 0.8-1.0 mm (all results reported as 95% confidence intervals), while skew or TPS deformation improved the mean TRE by <0.1 mm. The mean misalignment was 1.1-1.9° (angle) and 0.9-1.3 mm (depth). Using isotropic scaling, the front face constraint raised the mean TRE by 0.6-0.8 mm. 
Conclusions: For sub-millimeter accuracy, 3D reconstruction models should not constrain histology images to the tissue slice front faces and should be flexible enough to model isotropic scaling. PMID:24392245
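The landmark-based TRE comparison described above can be sketched numerically. The following is an illustrative NumPy version of rigid versus similarity (isotropic-scaling) Procrustes registration on synthetic landmarks; it is not the authors' pipeline, and the landmark values are made up:

```python
import numpy as np

def landmark_tre(fixed, moving, allow_scaling=False):
    """Mean target registration error (TRE) after least-squares landmark
    registration (Kabsch/Umeyama), with or without isotropic scaling."""
    fc = fixed - fixed.mean(axis=0)
    mc = moving - moving.mean(axis=0)
    U, s, Vt = np.linalg.svd(mc.T @ fc)             # cross-covariance SVD
    D = np.eye(fixed.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ D @ U.T                              # optimal rotation
    scale = (s * np.diag(D)).sum() / (mc ** 2).sum() if allow_scaling else 1.0
    aligned = scale * mc @ R.T + fixed.mean(axis=0)
    return float(np.linalg.norm(aligned - fixed, axis=1).mean())
```

If the tissue deformation is dominated by isotropic shrinkage, as the results suggest, the similarity model drives the TRE far below the rigid model, mirroring the reported 0.8-1.0 mm improvement.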

  1. A novel partial volume effects correction technique integrating deconvolution associated with denoising within an iterative PET image reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merlin, Thibaut, E-mail: thibaut.merlin@telecom-bretagne.eu; Visvikis, Dimitris; Fernandez, Philippe

    2015-02-15

Purpose: Partial volume effect (PVE) plays an important role in both qualitative and quantitative PET image accuracy, especially for small structures. A previously proposed voxelwise PVE correction method applied on PET reconstructed images involves the use of Lucy–Richardson deconvolution incorporating wavelet-based denoising to limit the associated propagation of noise. The aim of this study is to incorporate the deconvolution, coupled with the denoising step, directly inside the iterative reconstruction process to further improve PVE correction. Methods: The list-mode ordered subset expectation maximization (OSEM) algorithm has been modified accordingly with the application of the Lucy–Richardson deconvolution algorithm to the current estimation of the image at each reconstruction iteration. Acquisitions of the NEMA NU2-2001 IQ phantom were performed on a GE DRX PET/CT system to study the impact of incorporating the deconvolution inside the reconstruction [with and without the point spread function (PSF) model] in comparison to its application postreconstruction and to standard iterative reconstruction incorporating the PSF model. The impact of the denoising step was also evaluated. Images were semiquantitatively assessed by studying the trade-off between the intensity recovery and the noise level in the background, estimated as relative standard deviation. Qualitative assessments of the developed methods were additionally performed on clinical cases. Results: Incorporating the deconvolution without denoising within the reconstruction achieved superior intensity recovery in comparison to both standard OSEM reconstruction integrating a PSF model and application of the deconvolution algorithm in a postreconstruction process. The addition of the denoising step made it possible to limit the SNR degradation while preserving the intensity recovery. 
Conclusions: This study demonstrates the feasibility of incorporating the Lucy–Richardson deconvolution associated with a wavelet-based denoising in the reconstruction process to better correct for PVE. Future work includes further evaluations of the proposed method on clinical datasets and the use of improved PSF models.
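The update the authors embed inside OSEM is the Lucy–Richardson iteration. A minimal 1-D sketch of the standalone deconvolution (illustrative only: no list-mode reconstruction, PSF model, or wavelet denoising) looks like:

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=100):
    """Lucy-Richardson deconvolution (1-D sketch). Each iteration compares
    the measurement with the forward-blurred estimate and multiplies the
    estimate by the back-projected ratio, preserving non-negativity."""
    estimate = np.full_like(blurred, blurred.mean())
    psf_mirror = psf[::-1]                         # adjoint of convolution
    for _ in range(n_iter):
        forward = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(forward, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate
```

In the paper this multiplicative correction is applied to the current image estimate at every OSEM iteration rather than post-reconstruction.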

  2. 40 CFR 122.3 - Exclusions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... an energy or mining facility, a storage facility or a seafood processing facility, or when secured to a storage facility or a seafood processing facility, or when secured to the bed of the ocean...

  3. 40 CFR 122.3 - Exclusions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... an energy or mining facility, a storage facility or a seafood processing facility, or when secured to a storage facility or a seafood processing facility, or when secured to the bed of the ocean...

  4. 40 CFR 122.3 - Exclusions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... an energy or mining facility, a storage facility or a seafood processing facility, or when secured to a storage facility or a seafood processing facility, or when secured to the bed of the ocean...

  5. 40 CFR 122.3 - Exclusions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... an energy or mining facility, a storage facility or a seafood processing facility, or when secured to a storage facility or a seafood processing facility, or when secured to the bed of the ocean...

  6. 40 CFR 122.3 - Exclusions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... an energy or mining facility, a storage facility or a seafood processing facility, or when secured to a storage facility or a seafood processing facility, or when secured to the bed of the ocean...

  7. Updates in Head and Neck Reconstruction.

    PubMed

    Largo, Rene D; Garvey, Patrick B

    2018-02-01

    After reading this article, the participant should be able to: 1. Have a basic understanding of virtual planning, rapid prototype modeling, three-dimensional printing, and computer-assisted design and manufacture. 2. Understand the principles of combining virtual planning and vascular mapping. 3. Understand principles of flap choice and design in preoperative planning of free osteocutaneous flaps in mandible and midface reconstruction. 4. Discuss advantages and disadvantages of computer-assisted design and manufacture in reconstruction of advanced oncologic mandible and midface defects. Virtual planning and rapid prototype modeling are increasingly used in head and neck reconstruction with the aim of achieving superior surgical outcomes in functionally and aesthetically critical areas of the head and neck compared with conventional reconstruction. The reconstructive surgeon must be able to understand this rapidly-advancing technology, along with its advantages and disadvantages. There is no limit to the degree to which patient-specific data may be integrated into the virtual planning process. For example, vascular mapping can be incorporated into virtual planning of mandible or midface reconstruction. Representative mandible and midface cases are presented to illustrate the process of virtual planning. Although virtual planning has become helpful in head and neck reconstruction, its routine use may be limited by logistic challenges, increased acquisition costs, and limited flexibility for intraoperative modifications. Nevertheless, the authors believe that the superior functional and aesthetic results realized with virtual planning outweigh the limitations.

  8. Honduras: Political and Economic Situation and U.S. Relations

    DTIC Science & Technology

    2006-10-13

Amid the country’s hurricane reconstruction efforts, Honduras signed a poverty reduction and growth facility (PRGF) agreement with the International...macroeconomic discipline and to develop a comprehensive poverty reduction strategy. In February 2004, Honduras signed a three-year PRGF agreement...IMF Executive Board Completes Third Program and Financing Assurances Reviews under Honduras’ PRGF Arrangement,” Press Release No. 05/280, Dec. 16, 2005

  9. Reconstructing Space- and Energy-Dependent Exciton Generation in Solution-Processed Inverted Organic Solar Cells.

    PubMed

    Wang, Yuheng; Zhang, Yajie; Lu, Guanghao; Feng, Xiaoshan; Xiao, Tong; Xie, Jing; Liu, Xiaoyan; Ji, Jiahui; Wei, Zhixiang; Bu, Laju

    2018-04-25

Photon absorption-induced exciton generation plays an important role in determining the photovoltaic properties of donor/acceptor organic solar cells with an inverted architecture. However, the reconstruction of light harvesting, and thus of exciton generation, at different locations within an inverted organic device is still not well resolved. Here, we investigate the film depth-dependent light absorption spectra in a small molecule donor/acceptor film. Including depth-dependent spectra in an optical transfer matrix method allows us to reconstruct both film depth- and energy-dependent exciton generation profiles, from which the short-circuit current and external quantum efficiency of the inverted device are simulated and compared with the experimental measurements. The film depth-dependent spectroscopy, from which we are able to simultaneously reconstruct the light harvesting profile, depth-dependent composition distribution, and vertical energy level variations, provides insights into the photovoltaic process. In combination with appropriate material processing methods and device architecture, the method proposed in this work will help optimize film depth-dependent optical/electronic properties for high-performance solar cells.
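As a much-simplified sketch of depth-dependent generation (Beer-Lambert attenuation only, ignoring the thin-film interference that the paper's transfer-matrix method captures, with made-up parameter values):

```python
import numpy as np

def generation_profile(alpha, depths, i0=1.0):
    """Local absorption rate vs. film depth under Beer-Lambert attenuation:
    G(z) = alpha * I(z) with I(z) = I0 * exp(-alpha * z).
    A transfer-matrix treatment would add interference between layers."""
    return alpha * i0 * np.exp(-alpha * depths)
```

Integrating the profile over the film thickness recovers the total absorbed fraction 1 - exp(-alpha * d), a quick sanity check on any such generation model.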

  10. Accelerated numerical processing of electronically recorded holograms with reduced speckle noise.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2013-09-01

    The numerical reconstruction of digitally recorded holograms suffers from speckle noise. An accelerated method that uses general-purpose computing in graphics processing units to reduce that noise is shown. The proposed methodology utilizes parallelized algorithms to record, reconstruct, and superimpose multiple uncorrelated holograms of a static scene. For the best tradeoff between reduction of the speckle noise and processing time, the method records, reconstructs, and superimposes six holograms of 1024 × 1024 pixels in 68 ms; for this case, the methodology reduces the speckle noise by 58% compared with that exhibited by a single hologram. The fully parallelized method running on a commodity graphics processing unit is one order of magnitude faster than the same technique implemented on a regular CPU using its multithreading capabilities. Experimental results are shown to validate the proposal.
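The contrast reduction behind this result can be illustrated with a toy simulation. Fully developed speckle is commonly modeled with unit-mean exponential intensity statistics, and averaging N uncorrelated realizations lowers the speckle contrast roughly as 1/sqrt(N); this sketch is not the authors' GPU code:

```python
import numpy as np

def speckle_realization(scene, rng):
    """One fully developed speckle pattern: the scene intensity modulated
    by unit-mean exponential noise (a standard simple model)."""
    return scene * rng.exponential(1.0, size=scene.shape)

def speckle_contrast(img):
    """Speckle contrast: standard deviation over mean intensity."""
    return img.std() / img.mean()

rng = np.random.default_rng(7)
scene = np.ones((64, 64))                     # static, uniform object
single = speckle_realization(scene, rng)
# superimposing six uncorrelated reconstructions, as in the paper
averaged = np.mean([speckle_realization(scene, rng) for _ in range(6)], axis=0)
```

For N = 6, the expected contrast drop is 1 - 1/sqrt(6) ≈ 0.59, consistent with the roughly 58% noise reduction reported.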

  11. Exploration Flight Test 1 Afterbody Aerothermal Environment Reconstruction

    NASA Technical Reports Server (NTRS)

    Hyatt, Andrew J.; Oliver, Brandon; Amar, Adam; Lessard, Victor

    2016-01-01

The Exploration Flight Test 1 vehicle included roughly 100 near-surface thermocouples on the afterbody of the vehicle. The temperature traces at each of these instruments have been used to perform inverse environment reconstruction to determine the aerothermal environment experienced during re-entry of the vehicle. This paper provides an overview of the reconstructed environments and identifies critical aspects of the environment, including transition and reaction control system jet influence. A blind test of the process and reconstruction tool was also performed to build confidence in the reconstructed environments. Finally, an uncertainty quantification analysis was performed to identify the impact of each of the uncertainties on the reconstructed environments.

  12. Reconstruction of atmospheric pollutant concentrations from remote sensing data - An application of distributed parameter observer theory

    NASA Technical Reports Server (NTRS)

    Koda, M.; Seinfeld, J. H.

    1982-01-01

    The reconstruction of a concentration distribution from spatially averaged and noise-corrupted data is a central problem in processing atmospheric remote sensing data. Distributed parameter observer theory is used to develop reconstructibility conditions for distributed parameter systems having measurements typical of those in remote sensing. The relation of the reconstructibility condition to the stability of the distributed parameter observer is demonstrated. The theory is applied to a variety of remote sensing situations, and it is found that those in which concentrations are measured as a function of altitude satisfy the conditions of distributed state reconstructibility.

  13. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    PubMed

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework integrating cloud and wireless body sensor networks, mainly applied to fall detection and 3-D motion reconstruction. The main focuses of this study include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced to enhance overall performance for the services of fall event detection and 3-D motion reconstruction.

  14. A very efficient RCS data compression and reconstruction technique, volume 4

    NASA Technical Reports Server (NTRS)

    Tseng, N. Y.; Burnside, W. D.

    1992-01-01

A very efficient compression and reconstruction scheme for RCS measurement data was developed. The compression is done by isolating the scattering mechanisms on the target and recording their individual responses in the frequency and azimuth scans, respectively. The reconstruction, which is an inverse process of the compression, is guaranteed by the sampling theorem. Two sets of data, the corner reflectors and the F-117 fighter model, were processed and the results were shown to be convincing. The compression ratio can be as large as several hundred, depending on the target's geometry and scattering characteristics.

  15. Education for Reconstruction. The Regeneration of Educational Capacity Following National Upheaval. Oxford Studies in Comparative Education.

    ERIC Educational Resources Information Center

    Arnhold, Nina; Bekker, Julia; Kersh, Natasha; McLeish, Elizabeth; Phillips, David

    This report examines the main questions that need to be addressed by agencies concerned with processes of reconstruction in countries that have experienced crisis (e.g., war, natural disaster, and extreme political and economic upheaval). The report focuses on educational reconstruction in its various manifestations. Within each heading, the…

  16. In vitro cytotoxicity and surface topography evaluation of additive manufacturing titanium implant materials.

    PubMed

    Tuomi, Jukka T; Björkstrand, Roy V; Pernu, Mikael L; Salmi, Mika V J; Huotilainen, Eero I; Wolff, Jan E H; Vallittu, Pekka K; Mäkitie, Antti A

    2017-03-01

    Custom-designed patient-specific implants and reconstruction plates are to date commonly manufactured using two different additive manufacturing (AM) technologies: direct metal laser sintering (DMLS) and electron beam melting (EBM). The purpose of this investigation was to characterize the surface structure and to assess the cytotoxicity of titanium alloys processed using DMLS and EBM technologies as the existing information on these issues is scarce. "Processed" and "polished" DMLS and EBM disks were assessed. Microscopic examination revealed titanium alloy particles and surface flaws on the processed materials. These surface flaws were subsequently removed by polishing. Surface roughness of EBM processed titanium was higher than that of DMLS processed. The cytotoxicity results of the DMLS and EBM discs were compared with a "gold standard" commercially available titanium mandible reconstruction plate. The mean cell viability for all discs was 82.6% (range, 77.4 to 89.7) and 83.3% for the control reconstruction plate. The DMLS and EBM manufactured titanium plates were non-cytotoxic both in "processed" and in "polished" forms.

  17. Towards extensive spatio-temporal reconstructions of North American land cover: a comparison of state-of-the-art pollen-vegetation models

    NASA Astrophysics Data System (ADS)

    Dawson, A.; Trachsel, M.; Goring, S. J.; Paciorek, C. J.; McLachlan, J. S.; Jackson, S. T.; Williams, J. W.

    2017-12-01

Pollen records have been extensively used to reconstruct past changes in vegetation and study the underlying processes. However, developing the statistical techniques needed to accurately represent both data and process uncertainties is a formidable challenge. Recent advances in paleoecoinformatics (e.g. the Neotoma Paleoecology Database and the European Pollen Database), Bayesian age-depth models, process-based pollen-vegetation models, and Bayesian hierarchical modeling have pushed paleovegetation reconstructions forward to a point where multiple sources of uncertainty can be incorporated into reconstructions, which in turn enables new hypotheses to be tested and more rigorous integration of paleovegetation data with earth system models and terrestrial ecosystem models. Several kinds of pollen-vegetation models (PVMs) have been developed, notably LOVE/REVEALS, STEPPS, and classical transfer functions such as the modern analog technique. LOVE/REVEALS has been adopted as the standard method for the LandCover6k effort to develop quantitative reconstructions of land cover for the Holocene, while STEPPS has been developed recently as part of the PalEON project and applied to reconstruct, with uncertainty, shifts in forest composition in New England and the upper Midwest during the late Holocene. Each PVM has different assumptions and structure and uses different input data, but few comparisons among approaches yet exist. Here, we present new reconstructions of land cover change in northern North America during the Holocene based on LOVE/REVEALS and data drawn from the Neotoma database, and compare STEPPS-based reconstructions to those from LOVE/REVEALS. These parallel developments with LOVE/REVEALS provide an opportunity to compare and contrast models, and to begin to generate continental-scale reconstructions, with explicit uncertainties, that can provide a base for interdisciplinary research within the biogeosciences. 
We show how STEPPS provides an important benchmark for past land-cover reconstruction, and how the LandCover 6k effort in North America advances our understanding of the past by allowing cross-continent comparisons using standardized methods and quantifying the impact of humans in the early Anthropocene.

  18. 3D frequency-domain ultrasound waveform tomography breast imaging

    NASA Astrophysics Data System (ADS)

    Sandhu, Gursharan Yash; West, Erik; Li, Cuiping; Roy, Olivier; Duric, Neb

    2017-03-01

    Frequency-domain ultrasound waveform tomography is a promising method for the visualization and characterization of breast disease. It has previously been shown to accurately reconstruct the sound speed distributions of breasts of varying densities. The reconstructed images show detailed morphological and quantitative information that can help differentiate different types of breast disease including benign and malignant lesions. The attenuation properties of an ex vivo phantom have also been assessed. However, the reconstruction algorithms assumed a 2D geometry while the actual data acquisition process was not. Although clinically useful sound speed images can be reconstructed assuming this mismatched geometry, artifacts from the reconstruction process exist within the reconstructed images. This is especially true for registration across different modalities and when the 2D assumption is violated. For example, this happens when a patient's breast is rapidly sloping. It is also true for attenuation imaging where energy lost or gained out of the plane gets transformed into artifacts within the image space. In this paper, we will briefly review ultrasound waveform tomography techniques, give motivation for pursuing the 3D method, discuss the 3D reconstruction algorithm, present the results of 3D forward modeling, show the mismatch that is induced by the violation of 3D modeling via numerical simulations, and present a 3D inversion of a numerical phantom.

  19. Accelerated Compressed Sensing Based CT Image Reconstruction.

    PubMed

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  20. Accelerated Compressed Sensing Based CT Image Reconstruction

    PubMed Central

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.

    2015-01-01

In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200
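Conventional CS reconstructions of the kind accelerated here are typically solved by iterative shrinkage. A generic ISTA sketch on a toy sparse-recovery problem follows; it is illustrative only, not the paper's weighted pseudopolar formulation, and the problem sizes are made up:

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding (ISTA) for the l1-regularized
    least-squares problem min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))            # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrink
    return x
```

With far fewer measurements than unknowns, the l1 penalty still recovers a sparse signal, which is the principle that lets CS produce diagnostic images from a limited number of projections.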

  1. A Polarized High-Energy Photon Beam for Production of Exotic Mesons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senderovich, Igor

    2012-01-01

This work describes design, prototyping and testing of various components of the Jefferson Lab Hall D photon beamline. These include coherent bremsstrahlung radiators to be used in this facility for generating the photon beam, a fine resolution hodoscope for the facility's tagging spectrometer, and a photon beam position sensor for stabilizing the beam on a collimator. The principal instrumentation project was the hodoscope: its design, implementation and beam testing will be thoroughly described. Studies of the coherent bremsstrahlung radiators involved X-ray characterization of diamond crystals to identify the appropriate line of manufactured radiators and the proper techniques for thinning them to the desired specification of the beamline. The photon beam position sensor project involved completion of a designed detector and its beam test. The results of these shorter studies will also be presented. The second part of this work discusses a Monte Carlo study of a possible photo-production and decay channel in the GlueX experiment that will be housed in the Hall D facility. Specifically, the γp → Xp → b1π → ωπ+π− channel was studied, including its amplitude analysis. This exercise attempted to generate a possible physics signal, complete with internal angular momentum states, and to reconstruct the signal in the detector and find the proper set of JPC quantum numbers through an amplitude fit. Derivation of the proper set of amplitudes in the helicity basis is described, followed by a discussion of the implementation, generation of the data sets, reconstruction techniques, the amplitude fit and results of this study.

  2. Gains in efficiency and scientific potential of continental climate reconstruction provided by the LRC LacCore Facility, University of Minnesota

    NASA Astrophysics Data System (ADS)

    Noren, A.; Brady, K.; Myrbo, A.; Ito, E.

    2007-12-01

    Lacustrine sediment cores comprise an integral archive for the determination of continental paleoclimate, for their potentially high temporal resolution and for their ability to resolve spatial variability in climate across vast sections of the globe. Researchers studying these archives now have a large, nationally-funded, public facility dedicated to the support of their efforts. The LRC LacCore Facility, funded by NSF and the University of Minnesota, provides free or low-cost assistance to any portion of research projects, depending on the specific needs of the project. A large collection of field equipment (site survey equipment, coring devices, boats/platforms, water sampling devices) for nearly any lacustrine setting is available for rental, and Livingstone-type corers and drive rods may be purchased. LacCore staff can accompany field expeditions to operate these devices and curate samples, or provide training prior to device rental. The Facility maintains strong connections to experienced shipping agents and customs brokers, which vastly improves transport and importation of samples. In the lab, high-end instrumentation (e.g., multisensor loggers, high-resolution digital linescan cameras) provides a baseline of fundamental analyses before any sample material is consumed. LacCore staff provide support and training in lithological description, including smear-slide, XRD, and SEM analyses. The LRC botanical macrofossil reference collection is a valuable resource for both core description and detailed macrofossil analysis. Dedicated equipment and space for various subsample analyses streamlines these endeavors; subsamples for several analyses may be submitted for preparation or analysis by Facility technicians for a fee (e.g., carbon and sulfur coulometry, grain size, pollen sample preparation and analysis, charcoal, biogenic silica, LOI, freeze drying). 
The National Lacustrine Core Repository now curates ~9km of sediment cores from expeditions around the world, and stores metadata and analytical data for all cores processed at the facility. Any researcher may submit sample requests for material in archived cores. Supplies for field (e.g., polycarbonate pipe, endcaps), lab (e.g., sample containers, pollen sample spike), and curation (e.g., D-tubes) are sold at cost. In collaboration with facility users, staff continually develop new equipment, supplies, and procedures as needed in order to provide the best and most comprehensive set of services to the research community.

  3. Unique life sciences research facilities at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems, and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  4. Accelerating image reconstruction in dual-head PET system by GPU and symmetry properties.

    PubMed

    Chou, Cheng-Ying; Dong, Yun; Hung, Yukai; Kao, Yu-Jiun; Wang, Weichung; Kao, Chien-Min; Chen, Chin-Tu

    2012-01-01

    Positron emission tomography (PET) is an important imaging modality in both clinical usage and research studies. We have developed a compact high-sensitivity PET system that consisted of two large-area panel PET detector heads, which produce more than 224 million lines of response and thus request dramatic computational demands. In this work, we employed a state-of-the-art graphics processing unit (GPU), NVIDIA Tesla C2070, to yield an efficient reconstruction process. Our approaches ingeniously integrate the distinguished features of the symmetry properties of the imaging system and GPU architectures, including block/warp/thread assignments and effective memory usage, to accelerate the computations for ordered subset expectation maximization (OSEM) image reconstruction. The OSEM reconstruction algorithms were implemented employing both CPU-based and GPU-based codes, and their computational performance was quantitatively analyzed and compared. The results showed that the GPU-accelerated scheme can drastically reduce the reconstruction time and thus can largely expand the applicability of the dual-head PET system.
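The OSEM algorithm these GPU kernels accelerate is built on the EM update. A minimal dense NumPy MLEM sketch follows (a single subset, a hypothetical random system matrix, and none of the symmetry or GPU memory optimizations the paper describes):

```python
import numpy as np

def mlem(P, counts, n_iter=100):
    """MLEM for emission tomography with system matrix P:
        x <- x / (P^T 1) * P^T (counts / (P x))
    OSEM accelerates this by cycling the same multiplicative update
    over subsets of the projection rows."""
    x = np.ones(P.shape[1])                      # uniform initial image
    sensitivity = P.sum(axis=0)                  # P^T 1
    for _ in range(n_iter):
        forward = np.maximum(P @ x, 1e-12)       # forward projection
        x = x / sensitivity * (P.T @ (counts / forward))
    return x
```

In a dual-head system with over 224 million lines of response, P is far too large to store densely; exploiting geometric symmetries to reuse matrix elements is what makes the GPU implementation tractable.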

  5. L'Aquila's reconstruction challenges: has Italy learned from its previous earthquake disasters?

    PubMed

    Ozerdem, Alpaslan; Rufini, Gianni

    2013-01-01

    Italy is an earthquake-prone country and its disaster emergency response experiences over the past few decades have varied greatly, with some being much more successful than others. Overall, however, its reconstruction efforts have been criticised for being ad hoc, delayed, ineffective, and untargeted. In addition, while the emergency relief response to the L'Aquila earthquake of 6 April 2009-the primary case study in this evaluation-seems to have been successful, the reconstruction initiative got off to a very problematic start. To explore the root causes of this phenomenon, the paper argues that, owing to the way in which Italian Prime Minister Silvio Berlusconi has politicised the process, the L'Aquila reconstruction endeavour is likely to suffer problems with local ownership, national/regional/municipal coordination, and corruption. It concludes with a set of recommendations aimed at addressing the pitfalls that may confront the L'Aquila reconstruction process over the next few years. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  6. State machine analysis of sensor data from dynamic processes

    DOEpatents

    Cook, William R.; Brabson, John M.; Deland, Sharon M.

    2003-12-23

    A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensored facilities that stores or manipulates expensive, dangerous, or controlled materials or information.
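The declared-versus-observed comparison can be sketched as a table-driven state machine; the states and sensor events below are hypothetical illustrations, not taken from the patent:

```python
# Declared process model: (current state, sensor event) -> next state.
# State names and events are hypothetical examples.
DECLARED = {
    ("idle", "door_open"): "loading",
    ("loading", "door_closed"): "processing",
    ("processing", "cycle_done"): "idle",
}

def replay(events, transitions, start="idle"):
    """Drive the declared state machine with observed sensor events;
    any event with no declared transition from the current state is
    flagged as a possible undeclared process."""
    state, undeclared = start, []
    for event in events:
        nxt = transitions.get((state, event))
        if nxt is None:
            undeclared.append((state, event))
        else:
            state = nxt
    return state, undeclared
```

An inspector would compare the flagged pairs against the operator's declaration: an empty list means the sensor record is consistent with the declared operations.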

  7. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

    The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (TALIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. Use of TALIF expanded at NASA Ames and to NASA Johnson's arc jet facility in the late 2000s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the original AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper documents the overall system design from measurement requirements to implementation. Representative data from the redeveloped AHF and IHF LIF systems are also presented.

  9. The Submental Island Flap Is a Viable Reconstructive Option for a Variety of Head and Neck Ablative Defects.

    PubMed

    Barton, Blair M; Riley, Charles A; Pou, Jason D; Hasney, Christian P; Moore, Brian A

    2018-01-01

    The submental island flap (SIF) is a pedicled flap based upon the submental artery and vein. Its utility in reconstruction following ablative head and neck procedures has been applied to various subsites including skin, lip, buccal mucosa, retromolar trigone, parotidectomy defects, and tongue. We review our experience using the SIF for reconstruction following tumor ablation. This prospective case series with medical record review includes consecutive patients undergoing SIF reconstruction following ablative surgery for malignancy at a single tertiary care facility between November 2014 and November 2016. We examined preoperative variables, surgical procedures, and postoperative outcomes. Thirty-seven patients met inclusion criteria. Twenty-nine were male; the average age was 64.3 (±12.4) years. Seventeen cancers involved the oral cavity, 11 involved the skin, 8 were in the oropharynx, and 1 was in the paranasal sinus. The average size of the SIF was 38.8 cm² (±17.6 cm²). Four partial flap losses occurred; none required revision surgery. The average length of stay for these patients was 7.2 (±6.1) days. The SIF is a robust flap that can be reliably used for a variety of head and neck defects following tumor ablation with an acceptable rate of donor- and flap-related complications.

  10. Reconstruction for limited-projection fluorescence molecular tomography based on projected restarted conjugate gradient normal residual.

    PubMed

    Cao, Xu; Zhang, Bin; Liu, Fei; Wang, Xin; Bai, Jing

    2011-12-01

    Limited-projection fluorescence molecular tomography (FMT) can greatly reduce the acquisition time, which is suitable for resolving fast biological processes in vivo, but it suffers from severe ill-posedness because the reconstruction uses only limited projections. To overcome the severe ill-posedness, we report a reconstruction method based on the projected restarted conjugate gradient normal residual. The reconstruction results of two phantom experiments demonstrate that the proposed method is feasible for limited-projection FMT. © 2011 Optical Society of America
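The core solver named in this record, conjugate gradient on the normal residual (CGNR), is a standard algorithm. The sketch below is a pure-Python illustration, with a nonnegativity projection between restarts standing in for the projection step (fluorophore concentrations cannot be negative); the toy matrix is illustrative, not FMT data, and the restart/iteration counts are arbitrary:

```python
def cgnr(A, b, x0=None, iters=50, tol=1e-12):
    """Conjugate gradient applied to the normal equations A^T A x = A^T b."""
    m, n = len(A), len(A[0])
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    Av = lambda v: [dot(row, v) for row in A]                     # A v
    ATv = lambda v: [sum(A[i][j] * v[i] for i in range(m)) for j in range(n)]  # A^T v
    x = list(x0) if x0 is not None else [0.0] * n
    r = [bi - ai for bi, ai in zip(b, Av(x))]   # residual b - A x
    z = ATv(r)                                   # normal residual A^T r
    p = list(z)
    for _ in range(iters):
        w = Av(p)
        denom = dot(w, w)
        if denom == 0.0:
            break
        alpha = dot(z, z) / denom
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * wi for ri, wi in zip(r, w)]
        z_new = ATv(r)
        if dot(z_new, z_new) < tol:
            break
        beta = dot(z_new, z_new) / dot(z, z)
        p = [zi + beta * pi for zi, pi in zip(z_new, p)]
        z = z_new
    return x

def projected_restarted_cgnr(A, b, restarts=5, inner=10):
    """Restart CGNR, projecting onto the nonnegative orthant between runs."""
    x = None
    for _ in range(restarts):
        x = cgnr(A, b, x0=x, iters=inner)
        x = [max(xi, 0.0) for xi in x]  # projection step
    return x

A = [[2.0, 0.0], [0.0, 3.0], [1.0, 1.0]]  # toy overdetermined system
b = [2.0, 6.0, 3.0]                       # consistent with x = [1, 2]
x = projected_restarted_cgnr(A, b)
```

For the severely ill-posed FMT case, the projection and restarts act as regularization; the toy system above merely verifies the solver mechanics.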

  11. Breast Reconstruction with Implants

    MedlinePlus

    ... implants is a complex procedure performed by a plastic surgeon. The breast reconstruction process can start at ... doctor may recommend that you meet with a plastic surgeon. Consult a plastic surgeon who's board certified ...

  12. Shading correction assisted iterative cone-beam CT reconstruction

    NASA Astrophysics Data System (ADS)

    Yang, Chunlin; Wu, Pengwei; Gong, Shutao; Wang, Jing; Lyu, Qihui; Tang, Xiangyang; Niu, Tianye

    2017-11-01

    Recent advances in total variation (TV) technology enable accurate CT image reconstruction from highly under-sampled and noisy projection data. The standard iterative reconstruction algorithms, which work well in conventional CT imaging, fail to perform as expected in cone beam CT (CBCT) applications, wherein the non-ideal physics issues, including scatter and beam hardening, are more severe. These physics issues result in large areas of shading artifacts and degrade the piecewise constant property assumed in reconstructed images. To overcome this obstacle, we incorporate a shading correction scheme into low-dose CBCT reconstruction and propose a clinically acceptable and stable three-dimensional iterative reconstruction method, referred to as shading correction assisted iterative reconstruction. In the proposed method, we modify the TV regularization term by adding a shading compensation image to the reconstructed image to compensate for the shading artifacts, while leaving the data fidelity term intact. This compensation image is generated empirically, using image segmentation and low-pass filtering, and updated during the iterative process whenever necessary. Once the compensation image is determined, the objective function is minimized using the fast iterative shrinkage-thresholding algorithm, accelerated on a graphics processing unit. The proposed method is evaluated using CBCT projection data of the Catphan© 600 phantom and two pelvis patients. Compared with iterative reconstruction without shading correction, the proposed method reduces the overall CT number error from around 200 HU to around 25 HU and increases the spatial uniformity by 20 percent, given the same number of sparsely sampled projections. Unlike existing algorithms, the proposed algorithm incorporates a shading correction scheme into low-dose CBCT reconstruction and achieves a more stable optimization path and a more clinically acceptable reconstructed image. The proposed method does not rely on prior information and is therefore practically attractive for low-dose CBCT imaging in the clinic.

  13. Event Reconstruction in the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano

    2012-12-01

    The PANDA experiment will study collisions of anti-proton beams, with momenta ranging from 2 to 15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The realization of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle identification algorithms are currently implemented using a Bayesian approach and compared to multivariate analysis methods. Moreover, the PANDA data acquisition foresees triggerless operation in which events are not defined by a hardware first-level trigger decision; instead, all signals are stored with time stamps, requiring deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the PANDA spectrometer are reported, focusing on the performance of the tracking system and the results of the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.

  14. From regained function to daily use: experiences of surgical reconstruction of grip in people with tetraplegia.

    PubMed

    Wangdell, Johanna; Carlsson, Gunnel; Friden, Jan

    2014-01-01

    To capture patients' relearning processes from regained function to improvements in daily life after grip reconstructive surgery in tetraplegia. Eleven people with tetraplegia underwent grip reconstructive surgery between February 2009 and March 2011. Qualitative interviews were conducted 7 to 17 months after surgery and analysed using grounded theory. Determination to reach a higher level of independence was the core concept in integrating regained function into daily life. Three phases were identified: "Initiate activity training," "Establish hand control in daily activities," and "Challenge dependence." Between the phases, psychological stages occurred: first, "a belief in improved ability," and later in the process, "confidence in ability." The process of fully integrating regained function into daily life was described as long and time-consuming. However, the participants found it useful to do the skills training in their home environment, without long-term in-clinic rehabilitation. Relearning activities in daily life after a grip reconstruction is a time-consuming and demanding process. It includes skills training, mental strategies and psychological stages, together with environmental and social factors. Accordingly, rehabilitation after grip reconstruction in tetraplegia should focus on both grip skills and psychological stages, so that patients keep their determination and achieve greater independence. Implications for Rehabilitation: There is a stepwise process to transform improved function into daily use. The most important factor in transforming improved function into daily use was motivation to reach a higher level of independence. Other important factors were skills training, use of individual learning strategies, belief and confidence in personal ability, and social and environmental factors. Fully transforming the improved function into daily use was a long and demanding process. The participants preferred to do activity training in the specific environment, usually at home.

  15. Emergency planning and the acute toxic potency of inhaled ammonia.

    PubMed Central

    Michaels, R A

    1999-01-01

    Ammonia is present in agriculture and commerce in many if not most communities. This report evaluates the toxic potency of ammonia based on three types of data: anecdotal data, in some cases predating World War I; reconstructions of contemporary industrial accidents; and animal bioassays. Standards and guidelines for human exposure have been driven largely by the anecdotal data, suggesting that ammonia at 5,000-10,000 parts per million, volume/volume (ppm-v), might be lethal within 5-10 min. However, contemporary accident reconstructions suggest that ammonia lethality requires higher concentrations. For example, 33,737 ppm-v was a 5-min zero-mortality value in a major ammonia release in 1973 in South Africa. Comparisons of secondary reports of ammonia lethality with original sources revealed discrepancies in contemporary sources, apparently resulting from failure to examine old documents or to translate foreign documents accurately. The present investigation revealed that contemporary accident reconstructions yield ammonia lethality levels comparable to those in dozens of reports of animal bioassays, after adjustment of concentrations to human equivalent concentrations via U.S. Environmental Protection Agency (EPA) procedures. Ammonia levels potentially causing irreversible injury, or impairing the ability of exposed people to escape from further exposure or from coincident perils, similarly have been biased downward in contemporary sources. The EPA has identified ammonia as one of 366 extremely hazardous substances subject to community right-to-know provisions of the Superfund Act and emergency planning provisions of the Clean Air Act. The Clean Air Act defines emergency planning zones (EPZs) around industrial facilities exceeding a threshold quantity of ammonia on-site. This study suggests that EPZ areas around ammonia facilities can be reduced, thereby also reducing emergency planning costs, which vary roughly with the square of the EPZ radius.
    PMID:10417358
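The EPA human-equivalent-concentration procedures cited in this record are not reproduced here. One widely used rule for scaling an acutely toxic concentration across exposure durations is the ten Berge relation C^n · t = k; the sketch below applies it to the 5-min zero-mortality value quoted in the abstract. The exponent n is chemical-specific, and n = 2 here is only an illustrative assumption, not the value used in the study:

```python
def scale_duration(c1_ppm, t1_min, t2_min, n=2.0):
    """Scale a concentration to an equal toxic load at a different duration
    using the ten Berge relation C^n * t = k.
    n is chemical-specific; n = 2 is an illustrative assumption."""
    k = c1_ppm ** n * t1_min          # toxic load of the reference exposure
    return (k / t2_min) ** (1.0 / n)  # concentration giving the same load

# The 5-min value cited in the abstract, scaled to a 10-min exposure:
c10 = scale_duration(33737.0, 5.0, 10.0)
```

The squared dependence of planning cost on EPZ radius mentioned in the abstract is unrelated to this scaling; the example only shows why a single concentration value is meaningless without its exposure duration.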

  16. Longitudinal phase space tomography using a booster cavity at PITZ

    NASA Astrophysics Data System (ADS)

    Malyutin, D.; Gross, M.; Isaev, I.; Khojoyan, M.; Kourkafas, G.; Krasilnikov, M.; Marchetti, B.; Otevrel, M.; Stephan, F.; Vashchenko, G.

    2017-11-01

    The knowledge of the longitudinal phase space (LPS) of electron beams is of great importance for optimizing the performance of high brightness photo injectors. To get the longitudinal phase space of an electron bunch in a linear accelerator a tomographic technique can be used. The method is based on measurements of the bunch momentum spectra while varying the bunch energy chirp. The energy chirp can be varied by one of the RF accelerating structures in the accelerator and the resulting momentum distribution can be measured with a dipole spectrometer further downstream. As a result, the longitudinal phase space can be reconstructed. Application of the tomographic technique for reconstruction of the longitudinal phase space is introduced in detail in this paper. Measurement results from the PITZ facility are shown and analyzed.

  17. Proposal for a new categorization of aseptic processing facilities based on risk assessment scores.

    PubMed

    Katayama, Hirohito; Toda, Atsushi; Tokunaga, Yuji; Katoh, Shigeo

    2008-01-01

    Risk assessment of aseptic processing facilities was performed using two published risk assessment tools. Calculated risk scores were compared with experimental test results, including environmental monitoring and media fill run results, in three different types of facilities. The two risk assessment tools used gave a generally similar outcome. However, depending on the tool used, variations were observed in the relative scores between the facilities. For the facility yielding the lowest risk scores, the corresponding experimental test results showed no contamination, indicating that these ordinary testing methods are insufficient to evaluate this kind of facility. A conventional facility having acceptable aseptic processing lines gave relatively high risk scores. The facility showing a rather high risk score demonstrated the usefulness of conventional microbiological test methods. Considering the significant gaps observed in calculated risk scores and in the ordinary microbiological test results between advanced and conventional facilities, we propose a facility categorization based on risk assessment. The most important risk factor in aseptic processing is human intervention. When human intervention is eliminated from the process by advanced hardware design, the aseptic processing facility can be classified into a new risk category that is better suited for assuring sterility based on a new set of criteria rather than on currently used microbiological analysis. To fully benefit from advanced technologies, we propose three risk categories for these aseptic facilities.

  18. 77 FR 823 - Guidance for Fuel Cycle Facility Change Processes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-06

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0262] Guidance for Fuel Cycle Facility Change Processes... Fuel Cycle Facility Change Processes.'' This regulatory guide describes the types of changes for which fuel cycle facility licensees should seek prior approval from the NRC and discusses how licensees can...

  19. CLARA: CLAS12 Reconstruction and Analysis Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analyses (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  20. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but also play an influential role in the assessment of model fit. In the present article, I consider a global time-dependent birth-death process in which each species has the same rates, although rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to efficiently simulate reconstructed phylogenetic trees when conditioning on the number of species, the time of the process, or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
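The i.i.d. property of speciation times makes simulation by inverse-transform sampling straightforward. The sketch below covers only the constant-rate pure-birth (Yule) special case conditioned on tree age, where the speciation times, measured backward from the present, follow a truncated exponential; the general time-dependent rates of the record would require the paper's full CDF, and the parameter values here are arbitrary:

```python
import math
import random

def simulate_yule_speciation_times(n, T, lam, rng=None):
    """Draw the n-1 speciation times of a reconstructed Yule tree of age T
    with speciation rate lam, conditioned on n extant species.
    Constant-rate pure-birth special case only; times are measured as time
    before the present and sampled i.i.d. by inverse transform."""
    rng = rng or random.Random(1)
    z = 1.0 - math.exp(-lam * T)  # truncation normaliser on [0, T]
    times = [-math.log(1.0 - rng.random() * z) / lam for _ in range(n - 1)]
    return sorted(times)

ts = simulate_yule_speciation_times(10, 5.0, 0.8)
```

Because each draw is independent, conditioning on n, on T, or on both never requires simulating and discarding whole trees, which is the efficiency gain the record describes.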

  1. Construction and Reconstruction of Identity through Biographical Learning. The Role of Language and Culture.

    ERIC Educational Resources Information Center

    Bron, Agnieszka

    The process by which people construct and reconstruct their identities when they face cultural changes resulting from education, learning, or moving to another culture was examined through a study of narratives from immigrants to Sweden and students who were in the process of learning to become researchers at a well-established university, as well…

  2. K-space data processing for magnetic resonance elastography (MRE).

    PubMed

    Corbin, Nadège; Breton, Elodie; de Mathelin, Michel; Vappou, Jonathan

    2017-04-01

    Magnetic resonance elastography (MRE) requires substantial data processing based on phase image reconstruction, wave enhancement, and inverse problem solving. The objective of this study is to propose a new, fast MRE method based on MR raw data processing, particularly adapted to applications requiring fast MRE measurement or high elastogram update rate. The proposed method allows measuring tissue elasticity directly from raw data without prior phase image reconstruction and without phase unwrapping. Experimental feasibility is assessed both in a gelatin phantom and in the liver of a porcine model in vivo. Elastograms are reconstructed with the raw MRE method and compared to those obtained using conventional MRE. In a third experiment, changes in elasticity are monitored in real-time in a gelatin phantom during its solidification by using both conventional MRE and raw MRE. The raw MRE method shows promising results by providing similar elasticity values to the ones obtained with conventional MRE methods while decreasing the number of processing steps and circumventing the delicate step of phase unwrapping. Limitations of the proposed method are the influence of the magnitude on the elastogram and the requirement for a minimum number of phase offsets. This study demonstrates the feasibility of directly reconstructing elastograms from raw data.

  3. Learning Lives of North Korean Young Defectors: A Preliminary Study of Reconstructing Identity in Career Development

    ERIC Educational Resources Information Center

    Park, Hyewon; Kim, Junghwan; Schied, Fred M.

    2015-01-01

    This study of eleven young North Korean Defectors (NKDs) examines how they engage in daily learning focusing on the process of identity reconstruction through their attempt to engage in career development activities. For the purposes of this paper one case was selected to illustrate how a reconstructed identity is learned. The main research…

  4. Soil life in reconstructed ecosystems: initial soil food web responses after rebuilding a forest soil profile for a climate change experiment

    Treesearch

    Paul T. Rygiewicz; Vicente J. Monleon; Elaine R. Ingham; Kendall J. Martin; Mark G. Johnson

    2010-01-01

    Disrupting ecosystem components, while transferring and reconstructing them for experiments can produce myriad responses. Establishing the extent of these biological responses as the system approaches a new equilibrium allows us more reliably to emulate comparable native systems. That is, the sensitivity of analyzing ecosystem processes in a reconstructed system is...

  5. Mars Entry Atmospheric Data System Trajectory Reconstruction Algorithms and Flight Results

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.; Kutty, Prasad; Schoenenberger, Mark; Shidner, Jeremy; Munk, Michelle

    2013-01-01

    The Mars Entry Atmospheric Data System is a part of the Mars Science Laboratory Entry, Descent, and Landing Instrumentation project. These sensors are a system of seven pressure transducers linked to ports on the entry vehicle forebody to record the pressure distribution during atmospheric entry. These measured surface pressures are used to generate estimates of atmospheric quantities based on modeled surface pressure distributions. Specifically, angle of attack, angle of sideslip, dynamic pressure, Mach number, and freestream atmospheric properties are reconstructed from the measured pressures. Such data allow the aerodynamics to be decoupled from the assumed atmospheric properties, enabling enhanced trajectory reconstruction and performance analysis as well as an aerodynamic reconstruction, which has not been possible in past Mars entry reconstructions. This paper provides details of the data processing algorithms that are utilized for this purpose. The data processing algorithms include two approaches that have commonly been utilized in past planetary entry trajectory reconstruction, and a new approach for this application that makes use of the pressure measurements. The paper describes assessments of data quality and preprocessing, and results of the flight data reduction from atmospheric entry, which occurred on August 5, 2012.
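The idea of inverting a modeled surface pressure distribution for flow angles can be sketched with a toy example. Everything below is hypothetical: the port layout, the modified-Newtonian-style pressure model, and the brute-force grid search are illustrative stand-ins, not the MEADS port geometry or estimation algorithms described in the record:

```python
import math

PORT_ANGLES = [-60.0, -30.0, 0.0, 30.0, 60.0]  # hypothetical port layout, deg

def model_pressures(q, alpha):
    """Illustrative Newtonian-style model: p_i = q * cos^2(theta_i - alpha),
    for ports at angles theta_i and angle of attack alpha (degrees)."""
    return [q * math.cos(math.radians(t - alpha)) ** 2 for t in PORT_ANGLES]

def estimate(measured):
    """Grid search for the (q, alpha) pair whose modeled port pressures
    best match the measurements in a least-squares sense."""
    best = None
    for a10 in range(-100, 101):            # alpha: -10..10 deg, 0.1 deg steps
        alpha = a10 / 10.0
        for q in range(1800, 2210, 10):     # dynamic pressure grid, Pa
            sse = sum((m - p) ** 2
                      for m, p in zip(measured, model_pressures(q, alpha)))
            if best is None or sse < best[0]:
                best = (sse, q, alpha)
    return best[1], best[2]

# Recover the parameters from noiseless synthetic measurements:
q_hat, alpha_hat = estimate(model_pressures(2000.0, 4.0))
```

A flight implementation would use calibrated pressure models and a proper estimator rather than a grid search, but the decoupling point of the abstract is visible: dynamic pressure and flow angles come from the pressures alone, with no assumed atmosphere.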

  6. Reconstruction of network topology using status-time-series data

    NASA Astrophysics Data System (ADS)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. This dependency between the diffusion dynamics and the structure of the network can be exploited to retrieve the connection pattern from the diffusion data, and information about the network structure can in turn help devise control of the dynamics on the network. In this paper, we consider the problem of network reconstruction from the available STS data using matrix analysis. The proposed method of network reconstruction is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method. Our proposed method outperforms the compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure is applied to weighted networks, where the ordering of the edges is identified with high accuracy.
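The structure-dynamics dependency the record exploits can be demonstrated with a much simpler heuristic than the paper's matrix analysis: when a node newly becomes infected, every node already infected at the previous step is a candidate infector, so co-occurrence counts accumulate on true edges. The cascades below are hand-crafted on a hypothetical star network; this sketch is a didactic stand-in, not the proposed method:

```python
from itertools import combinations

def score_edges(sts):
    """Score candidate edges from status-time-series snapshots (one list of
    0/1 node states per time step): each new infection of node i credits
    every node infected at the previous step as a possible infector of i."""
    n = len(sts[0])
    scores = {frozenset(e): 0 for e in combinations(range(n), 2)}
    for prev, cur in zip(sts, sts[1:]):
        infected_before = [j for j in range(n) if prev[j]]
        for i in range(n):
            if cur[i] and not prev[i]:            # new infection of node i
                for j in infected_before:
                    scores[frozenset((i, j))] += 1
    return scores

# Three hypothetical cascades on a star network (hub 0, leaves 1-3):
cascades = [
    [[0, 1, 0, 0], [1, 1, 0, 0], [1, 1, 1, 1]],
    [[0, 0, 1, 0], [1, 0, 1, 0], [1, 1, 1, 1]],
    [[0, 0, 0, 1], [1, 0, 0, 1], [1, 1, 1, 1]],
]
total = {}
for c in cascades:
    for e, s in score_edges(c).items():
        total[e] = total.get(e, 0) + s
top3 = sorted(total, key=total.get, reverse=True)[:3]
# the three true hub edges {0,1}, {0,2}, {0,3} receive the highest scores
```

With more cascades the score gap between true and spurious edges widens; the paper's matrix-analysis formulation makes this inference exact and extends it to weighted networks.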

  7. Convex Accelerated Maximum Entropy Reconstruction

    PubMed Central

    Worley, Bradley

    2016-01-01

    Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm – called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm – is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra. PMID:26894476

  8. Reclaiming and Reshaping Life: Patterns of Reconstruction After the Suicide of a Loved One.

    PubMed

    Castelli Dransart, Dolores Angela

    2017-06-01

    The objective of this study is to identify patterns (components and processes) of reconstruction among suicide survivors. In-depth interviews were conducted with 50 survivors of suicide in Switzerland. Data were analyzed using ATLAS.ti and according to Grounded Theory principles. Survivors of suicide face four major challenges: dealing with the impact of the suicide, searching for meaning, clarifying responsibility, and finding a personal style of reaction and coping. The various ways in which survivors fare through the specific processes of these challenges result in four patterns of reconstruction: vulnerability, transformation, commitment, and hard blow. The unique characteristics and dynamics of each are highlighted. Health care providers would benefit from an approach based on the dynamics of the various patterns of reconstruction in providing appropriate support to survivors of suicide.

  9. 3D Representative Volume Element Reconstruction of Fiber Composites via Orientation Tensor and Substructure Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yi; Chen, Wei; Xu, Hongyi

    To provide a seamless integration of manufacturing processing simulation and fiber microstructure modeling, two new stochastic 3D microstructure reconstruction methods are proposed for two types of random fiber composites: random short fiber composites, and Sheet Molding Compounds (SMC) chopped fiber composites. A Random Sequential Adsorption (RSA) algorithm is first developed to embed statistical orientation information into 3D RVE reconstruction of random short fiber composites. For the SMC composites, an optimized Voronoi diagram based approach is developed for capturing the substructure features of SMC chopped fiber composites. The proposed methods are distinguished from other reconstruction works by providing a way of integrating statistical information (fiber orientation tensor) obtained from material processing simulation, as well as capturing the multiscale substructures of the SMC composites.

  10. Data preparation and evaluation techniques for x-ray diffraction microscopy.

    PubMed

    Steinbrener, Jan; Nelson, Johanna; Huang, Xiaojing; Marchesini, Stefano; Shapiro, David; Turner, Joshua J; Jacobsen, Chris

    2010-08-30

    The post-experiment processing of X-ray Diffraction Microscopy data is often time-consuming and difficult. This is mostly due to the fact that even if a preliminary result has been reconstructed, there is no definitive answer as to whether or not a better result with more consistently retrieved phases can still be obtained. We show here that the first step in data analysis, the assembly of two-dimensional diffraction patterns from a large set of raw diffraction data, is crucial to obtaining reconstructions of highest possible consistency. We have developed software that automates this process and results in consistently accurate diffraction patterns. We have furthermore derived some criteria of validity for a tool commonly used to assess the consistency of reconstructions, the phase retrieval transfer function, and suggest a modified version that has improved utility for judging reconstruction quality.

  11. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The capabilities of the Spacelab Data Processing Facility (SPDPF) are highlighted. The capturing, quality monitoring, processing, accounting, and forwarding of vital Spacelab data to various user facilities around the world are described.

  12. Plenoptic particle image velocimetry with multiple plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Fahringer, Timothy W.; Thurow, Brian S.

    2018-07-01

    Plenoptic particle image velocimetry was recently introduced as a viable three-dimensional, three-component velocimetry technique based on light field cameras. One of the main benefits of this technique is its single camera configuration, which allows the technique to be applied in facilities with limited optical access. The main drawback of this configuration is decreased accuracy in the out-of-plane dimension. This work presents a solution: the addition of a second plenoptic camera in a stereo-like configuration. A framework for reconstructing volumes with multiple plenoptic cameras is presented, including the volumetric calibration procedure and the reconstruction algorithms: integral refocusing, filtered refocusing, multiplicative refocusing, and MART. It is shown that the addition of a second camera improves the reconstruction quality and removes the 'cigar'-like elongation associated with the single camera system. In addition, it is found that adding a third camera provides minimal improvement. Further metrics of the reconstruction quality are quantified in terms of reconstruction algorithm, particle density, number of cameras, camera separation angle, voxel size, and the effect of common image noise sources. In addition, a synthetic Gaussian ring vortex is used to compare the accuracy of the single and two camera configurations. It was determined that the addition of a second camera reduces the RMSE velocity error from 1.0 to 0.1 voxels in depth and from 0.2 to 0.1 voxels in the lateral spatial directions. Finally, the technique is applied experimentally on a ring vortex, and comparisons are drawn among the four presented reconstruction algorithms: MART and multiplicative refocusing produced the cleanest vortex structure and had the least shot-to-shot variability, filtered refocusing was able to produce the desired structure, albeit with more noise and variability, and integral refocusing struggled to produce a coherent vortex ring.

  13. Tensor-based Dictionary Learning for Dynamic Tomographic Reconstruction

    PubMed Central

    Tan, Shengqi; Zhang, Yanbo; Wang, Ge; Mou, Xuanqin; Cao, Guohua; Wu, Zhifang; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from few-view projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples, including a sheep lung perfusion study and a dynamic mouse cardiac imaging study, demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. PMID:25779991

  14. Low dose reconstruction algorithm for differential phase contrast imaging.

    PubMed

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Stampanoni, Marco

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refraction index, rather than the attenuation coefficient, in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT that draws on compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing problem of DPCI reconstruction can be transformed into the already-solved counterpart in transmission-imaging CT. Our algorithm has the potential to reconstruct the refraction index distribution of the sample from highly undersampled projection data, and thus can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.

  15. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    PubMed

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several gigavoxels), this computational burden has so far prevented their breakthrough in practice. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction approach, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphics processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.
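
    The memory-for-resolution trade can be illustrated with a coarse-to-fine warm start: solve a half-resolution problem first, then refine on the full grid. A hedged 1-D sketch, with Landweber iteration standing in for the iterative reconstruction (all names and the toy problem are illustrative, not the paper's method):

```python
import numpy as np

def landweber(A, b, x0, n_iter=1000, step=None):
    """Simple iterative least-squares solver (Landweber iteration)."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = x0.copy()
    for _ in range(n_iter):
        x += step * A.T @ (b - A @ x)
    return x

def upsample(x):
    """Nearest-neighbour upsampling of a 1-D coarse solution (factor 2)."""
    return np.repeat(x, 2)

# Fine problem: recover a length-8 signal from 16 linear measurements.
rng = np.random.default_rng(0)
x_true = np.repeat([1.0, 4.0, 2.0, 3.0], 2)      # piecewise-constant signal
A_fine = rng.standard_normal((16, 8))
b = A_fine @ x_true

# Coarse problem: pairs of fine voxels share one unknown -> half the memory.
P = np.kron(np.eye(4), np.ones((2, 1)))          # prolongation (8 x 4)
A_coarse = A_fine @ P
x_coarse = landweber(A_coarse, b, np.zeros(4))

# The upsampled coarse result warm-starts a short fine-grid refinement.
x_fine = landweber(A_fine, b, upsample(x_coarse), n_iter=200)
```

    Only the coarse system needs to be held at full iteration count; the fine grid is touched briefly, which is the memory argument in miniature.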

  16. Methodology for Image-Based Reconstruction of Ventricular Geometry for Patient-Specific Modeling of Cardiac Electrophysiology

    PubMed Central

    Prakosa, A.; Malamas, P.; Zhang, S.; Pashakhanloo, F.; Arevalo, H.; Herzka, D. A.; Lardo, A.; Halperin, H.; McVeigh, E.; Trayanova, N.; Vadakkumpadan, F.

    2014-01-01

    Patient-specific modeling of ventricular electrophysiology requires an interpolated reconstruction of the 3-dimensional (3D) geometry of the patient ventricles from the low-resolution (Lo-res) clinical images. The goal of this study was to implement a processing pipeline for obtaining the interpolated reconstruction, and thoroughly evaluate the efficacy of this pipeline in comparison with alternative methods. The pipeline implemented here involves contouring the epi- and endocardial boundaries in Lo-res images, interpolating the contours using the variational implicit functions method, and merging the interpolation results to obtain the ventricular reconstruction. Five alternative interpolation methods, namely linear, cubic spline, spherical harmonics, cylindrical harmonics, and shape-based interpolation were implemented for comparison. In the thorough evaluation of the processing pipeline, Hi-res magnetic resonance (MR), computed tomography (CT), and diffusion tensor (DT) MR images from numerous hearts were used. Reconstructions obtained from the Hi-res images were compared with the reconstructions computed by each of the interpolation methods from a sparse sample of the Hi-res contours, which mimicked Lo-res clinical images. Qualitative and quantitative comparison of these ventricular geometry reconstructions showed that the variational implicit functions approach performed better than others. Additionally, the outcomes of electrophysiological simulations (sinus rhythm activation maps and pseudo-ECGs) conducted using models based on the various reconstructions were compared. These electrophysiological simulations demonstrated that our implementation of the variational implicit functions-based method had the best accuracy. PMID:25148771

  17. Comparative assessment of pressure field reconstructions from particle image velocimetry measurements and Lagrangian particle tracking

    NASA Astrophysics Data System (ADS)

    van Gent, P. L.; Michaelis, D.; van Oudheusden, B. W.; Weiss, P.-É.; de Kat, R.; Laskari, A.; Jeon, Y. J.; David, L.; Schanz, D.; Huhn, F.; Gesemann, S.; Novara, M.; McPhaden, C.; Neeteson, N. J.; Rival, D. E.; Schneiders, J. F. G.; Schrijer, F. F. J.

    2017-04-01

    A test case for pressure field reconstruction from particle image velocimetry (PIV) and Lagrangian particle tracking (LPT) has been developed by constructing a simulated experiment from a zonal detached eddy simulation for an axisymmetric base flow at Mach 0.7. The test case comprises sequences of four subsequent particle images (representing multi-pulse data) as well as continuous time-resolved data which can realistically only be obtained for low-speed flows. Particle images were processed using tomographic PIV processing as well as the LPT algorithm `Shake-The-Box' (STB). Multiple pressure field reconstruction techniques have subsequently been applied to the PIV results (Eulerian approach, iterative least-square pseudo-tracking, Taylor's hypothesis approach, and instantaneous Vortex-in-Cell) and LPT results (FlowFit, Vortex-in-Cell-plus, Voronoi-based pressure evaluation, and iterative least-square pseudo-tracking). All methods were able to reconstruct the main features of the instantaneous pressure fields, including methods that reconstruct pressure from a single PIV velocity snapshot. Highly accurate reconstructed pressure fields could be obtained using LPT approaches in combination with more advanced techniques. In general, the use of longer series of time-resolved input data, when available, allows more accurate pressure field reconstruction. Noise in the input data typically reduces the accuracy of the reconstructed pressure fields, but none of the techniques proved to be critically sensitive to the amount of noise added in the present test case.
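
    The Eulerian approach mentioned above can be illustrated in one dimension: evaluate the pressure gradient from the momentum equation at every grid point, then integrate spatially from a reference location. A minimal sketch for a steady, inviscid 1-D flow (a stand-in for the full 3-D procedure; the profile and names are illustrative):

```python
import numpy as np

# Hypothetical steady 1-D velocity profile sampled on a PIV-like grid.
rho = 1.0
x = np.linspace(0.0, 1.0, 200)
u = 1.0 + 0.5 * np.sin(2 * np.pi * x)

# Eulerian step: dp/dx = -rho * u * du/dx from the steady momentum equation.
dudx = np.gradient(u, x, edge_order=2)
dpdx = -rho * u * dudx

# Spatial integration (trapezoidal) from a point of known pressure p(0) = 0.
p = np.concatenate([[0.0],
                    np.cumsum(0.5 * (dpdx[1:] + dpdx[:-1]) * np.diff(x))])

# For this steady flow, Bernoulli gives the exact answer for comparison.
p_exact = 0.5 * rho * (u[0] ** 2 - u ** 2)
```

    In two or three dimensions the same gradient field is usually integrated by solving a pressure Poisson equation instead of a single line integral.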

  18. Patient Information Needs and Breast Reconstruction After Mastectomy: A Qualitative Meta-Synthesis.

    PubMed

    Carr, Tracey L; Groot, Gary; Cochran, David; Holtslander, Lorraine

    2018-04-27

    Although many women benefit from breast reconstruction after mastectomy, several studies report women's dissatisfaction with the level of information they were provided with before reconstruction. The present meta-synthesis examines the qualitative literature that explores women's experiences of breast reconstruction after mastectomy and highlights women's healthcare information needs. After a comprehensive search of 6 electronic databases (CINAHL, Cochrane Library, EMBASE, MEDLINE, PsycINFO, and Scopus), we followed the methodology for synthesizing qualitative research. The search produced 423 studies, which were assessed against 5 inclusion criteria. A meta-synthesis methodology was used to analyze the data through taxonomic classification and constant targeted comparison. Some 17 studies met the inclusion criteria, and findings from 16 studies were synthesized. The role of the healthcare practitioner is noted as a major influence on women's expectations, and in some instances, women did not feel adequately informed about the outcomes of surgery and the recovery process. In general, women's desire for normality and effective emotional coping shapes their information needs. The information needs of women are better understood after considering women's actual experiences with breast reconstruction. It is important to inform women of the immediate outcomes of reconstruction surgery and the recovery process. In an attempt to better address women's information needs, healthcare practitioners should discover women's initial expectations of reconstruction as a starting point in the consultation. In addition, the research revealed the importance of the nurse navigator in terms of assisting women through the recovery process.

  19. Online Event Reconstruction in the CBM Experiment at FAIR

    NASA Astrophysics Data System (ADS)

    Akishina, Valentina; Kisel, Ivan

    2018-02-01

    Targeting rare observables, the CBM experiment will operate at high interaction rates of up to 10 MHz, unprecedented in heavy-ion experiments so far. This requires a novel free-streaming readout system and a new concept of data processing. The huge data rates of the CBM experiment will be reduced online to a recordable rate before the data are saved to mass storage. Full collision reconstruction and selection will be performed online in a dedicated processor farm. In order to make an efficient event selection online, a clean sample of particles has to be provided by the reconstruction package, called First Level Event Selection (FLES). The FLES reconstruction and selection package consists of several modules: track finding, track fitting, event building, short-lived particle finding, and event selection. Since detector measurements also contain time information, event building is done at all stages of the reconstruction process. The input data are distributed within the FLES farm in the form of time-slices. A time-slice is reconstructed in parallel across processor cores. After all tracks of the whole time-slice are found and fitted, they are collected into clusters of tracks originating from common primary vertices, which are then fitted, thus identifying the interaction points. Secondary tracks are associated with primary vertices according to their estimated production time. After that, short-lived particles are found and the full event building process is finished. The last stage of the FLES package is a selection of events according to the requested trigger signatures. The event reconstruction procedure and the results of its application to simulated collisions in the CBM detector setup are presented and discussed in detail.

  20. Joint MR-PET reconstruction using a multi-channel image regularizer

    PubMed Central

    Koesters, Thomas; Otazo, Ricardo; Bredies, Kristian; Sodickson, Daniel K

    2016-01-01

    While current state-of-the-art MR-PET scanners enable simultaneous MR and PET measurements, the acquired data sets are still usually reconstructed separately. We propose a new multi-modality reconstruction framework using second-order Total Generalized Variation (TGV) as a dedicated multi-channel regularization functional that jointly reconstructs images from both modalities. In this way, information about the underlying anatomy is shared during the image reconstruction process while unique differences are preserved. Results from numerical simulations and in-vivo experiments using a range of accelerated MR acquisitions and different MR image contrasts demonstrate improved PET image quality, resolution, and quantitative accuracy. PMID:28055827

  1. The PRISM (Pliocene Palaeoclimate) reconstruction: Time for a paradigm shift

    USGS Publications Warehouse

    Dowsett, Harry J.; Robinson, Marci M.; Stoll, Danielle K.; Foley, Kevin M.; Johnson, Andrew L. A.; Williams, Mark; Riesselman, Christina

    2013-01-01

    Global palaeoclimate reconstructions have been invaluable to our understanding of the causes and effects of climate change, but single-temperature representations of the oceanic mixed layer for data–model comparisons are outdated, and the time for a paradigm shift in marine palaeoclimate reconstruction is overdue. The new paradigm in marine palaeoclimate reconstruction stems the loss of valuable climate information and instead presents a holistic and nuanced interpretation of multi-dimensional oceanographic processes and responses. A wealth of environmental information is hidden within the US Geological Survey's Pliocene Research, Interpretation and Synoptic Mapping (PRISM) marine palaeoclimate reconstruction, and we introduce here a plan to incorporate all valuable climate data into the next generation of PRISM products. Beyond the global approach and focus, we plan to incorporate regional climate dynamics with emphasis on processes, integrating multiple environmental proxies wherever available in order to better characterize the mixed layer, and developing a finer time slice within the Mid-Piacenzian Age of the Pliocene, complemented by underused proxies that offer snapshots into environmental conditions. The result will be a proxy-rich, temporally nested, process-oriented approach in a digital format: a relational database with geographic information system capabilities comprising a three-dimensional grid representing the surface layer, with a plethora of data in each cell.

  2. Single-exposure color digital holography

    NASA Astrophysics Data System (ADS)

    Feng, Shaotong; Wang, Yanhui; Zhu, Zhuqing; Nie, Shouping

    2010-11-01

    In this paper, we report a method for color image reconstruction by recording only one single multi-wavelength hologram. In the recording process, three lasers of different wavelengths emitting in the red, green, and blue regions illuminate the object, and the object diffraction fields arrive at the hologram plane simultaneously. Three reference beams with different spatial angles interfere with the corresponding object diffraction fields on the hologram plane, respectively. Finally, a series of sub-holograms incoherently overlap on the CCD and are recorded as a single multi-wavelength hologram. Angular division multiplexing is applied to the reference beams so that the spatial spectra of the multiple recordings are separated in the Fourier plane. In the reconstruction process, the multi-wavelength hologram is Fourier transformed into its Fourier plane, where the spatial spectra of the different wavelengths are separated and can easily be extracted by frequency filtering. The extracted spectra are used to reconstruct the corresponding monochromatic complex amplitudes, which are synthesized to reconstruct the color image. The single-exposure recording technique is convenient for real-time image processing applications. However, the quality of the reconstructed images is affected by speckle noise, and improving image quality requires further research.
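
    The demultiplexing step described above — separating the angularly multiplexed recordings in the Fourier plane — can be sketched in one dimension, with cosine carriers standing in for the tilted reference beams (a simplified illustration, not the authors' implementation; all names are hypothetical):

```python
import numpy as np

# Three channel signals, each modulated onto a distinct spatial carrier
# (the role played by the angularly separated reference beams).
N = 256
x = np.arange(N)
signals = [np.cos(2 * np.pi * x / 64),        # "red" channel content
           np.sin(2 * np.pi * x / 32),        # "green"
           np.ones(N) * 0.5]                  # "blue"
carriers = [40, 80, 110]                      # distinct carrier frequencies
hologram = sum(s * np.cos(2 * np.pi * f * x / N)
               for s, f in zip(signals, carriers))

# Demultiplex: FFT, isolate the sideband around each carrier, inverse FFT.
H = np.fft.fft(hologram)
recovered = []
for f in carriers:
    mask = np.zeros(N)
    mask[f - 10: f + 11] = 1.0                # band-pass window around carrier
    side = np.fft.ifft(H * mask)
    # Demodulate to baseband; the factor 2 compensates the split sideband.
    recovered.append(2 * np.real(side * np.exp(-2j * np.pi * f * x / N)))
```

    As long as the carriers are spaced further apart than the channel bandwidths, each channel comes back cleanly, which is exactly the condition the angular multiplexing must satisfy.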

  3. Status of the MIND simulation and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.

    2010-03-30

    A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. Here described is the status of a modular software framework being developed to accommodate such a study. The results of initial studies of the reconstruction software and expected efficiency curves in the context of the golden channel are given.

  4. Trajectory measurements and correlations in the final focus beam line at the KEK Accelerator Test Facility

    NASA Astrophysics Data System (ADS)

    Renier, Y.; Bambade, P.; Tauchi, T.; White, G. R.; Boogert, S.

    2013-06-01

    The Accelerator Test Facility 2 (ATF2) commissioning group aims to demonstrate the feasibility of the beam delivery system of the next linear colliders (ILC and CLIC) as well as to define and test the tuning methods. As the design vertical beam sizes of the linear colliders are a few nanometers, the stability of the trajectory as well as the control of aberrations are critical. ATF2 commissioning started in December 2008, and thanks to submicron-resolution beam position monitors (BPMs), it has been possible to measure the beam position fluctuation along the final focus of ATF2 during the 2009 runs. The optics was not yet nominal, with weaker focusing to ease tuning. In this paper, a method to measure the noise of each BPM on every pulse, in a model-independent way, is presented. A method to reconstruct the trajectory's fluctuations is developed which uses the previously determined BPM resolution. As this reconstruction provides a measurement of the beam energy fluctuations, it was also possible to measure the horizontal and vertical dispersion functions at each BPM parasitically. The spatial and angular dispersions can be fitted from these measurements with uncertainties comparable with those of standard measurements.

  5. Event reconstruction for the CBM-RICH prototype beamtest data in 2014

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, J.; Akishin, P.; Becker, K.-H.; Belogurov, S.; Bendarouach, J.; Boldyreva, N.; Deveaux, C.; Dobyrn, V.; Dürr, M.; Eschke, J.; Förtsch, J.; Heep, J.; Höhne, C.; Kampert, K.-H.; Kochenda, L.; Kopfer, J.; Kravtsov, P.; Kres, I.; Lebedev, S.; Lebedeva, E.; Leonova, E.; Linev, S.; Mahmoud, T.; Michel, J.; Miftakhov, N.; Niebur, W.; Ovcharenko, E.; Patel, V.; Pauly, C.; Pfeifer, D.; Querchfeld, S.; Rautenberg, J.; Reinecke, S.; Riabov, Y.; Roshchin, E.; Samsonov, V.; Schetinin, V.; Tarasenkova, O.; Traxler, M.; Ugur, C.; Vznuzdaev, E.; Vznuzdaev, M.

    2017-12-01

    The Compressed Baryonic Matter (CBM) experiment at the future FAIR facility will investigate the QCD phase diagram at high net baryon densities and moderate temperatures in A+A collisions from 2 to 11 AGeV (SIS100). Electron identification in CBM will be performed by a Ring Imaging Cherenkov (RICH) detector and Transition Radiation Detectors (TRD). A real-size prototype of the RICH detector was tested together with other CBM groups at the CERN PS/T9 beam line in 2014. For the first time, the data format used the FLESnet protocol from CBM, delivering free-streaming data. The analysis was fully performed within the CBMROOT framework. In this contribution, the data analysis and event reconstruction methods used for the obtained data are discussed. Rings were reconstructed using an algorithm based on the Hough transform method, and their parameters were derived with high accuracy by circle and ellipse fitting procedures. We present results of applying these algorithms; in particular, we compare results with and without wavelength-shifting (WLS) coating.
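
    Hough-transform ring finding can be sketched for the simplest case of a known radius: every hit votes for all candidate centres at that radius, and the accumulator maximum locates the ring. A minimal illustration (the experiment's algorithm additionally handles unknown radii and refines parameters by circle and ellipse fits; the names here are illustrative):

```python
import numpy as np

def hough_circle_center(points, radius, grid, n_theta=360):
    """Locate a circle of known radius by Hough voting over candidate centres.

    points : (n, 2) array of hit coordinates lying on the ring
    radius : assumed ring radius
    grid   : size of the square accumulator (grid x grid)
    """
    acc = np.zeros((grid, grid), dtype=int)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    for x, y in points:
        # Every hit votes for all centres at distance `radius` from it.
        cx = np.rint(x - radius * np.cos(thetas)).astype(int)
        cy = np.rint(y - radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < grid) & (cy >= 0) & (cy < grid)
        np.add.at(acc, (cx[ok], cy[ok]), 1)
    # The true centre accumulates one vote from every hit.
    return np.unravel_index(np.argmax(acc), acc.shape)

# Synthetic ring: 40 hits on a circle of radius 10 centred at (25, 30).
ang = np.linspace(0, 2 * np.pi, 40, endpoint=False)
hits = np.stack([25 + 10 * np.cos(ang), 30 + 10 * np.sin(ang)], axis=1)
center = hough_circle_center(hits, radius=10, grid=64)
```
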

  6. 3D reconstruction of nuclear reactions using GEM TPC with planar readout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihałowicz, Jan Stefan

    2015-02-24

    The research program of the Extreme Light Infrastructure – Nuclear Physics (ELI-NP) laboratory under construction in Magurele, Romania, calls for the development of a gaseous active-target detector providing 3D reconstruction of charged products of nuclear reactions induced by a gamma beam. The monoenergetic, high-energy (Eγ > 19 MeV) gamma beam of intensity 10^13 γ/s allows studying nuclear reactions relevant to astrophysics. A Time Projection Chamber with crossed-strip readout (eTPC) is proposed as one of the imaging detectors. The special feature of the readout electrode structure is a 2D reconstruction based on information read out simultaneously from three arrays of strips that form virtual pixels. It is expected to reach spatial resolution similar to that of pixel readout at a largely reduced cost of electronics. The paper presents the current progress and first results of the small-scale prototype TPC, which is one of the implementation steps towards the eTPC detector proposed in the Technical Design Report of Charged Particles Detection at ELI-NP.

  7. Bayesian Inference for Source Reconstruction: A Real-World Application

    PubMed Central

    Yee, Eugene; Hoffman, Ian; Ungar, Kurt

    2014-01-01

    This paper applies a Bayesian probabilistic inferential methodology for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope production facility) using a small number of activity concentration measurements of a noble gas (Xenon-133) obtained from three stations that form part of the International Monitoring System radionuclide network. The sampling of the resulting posterior distribution of the source parameters is undertaken using a very efficient Markov chain Monte Carlo technique that utilizes a multiple-try differential evolution adaptive Metropolis algorithm with an archive of past states. It is shown that the principal difficulty in the reconstruction lay in the correct specification of the model errors (both scale and structure) for use in the Bayesian inferential methodology. In this context, two different measurement models for incorporation of the model error of the predicted concentrations are considered. The performance of both of these measurement models with respect to their accuracy and precision in the recovery of the source parameters is compared and contrasted. PMID:27379292

  8. Interior reconstruction method based on rotation-translation scanning model.

    PubMed

    Wang, Xianchao; Tang, Ziyue; Yan, Bin; Li, Lei; Bao, Shanglian

    2014-01-01

    In various applications of computed tomography (CT), it is common that the reconstructed object extends beyond the field of view (FOV), or we may intend to use a FOV that covers only the region of interest (ROI) for the sake of reducing radiation dose. These imaging situations lead to interior reconstruction problems, which are difficult cases in CT reconstruction due to the truncated projection data at every view angle. In this paper, an interior reconstruction method is developed based on a rotation-translation (RT) scanning model. The method is implemented by first scanning the reconstructed region, and then scanning a small region outside the support of the reconstructed object after translating the rotation centre. The differentiated backprojection (DBP) images of the reconstruction region and of the small region outside the object can be respectively obtained from the data of the two scans without a data rebinning process. Finally, the projection onto convex sets (POCS) algorithm is applied to reconstruct the interior region. Numerical simulations are conducted to validate the proposed reconstruction method.
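
    The POCS step admits a compact sketch: alternately project the current estimate onto the data-consistency set and onto a prior constraint set such as non-negativity. A toy illustration on a generic underdetermined linear system (the paper applies POCS to DBP images; the system below is illustrative):

```python
import numpy as np

def pocs(A, b, n_iter=2000):
    """Projection Onto Convex Sets: alternate projections onto the
    data-consistency set {x : A x = b} and the non-negativity set."""
    A_pinv = np.linalg.pinv(A)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - A_pinv @ (A @ x - b)   # project onto the affine set A x = b
        x = np.clip(x, 0.0, None)      # project onto {x >= 0}
    return x

# Underdetermined toy problem with a non-negative ground truth.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 8))
x_true = np.array([0.0, 2.0, 0.0, 1.0, 0.0, 0.0, 3.0, 0.0])
b = A @ x_true
x = pocs(A, b)
```

    The iterate converges to a point in the intersection of the two sets, i.e. a non-negative solution consistent with the measurements; additional convex constraints (known subregion values, bounds) slot in as extra projections.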

  9. Accelerated speckle imaging with the ATST visible broadband imager

    NASA Astrophysics Data System (ADS)

    Wöger, Friedrich; Ferayorni, Andrew

    2012-09-01

    The Advanced Technology Solar Telescope (ATST), a 4 meter class telescope for observations of the solar atmosphere currently in its construction phase, will generate data at rates of the order of 10 TB/day with its state-of-the-art instrumentation. The high-priority ATST Visible Broadband Imager (VBI) instrument alone will create two data streams with a bandwidth of 960 MB/s each. Because of the related data handling issues, these data will be post-processed with speckle interferometry algorithms in near-real time at the telescope using the cost-effective Graphics Processing Unit (GPU) technology supported by the ATST Data Handling System. In this contribution, we lay out the VBI-specific approach to its image processing pipeline, put this into the context of the underlying ATST Data Handling System infrastructure, and finally describe in detail how the algorithms were redesigned to exploit data parallelism in the speckle image reconstruction algorithms. An algorithm redesign is often required to efficiently speed up an application using GPU technology; we have chosen NVIDIA's CUDA language as the basis for our implementation. We present preliminary results on algorithm performance obtained at our test facilities and, based on these results, give a conservative estimate of the requirements for a full system that could achieve near-real-time performance at ATST.

  10. Stromal vascular fraction isolated from lipo-aspirates using an automated processing system: bench and bed analysis.

    PubMed

    Doi, Kentaro; Tanaka, Shinsuke; Iida, Hideo; Eto, Hitomi; Kato, Harunosuke; Aoi, Noriyuki; Kuno, Shinichiro; Hirohi, Toshitsugu; Yoshimura, Kotaro

    2013-11-01

    The heterogeneous stromal vascular fraction (SVF), containing adipose-derived stem/progenitor cells (ASCs), can be easily isolated through enzymatic digestion of aspirated adipose tissue. In clinical settings, however, strict control of technical procedures according to standard operating procedures and validation of cell-processing conditions are required. Therefore, we evaluated the efficiency and reliability of an automated system for SVF isolation from adipose tissue. SVF cells, freshly isolated using the automated procedure, showed comparable number and viability to those from manual isolation. Flow cytometric analysis confirmed an SVF cell composition profile similar to that after manual isolation. In addition, the ASC yield after 1 week in culture was also not significantly different between the two groups. Our clinical study, in which SVF cells isolated with the automated system were transplanted with aspirated fat tissue for soft tissue augmentation/reconstruction in 42 patients, showed satisfactory outcomes with no serious side-effects. Taken together, our results suggested that the automated isolation system is as reliable a method as manual isolation and may also be useful in clinical settings. Automated isolation is expected to enable cell-based clinical trials in small facilities with an aseptic room, without the necessity of a good manufacturing practice-level cell processing area.

  11. A graphic user interface for efficient 3D photo-reconstruction based on free software

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; James, Michael; Gómez, Jose A.

    2015-04-01

    Recently, different studies have stressed the applicability of 3D photo-reconstruction based on Structure from Motion algorithms in a wide range of geoscience applications. For the purpose of image photo-reconstruction, a number of commercial and freely available software packages have been developed (e.g. Agisoft Photoscan, VisualSFM). The workflow typically involves different stages such as image matching, sparse and dense photo-reconstruction, point cloud filtering, and georeferencing. For approaches using open and free software, each of these stages usually requires a different application. In this communication, we present an easy-to-use graphic user interface (GUI) developed in Matlab® code as a tool for efficient 3D photo-reconstruction making use of powerful existing software: VisualSFM (Wu, 2015) for photo-reconstruction and CloudCompare (Girardeau-Montaut, 2015) for point cloud processing. The GUI performs as a manager of configurations and algorithms, taking advantage of the command-line modes of the existing software, which allows an intuitive and automated processing workflow for the geoscience user. The GUI includes several additional features: a) a routine for significantly reducing the duration of the image matching operation, normally the most time-consuming stage; b) graphical outputs for understanding the overall performance of the algorithm (e.g. camera connectivity, point cloud density); c) a number of useful options typically performed before and after the photo-reconstruction stage (e.g. removal of blurry images, image renaming, vegetation filtering); d) a manager of batch processing for the automated reconstruction of different image datasets. In this study we explore the advantages of this new tool by testing its performance using imagery collected in several soil erosion applications. References: Girardeau-Montaut, D. 2015. CloudCompare documentation, accessed at http://cloudcompare.org/ Wu, C. 2015. VisualSFM documentation, accessed at http://ccwu.me/vsfm/doc.html#.

  12. The artificial retina processor for track reconstruction at the LHC crossing rate

    DOE PAGES

    Abba, A.; Bedeschi, F.; Citterio, M.; ...

    2015-03-16

    We present results of an R&D study for a specialized processor capable of precisely reconstructing, in pixel detectors, hundreds of charged-particle tracks from high-energy collisions at a 40 MHz rate. We apply a highly parallel pattern-recognition algorithm, inspired by studies of how the brain processes visual images, and describe in detail an efficient hardware implementation in high-speed, high-bandwidth FPGA devices. This is the first detailed demonstration of reconstruction of offline-quality tracks at 40 MHz, and it makes the device suitable for processing Large Hadron Collider events at the full crossing frequency.

  13. A spectral image processing algorithm for evaluating the influence of the illuminants on the reconstructed reflectance

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2017-12-01

    A spectral image processing algorithm that allows the illumination of the scene with different illuminants together with the reconstruction of the scene's reflectance is presented. A color checker spectral image is used, and CIE A (warm light, 2700 K), D65 (cold light, 6500 K), and Cree TW Series LED T8 (4000 K) illuminants are employed for scene illumination. The illuminants used in the simulations have different spectra and, as a result, the colors of the scene change under each of them. The influence of the illuminants on the reconstruction of the scene's reflectance is estimated. Demonstrative images and reflectance spectra illustrating the operation of the algorithm are shown.
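
    The core relation the abstract relies on — the sensor response is the wavelength-wise product of reflectance, illuminant power, and channel sensitivity — can be sketched directly. All spectra below are toy stand-ins (not the CIE A/D65/LED data the paper uses):

```python
import numpy as np

# Wavelength grid (nm) and a toy reflectance spectrum for one scene patch.
wl = np.arange(400, 701, 10)
reflectance = 0.2 + 0.6 * np.exp(-((wl - 620) ** 2) / (2 * 40 ** 2))  # reddish

# Hypothetical illuminant SPDs (stand-ins for warm ~2700 K / cold ~6500 K).
warm = np.linspace(0.4, 1.0, wl.size)   # more power at long wavelengths
cold = np.linspace(1.0, 0.6, wl.size)   # more power at short wavelengths

# Hypothetical broad camera sensitivities for the three channels.
def gauss(center, width):
    return np.exp(-((wl - center) ** 2) / (2 * width ** 2))

sens = np.stack([gauss(600, 50), gauss(540, 50), gauss(460, 50)])  # R, G, B

def rgb(illuminant):
    """Sensor response: integrate reflectance x illuminant x sensitivity."""
    radiance = reflectance * illuminant
    return sens @ radiance

# The same surface yields different colours under different illuminants.
rgb_warm, rgb_cold = rgb(warm), rgb(cold)
```

    Reflectance reconstruction inverts this forward model: given the responses and the known illuminant, estimate `reflectance`, and the quality of that estimate depends on the illuminant's spectrum.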

  14. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES, OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES, METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  15. Multi-threaded Event Processing with DANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Lawrence; Elliott Wolin

    2007-05-14

    The C++ data analysis framework DANA has been written to support the next generation of nuclear physics experiments at Jefferson Lab commensurate with the anticipated 12 GeV upgrade. The DANA framework was designed to allow multi-threaded event processing with minimal impact on developers of reconstruction software. This document describes how DANA implements multi-threaded event processing and compares it to simply running multiple instances of a program. Also presented are relative reconstruction rates for Pentium 4, Xeon, and Opteron based machines.

  16. 18 CFR 157.21 - Pre-filing procedures and review process for LNG terminal facilities and other natural gas...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the pre-filing review of any pipeline or other natural gas facilities, including facilities not... from the subject LNG terminal facilities to the existing natural gas pipeline infrastructure. (b) Other... and review process for LNG terminal facilities and other natural gas facilities prior to filing of...

  17. A fast CT reconstruction scheme for a general multi-core PC.

    PubMed

    Zeng, Kai; Bai, Erwei; Wang, Ge

    2007-01-01

    Expensive computational cost is a severe limitation in CT reconstruction for clinical applications that need real-time feedback. A primary example is bolus-chasing computed tomography (CT) angiography (BCA) that we have been developing for the past several years. To accelerate the reconstruction process using the filtered backprojection (FBP) method, specialized hardware or graphics cards can be used. However, specialized hardware is expensive and not flexible. The graphics processing unit (GPU) in a current graphic card can only reconstruct images at reduced precision and is not easy to program. In this paper, an acceleration scheme is proposed based on a multi-core PC. In the proposed scheme, several techniques are integrated, including utilization of geometric symmetry, optimization of data structures, single-instruction multiple-data (SIMD) processing, multithreaded computation, and the Intel C++ compiler. Our scheme maintains the original precision and involves no data exchange between the GPU and CPU. The merits of our scheme are demonstrated in numerical experiments against the traditional implementation. Our scheme achieves a speedup of about 40, which can be further improved by several folds using the latest quad-core processors.
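
    The thread-level parallelism named above applies naturally to the backprojection step of FBP, since each view contributes independently to the image. The sketch below is our own illustration with toy data (nearest-neighbor sampling, a trivially uniform "filtered" sinogram), not the authors' implementation:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch of thread-parallel parallel-beam backprojection: each view
# is interpolated independently in a worker, then accumulated in the main
# thread. Vectorized NumPy expressions stand in for the SIMD inner loops.
def backproject(filtered_sino, angles, size):
    xs = np.arange(size) - size / 2.0
    xx, yy = np.meshgrid(xs, xs)
    recon = np.zeros((size, size))

    def one_view(i):
        theta = angles[i]
        # Detector coordinate of each pixel for this view (nearest neighbor)
        t = xx * np.cos(theta) + yy * np.sin(theta) + size / 2.0
        idx = np.clip(t.astype(int), 0, filtered_sino.shape[1] - 1)
        return filtered_sino[i, idx]

    with ThreadPoolExecutor(max_workers=4) as pool:
        for view in pool.map(one_view, range(len(angles))):
            recon += view
    return recon * np.pi / len(angles)

sino = np.ones((180, 64))            # toy uniform "filtered" sinogram
angles = np.deg2rad(np.arange(180))
img = backproject(sino, angles, 64)
print(img.shape)  # (64, 64)
```

    The geometric-symmetry optimization the paper mentions would additionally reuse the per-view coordinates `t` between symmetric view angles; that refinement is omitted here for brevity.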

  18. 3D road marking reconstruction from street-level calibrated stereo pairs

    NASA Astrophysics Data System (ADS)

    Soheilian, Bahman; Paparoditis, Nicolas; Boldo, Didier

    This paper presents an automatic approach to road marking reconstruction using stereo pairs acquired by a mobile mapping system in a dense urban area. Two types of road markings were studied: zebra crossings (crosswalks) and dashed lines. These two types of road markings consist of strips having known shape and size. These geometric specifications are used to constrain the recognition of strips. In both cases (i.e. zebra crossings and dashed lines), the reconstruction method consists of three main steps. The first step extracts edge points from the left and right images of a stereo pair and computes 3D linked edges using a matching process. The second step comprises a filtering process that uses the known geometric specifications of road marking objects. The goal is to preserve linked edges that can plausibly belong to road markings and to filter others out. The final step uses the remaining linked edges to fit a theoretical model to the data. The method developed has been used for processing a large number of images. Road markings are successfully and precisely reconstructed in dense urban areas under real traffic conditions.

  19. A Fast CT Reconstruction Scheme for a General Multi-Core PC

    PubMed Central

    Zeng, Kai; Bai, Erwei; Wang, Ge

    2007-01-01

    Expensive computational cost is a severe limitation in CT reconstruction for clinical applications that need real-time feedback. A primary example is bolus-chasing computed tomography (CT) angiography (BCA) that we have been developing for the past several years. To accelerate the reconstruction process using the filtered backprojection (FBP) method, specialized hardware or graphics cards can be used. However, specialized hardware is expensive and not flexible. The graphics processing unit (GPU) in a current graphic card can only reconstruct images at reduced precision and is not easy to program. In this paper, an acceleration scheme is proposed based on a multi-core PC. In the proposed scheme, several techniques are integrated, including utilization of geometric symmetry, optimization of data structures, single-instruction multiple-data (SIMD) processing, multithreaded computation, and the Intel C++ compiler. Our scheme maintains the original precision and involves no data exchange between the GPU and CPU. The merits of our scheme are demonstrated in numerical experiments against the traditional implementation. Our scheme achieves a speedup of about 40, which can be further improved by several folds using the latest quad-core processors. PMID:18256731

  20. A novel pre-processing technique for improving image quality in digital breast tomosynthesis.

    PubMed

    Kim, Hyeongseok; Lee, Taewon; Hong, Joonpyo; Sabir, Sohail; Lee, Jung-Ryun; Choi, Young Wook; Kim, Hak Hee; Chae, Eun Young; Cho, Seungryong

    2017-02-01

    Nonlinear pre-reconstruction processing of the projection data is usually discouraged in computed tomography (CT), where accurate recovery of the CT numbers is important for diagnosis, because such processing would violate the physics of image formation in CT. However, one can devise a pre-processing step to enhance the detectability of lesions in digital breast tomosynthesis (DBT), where accurate recovery of the CT numbers is fundamentally impossible due to the incompleteness of the scanned data. Since the detection of lesions such as micro-calcifications and masses in breasts is the purpose of using DBT, a technique that yields higher lesion detectability is justified. A histogram modification technique was developed in the projection data domain. The histogram of the raw projection data was first divided into two parts: one for the breast projection data and the other for the background. Background pixel values were set to a single value that represents the boundary between breast and background. After that, both histogram parts were shifted by an appropriate offset and the histogram-modified projection data were log-transformed. A filtered-backprojection (FBP) algorithm was used for image reconstruction of DBT. To evaluate the performance of the proposed method, we computed the detectability index for images reconstructed from clinically acquired data. Typical breast border enhancement artifacts were greatly suppressed and the detectability of calcifications and masses was increased by use of the proposed method. Compared to a global threshold-based post-reconstruction processing technique, the proposed method produced images of higher contrast without introducing additional image artifacts. In this work, we report a novel pre-processing technique that improves the detectability of lesions in DBT and has potential advantages over the global threshold-based post-reconstruction processing technique. The proposed method not only increased lesion detectability but also reduced typical image artifacts pronounced in conventional FBP-based DBT. © 2016 American Association of Physicists in Medicine.
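
    The projection-domain histogram modification described above can be sketched in a few lines; the boundary value and offset below are invented placeholders, not the values used in the paper:

```python
import numpy as np

# Illustrative sketch: background pixels are collapsed to the breast/background
# boundary value, both histogram parts are shifted by an offset, and the
# result is log-transformed for FBP reconstruction.
def preprocess_projection(raw, boundary, offset):
    proj = raw.astype(float).copy()
    proj[proj >= boundary] = boundary    # background -> single boundary value
    proj += offset                       # shift both histogram parts
    return -np.log(proj / proj.max())    # standard log transform before FBP

raw = np.array([[5000.0, 5200.0, 900.0],
                [800.0, 5100.0, 700.0]])           # toy raw projection
out = preprocess_projection(raw, boundary=4000.0, offset=100.0)
print(out.min())  # 0.0 at the (collapsed) background level
```

    Because all background pixels map to the same value, the breast border no longer produces the steep intensity step that causes the border-enhancement artifacts mentioned in the abstract.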

  1. Simultaneous two-wavelength holographic interferometry in a superorbital expansion tube facility.

    PubMed

    McIntyre, T J; Wegener, M J; Bishop, A I; Rubinsztein-Dunlop, H

    1997-11-01

    A new variation of holographic interferometry has been utilized to perform simultaneous two-wavelength measurements, allowing quantitative analysis of the heavy particle and electron densities in a superorbital facility. An air test gas accelerated to 12 km/s was passed over a cylindrical model, simulating reentry conditions encountered by a space vehicle on a superorbital mission. Laser beams with two different wavelengths have been overlapped, passed through the test section, and simultaneously recorded on a single holographic plate. Reconstruction of the hologram generated two separate interferograms at different angles from which the quantitative measurements were made. With this technique, a peak electron concentration of (5.5 ± 0.5) × 10^23 m^-3 was found behind a bow shock on a cylinder.

  2. Reconstructing European forest management from 1600 to 2010

    NASA Astrophysics Data System (ADS)

    McGrath, M. J.; Luyssaert, S.; Meyfroidt, P.; Kaplan, J. O.; Buergi, M.; Chen, Y.; Erb, K.; Gimmi, U.; McInerney, D.; Naudts, K.; Otto, J.; Pasztor, F.; Ryder, J.; Schelhaas, M.-J.; Valade, A.

    2015-04-01

    European forest use for fuel, timber and food dates back to pre-Roman times. Century-scale ecological processes and their legacy effects require accounting for forest management when studying today's forest carbon sink. Forest management reconstructions that are used to drive land surface models are one way to quantify the impact of both historical and today's large scale application of forest management on today's forest-related carbon sink and surface climate. In this study we reconstruct European forest management from 1600 to 2010 making use of diverse approaches, data sources and assumptions. Between 1600 and 1828, a demand-supply approach was used in which wood supply was reconstructed based on estimates of historical annual wood increment and land cover reconstructions. For the same period demand estimates accounted for the fuelwood needed in households, wood used in food processing, charcoal used in metal smelting and salt production, timber for construction and population estimates. Comparing estimated demand and supply resulted in a spatially explicit reconstruction of the share of forests under coppice, high stand management and forest left unmanaged. For the reconstruction between 1829 and 2010 a supply-driven back-casting method was used. The method used age reconstructions from the years 1950 to 2010 as its starting point. Our reconstruction reproduces the most important changes in forest management between 1600 and 2010: (1) an increase of 593 000 km2 in conifers at the expense of deciduous forest (decreasing by 538 000 km2), (2) a 612 000 km2 decrease in unmanaged forest, (3) a 152 000 km2 decrease in coppice management, (4) a 818 000 km2 increase in high stand management, and (5) the rise and fall of litter raking which at its peak in 1853 removed 50 Tg dry litter per year.

  3. Ill-posed problem and regularization in reconstruction of radiobiological parameters from serial tumor imaging data

    NASA Astrophysics Data System (ADS)

    Chvetsov, Alevei V.; Sandison, George A.; Schwartz, Jeffrey L.; Rengan, Ramesh

    2015-11-01

    The main objective of this article is to improve the stability of reconstruction algorithms for estimating radiobiological parameters from serial tumor imaging data acquired during radiation therapy. Serial images of tumor response to radiation therapy represent a complex summation of several exponential processes, such as treatment-induced cell inactivation, tumor growth, and the rate of cell loss. Accurate assessment of treatment response requires separation of these processes because they define the radiobiological determinants of treatment response and, correspondingly, tumor control probability. However, the estimation of radiobiological parameters from imaging data can be considered an inverse ill-posed problem, because a sum of several exponentials produces a Fredholm integral equation of the first kind, which is ill posed. Therefore, the stability of reconstruction of radiobiological parameters presents a problem even for the simplest models of tumor response. To study the stability of the parameter reconstruction problem, we used a set of serial CT imaging data for head and neck cancer and the simplest case of a two-level cell-population model of tumor response. Inverse reconstruction was performed using a simulated annealing algorithm to minimize a least-squares objective function. Results show that the reconstructed values of cell surviving fractions and cell doubling time exhibit significant nonphysical fluctuations if no stabilization is applied. However, after applying a stabilization algorithm based on variational regularization, the reconstruction produces statistical distributions for surviving fractions and doubling time that are comparable to published in vitro data. This algorithm is an advance over our previous work, where only cell surviving fractions were reconstructed. We conclude that variational regularization allows an increase in the number of free parameters in our model, which enables the development of more advanced parameter reconstruction algorithms.
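
    The ill-conditioning described here can be seen in a small numerical example of our own (not from the paper): exponentials with nearby decay rates give a nearly collinear design matrix, so small data perturbations swing the fitted parameters wildly, and a Tikhonov-type (variational) regularization term improves the conditioning. The rates and regularization strength are illustrative only:

```python
import numpy as np

# Two exponential basis functions with similar rates sampled over a window:
# their columns are nearly parallel, which is the source of the instability.
t = np.linspace(0.0, 10.0, 50)
E = np.column_stack([np.exp(-0.30 * t), np.exp(-0.35 * t)])
cond_plain = np.linalg.cond(E)

# Tikhonov regularization: augment the least-squares system with sqrt(alpha)*I,
# which lifts the smallest singular value and tames the conditioning.
alpha = 0.1
E_reg = np.vstack([E, np.sqrt(alpha) * np.eye(2)])
cond_reg = np.linalg.cond(E_reg)

print(cond_plain > 20.0, cond_reg < cond_plain)  # True True
```

    The same mechanism is why the unregularized fits in the abstract show nonphysical fluctuations while the variationally regularized fits do not.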

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, Stuart R.

    In this study we analyze and extend mesh-free algorithms for three-dimensional data transfer problems in partitioned multiphysics simulations. We first provide a direct comparison between a mesh-based weighted residual method using the common-refinement scheme and two mesh-free algorithms leveraging compactly supported radial basis functions: one using a spline interpolation and one using a moving least square reconstruction. Through the comparison we assess both the conservation and accuracy of the data transfer obtained from each of the methods. We do so for a varying set of geometries with and without curvature and sharp features and for functions with and without smoothness and with varying gradients. Our results show that the mesh-based and mesh-free algorithms are complementary with cases where each was demonstrated to perform better than the other. We then focus on the mesh-free methods by developing a set of algorithms to parallelize them based on sparse linear algebra techniques. This includes a discussion of fast parallel radius searching in point clouds and restructuring the interpolation algorithms to leverage data structures and linear algebra services designed for large distributed computing environments. The scalability of our new algorithms is demonstrated on a leadership class computing facility using a set of basic scaling studies. Finally, these scaling studies show that for problems with reasonable load balance, our new algorithms for both spline interpolation and moving least square reconstruction demonstrate both strong and weak scalability using more than 100,000 MPI processes with billions of degrees of freedom in the data transfer operation.
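
    The interpolation variant compared above can be sketched in pure NumPy with a compactly supported Wendland C2 radial basis function; the grid, field, and support radius below are our own illustrative choices, not the paper's configuration:

```python
import numpy as np

# Mesh-free data transfer: solve for RBF weights on the source point cloud,
# then evaluate the expansion at the target points.
def wendland_c2(r, support):
    # Wendland C2 kernel, positive definite in R^3, zero beyond `support`
    q = np.clip(r / support, 0.0, 1.0)
    return (1.0 - q) ** 4 * (4.0 * q + 1.0)

def rbf_transfer(src_pts, src_vals, tgt_pts, support):
    d_ss = np.linalg.norm(src_pts[:, None] - src_pts[None, :], axis=-1)
    weights = np.linalg.solve(wendland_c2(d_ss, support), src_vals)
    d_ts = np.linalg.norm(tgt_pts[:, None] - src_pts[None, :], axis=-1)
    return wendland_c2(d_ts, support) @ weights

g = np.linspace(0.0, 1.0, 4)
src = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T    # 64 source points
vals = src[:, 0] + 2.0 * src[:, 1]                       # field to transfer

# Interpolation reproduces the source data exactly at the source points.
print(np.allclose(rbf_transfer(src, vals, src[:5], support=0.7), vals[:5]))  # True
print(rbf_transfer(src, vals, np.array([[0.5, 0.5, 0.5]]), support=0.7))
```

    Because the kernel vanishes beyond the support radius, the distance matrices are sparse in practice, which is what the paper's sparse-linear-algebra parallelization exploits.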

  5. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side-scan sonar arrays in complex and highly reverberating environments such as shallow-water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction-of-arrival trajectories of multiple echoes impinging on the array. Echo tracking is treated as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness-of-fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.

  6. Photogrammetric Network for Evaluation of Human Faces for Face Reconstruction Purpose

    NASA Astrophysics Data System (ADS)

    Schrott, P.; Detrekői, Á.; Fekete, K.

    2012-08-01

    Facial reconstruction is the process of reconstructing the geometry of faces of persons from skeletal remains. A research group (BME Cooperation Research Center for Biomechanics) was formed, representing several organisations, to combine the knowledge bases of different disciplines (anthropology, medicine, mechanics, archaeology, etc.) and to computerize the face reconstruction process based on a large dataset of 3D face and skull models gathered from living persons: cranial data from CT scans and face models from photogrammetric evaluations. The BUTE Dept. of Photogrammetry and Geoinformatics works on the method and technology of 3D data acquisition for the face models. In this paper we present the research and results of the photogrammetric network design, the modelling to deal with visibility constraints, and the investigation of the developed basic photogrammetric configuration to specify the result characteristics to be expected using the device built for the photogrammetric face measurements.

  7. Data preparation and evaluation techniques for x-ray diffraction microscopy

    DOE PAGES

    Steinbrener, Jan; Nelson, Johanna; Huang, Xiaojing; ...

    2010-01-01

    The post-experiment processing of X-ray Diffraction Microscopy data is often time-consuming and difficult. This is mostly due to the fact that even if a preliminary result has been reconstructed, there is no definitive answer as to whether or not a better result with more consistently retrieved phases can still be obtained. In addition, we show here that the first step in data analysis, the assembly of two-dimensional diffraction patterns from a large set of raw diffraction data, is crucial to obtaining reconstructions of highest possible consistency. We have developed software that automates this process and results in consistently accurate diffraction patterns. We have furthermore derived some criteria of validity for a tool commonly used to assess the consistency of reconstructions, the phase retrieval transfer function, and suggest a modified version that has improved utility for judging reconstruction quality.

  8. Respiratory motion correction in emission tomography image reconstruction.

    PubMed

    Reyes, Mauricio; Malandain, Grégoire; Koulibaly, Pierre Malick; González Ballester, Miguel A; Darcourt, Jacques

    2005-01-01

    In emission tomography imaging, respiratory motion causes artifacts in reconstructed lung and cardiac images, which lead to misinterpretation and imprecise diagnosis. Solutions like respiratory gating, correlated dynamic PET techniques, list-mode data-based techniques and others have been tested, with improvements in the spatial activity distribution of lung lesions, but with the disadvantages of requiring additional instrumentation or discarding part of the projection data used for reconstruction. The objective of this study is to incorporate respiratory motion correction directly into the image reconstruction process, without any additional acquisition protocol consideration. To this end, we propose an extension to the Maximum Likelihood Expectation Maximization (MLEM) algorithm that includes a respiratory motion model, which takes into account the displacements and volume deformations produced by the respiratory motion during the data acquisition process. We present results from synthetic simulations incorporating real respiratory motion as well as from phantom and patient data.
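
    The baseline that the proposed motion-corrected extension builds on is the classical MLEM update, x ← x · Aᵀ(y / Ax) / Aᵀ1. A minimal sketch with a toy system matrix (our own illustration, not a real tomographic projector) is:

```python
import numpy as np

# Toy MLEM iteration on a small consistent system. Each multiplicative
# update keeps the estimate positive, preserves total counts, and never
# increases the Kullback-Leibler data discrepancy.
rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, size=(8, 4))       # system matrix (bins x voxels)
x_true = np.array([1.0, 2.0, 0.5, 1.5])
y = A @ x_true                                # noiseless "measured" counts

def kl(y, m):
    # KL data discrepancy minimized by MLEM
    return float(np.sum(y * np.log(y / m) - y + m))

x = np.ones(4)                                # flat initial estimate
sens = A.T @ np.ones(len(y))                  # sensitivity image A^T 1
kl_start = kl(y, A @ x)
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / sens         # multiplicative MLEM step

print(kl(y, A @ x) <= kl_start)  # True
```

    The motion model in the paper enters by replacing `A` with a composition of the static projector and per-frame deformation operators, so that the same multiplicative update accounts for respiratory displacement.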

  9. Research on assessment and improvement method of remote sensing image reconstruction

    NASA Astrophysics Data System (ADS)

    Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping

    2018-01-01

    Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in a remote sensing imaging system can compress images while sampling, which improves efficiency. A method based on two-dimensional principal component analysis (2DPCA) is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; it retains the useful information of the image while suppressing noise. Then, the factors influencing remote sensing image quality are analyzed, and evaluation parameters for quantitative evaluation are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results fit human visual perception, and the proposed method has good application value in the field of remote sensing image processing.

  10. 78 FR 72899 - Draft Guidance for Industry on Registration for Human Drug Compounding Outsourcing Facilities...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-04

    ... facilities. The draft guidance discusses the process for registration of outsourcing facilities. The draft... outsourcing facilities that will participate in the process. Estimated reporting burden until September 30...] Draft Guidance for Industry on Registration for Human Drug Compounding Outsourcing Facilities Under...

  11. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods

    PubMed Central

    Smith, David S.; Gore, John C.; Yankeelov, Thomas E.; Welch, E. Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096^2 or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024^2 and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images. PMID:22481908
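
    The inner kernel that parallelizes so well in split Bregman solvers is the elementwise soft-thresholding (shrinkage) step that updates the sparse auxiliary variable. A minimal CPU sketch (our own illustration; the GPU version applies the same formula per element):

```python
import numpy as np

def shrink(v, lam):
    """Soft threshold: argmin_d lam*|d| + 0.5*(d - v)^2, elementwise.

    Each element is independent, which is why this step maps directly
    onto massively parallel GPU hardware.
    """
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([-2.0, -0.3, 0.0, 0.3, 2.0])
out = shrink(v, 0.5)       # magnitudes below 0.5 vanish; others move toward 0
print(out)
```

    The full algorithm alternates this shrinkage with a quadratic data-consistency subproblem solvable by FFTs, which is the combination the abstract credits for rapid convergence.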

  12. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods.

    PubMed

    Smith, David S; Gore, John C; Yankeelov, Thomas E; Welch, E Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096^2 or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024^2 and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images.

  13. Computer-assisted innovations in craniofacial surgery.

    PubMed

    Rudman, Kelli; Hoekzema, Craig; Rhee, John

    2011-08-01

    Reconstructive surgery for complex craniofacial defects challenges even the most experienced surgeons. Preoperative reconstructive planning requires consideration of both functional and aesthetic properties of the mandible, orbit, and midface. Technological innovations allow for computer-assisted preoperative planning, computer-aided manufacturing of patient-specific implants (PSIs), and computer-assisted intraoperative navigation. Although many case reports discuss computer-assisted preoperative planning and creation of custom implants, a general overview of computer-assisted innovations is not readily available. This article reviews innovations in computer-assisted reconstructive surgery including anatomic considerations when using PSIs, technologies available for preoperative planning, work flow and process of obtaining a PSI, and implant materials available for PSIs. A case example follows illustrating the use of this technology in the reconstruction of an orbital-frontal-temporal defect with a PSI. Computer-assisted reconstruction of complex craniofacial defects provides the reconstructive surgeon with innovative options for challenging reconstructive cases. As technology advances, applications of computer-assisted reconstruction will continue to expand. © Thieme Medical Publishers.

  14. Novel Method of Storing and Reconstructing Events at Fermilab E-906/SeaQuest Using a MySQL Database

    NASA Astrophysics Data System (ADS)

    Hague, Tyler

    2010-11-01

    Fermilab E-906/SeaQuest is a fixed-target experiment at Fermi National Accelerator Laboratory investigating the antiquark asymmetry in the nucleon sea. By examining the ratio of the Drell-Yan cross sections of proton-proton and proton-deuterium collisions, we can determine the asymmetry ratio. An essential feature in the development of the analysis software is updating the event reconstruction to modern software tools. We do this in a unique way, performing a majority of the calculations within an SQL database. Using a MySQL database allows us to take advantage of off-the-shelf software without sacrificing ROOT compatibility, and to avoid network bottlenecks through server-side data selection. From our raw data we create stubs, or partial tracks, at each station, which are pieced together to create full tracks. Our reconstruction process uses dynamically created SQL statements to analyze the data. These SQL statements create tables that contain the final reconstructed tracks as well as intermediate values. This poster explains the reconstruction process and how it is being implemented.
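
    The idea of pairing stubs inside the database can be miniaturized with SQLite; the table layout, column names, and the 0.05 matching window below are invented for illustration and are not SeaQuest's actual schema:

```python
import sqlite3

# Hypothetical miniature of database-side track building: partial tracks
# ("stubs") recorded at two stations are stored as tables, and a SQL join
# on event number plus a slope-matching window pairs them into track
# candidates on the server side, not in client code.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE stubs_st1 (event INTEGER, slope REAL);
    CREATE TABLE stubs_st2 (event INTEGER, slope REAL);
    INSERT INTO stubs_st1 VALUES (1, 0.10), (1, 0.50), (2, -0.20);
    INSERT INTO stubs_st2 VALUES (1, 0.11), (1, 0.90), (2, -0.21);
""")
tracks = con.execute("""
    SELECT a.event, a.slope, b.slope
    FROM stubs_st1 a JOIN stubs_st2 b
      ON a.event = b.event AND ABS(a.slope - b.slope) < 0.05
    ORDER BY a.event, a.slope
""").fetchall()
print(tracks)  # [(1, 0.1, 0.11), (2, -0.2, -0.21)]
```

    Keeping the pairing in SQL means only the matched candidates cross the network, which is the server-side selection advantage the abstract describes.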

  15. [Reconstructive surgery of penile deformities and tissue deficiencies].

    PubMed

    Kelemen, Zsolt

    2009-05-31

    Penile deformities and tissue deficiencies can disturb sexual intercourse or make it impossible. The aim of this study is to summarize the different diseases according to their clinical appearance and pathological processes and to review operative methods and personal experiences. Surgical treatment of hypo- and epispadias is usually performed in childhood, but curvature after an unsuccessful operation can eventually demand reconstruction of the urethra, skin, and corpora cavernosa. Peyronie's disease and curvature after penile fracture require reconstruction of the tunica albuginea. Plaque surgery used to be performed with dermal, tunica vaginalis, or venous grafts, but the best results are obtained by a shortening procedure on the contralateral side according to the Heineke-Mikulicz principle. Tissue deficiencies and curvatures were observed after necrotic inflammatory processes such as Fournier's gangrene or chronic dermatitis. Skin defects were cured by flaps and grafts. Abscesses of the penis, severe tissue defects, and also curvatures were observed after intracavernous injection in cases of erectile dysfunction. Possibilities of reconstruction in such cases seem to be very poor. Oil granuloma of the penis presents a new task for penile reconstruction. The best results of skin replacement were achieved by temporary embedding of the penis in the scrotum.

  16. A comparison of charcoal measurements for reconstruction of Mediterranean paleo-fire frequency in the mountains of Corsica

    NASA Astrophysics Data System (ADS)

    Leys, Bérangère; Carcaillet, Christopher; Dezileau, Laurent; Ali, Adam A.; Bradshaw, Richard H. W.

    2013-05-01

    Fire-history reconstructions inferred from sedimentary charcoal records are based on measuring sieved charcoal fragment area, estimating fragment volume, or counting fragments. Similar fire histories are reconstructed from these three approaches for boreal lake sediment cores, using locally defined thresholds. Here, we test the same approach for a montane Mediterranean lake in which taphonomical processes might differ from boreal lakes through fragmentation of charcoal particles. The Mediterranean charcoal series are characterized by highly variable charcoal accumulation rates. Results there indicate that the three proxies do not provide comparable fire histories. The differences are attributable to charcoal fragmentation. This could be linked to fire type (crown or surface fires) or taphonomical processes, including charcoal transportation in the catchment area or in the sediment. The lack of correlation between the concentration of charcoal and of mineral matter suggests that fragmentation is not linked to erosion. Reconstructions based on charcoal area are more robust and stable than those based on fragment counts. Area-based reconstructions should therefore be used instead of the particle-counting method when fragmentation may influence the fragment abundance.

  17. Pan-sharpening via compressed superresolution reconstruction and multidictionary learning

    NASA Astrophysics Data System (ADS)

    Shi, Cheng; Liu, Fang; Li, Lingling; Jiao, Licheng; Hao, Hongxia; Shang, Ronghua; Li, Yangyang

    2018-01-01

    In recent compressed sensing (CS)-based pan-sharpening algorithms, performance is affected by two key problems. One is that there are always errors between the high-resolution panchromatic (HRP) image and the linearly weighted high-resolution multispectral (HRM) image, resulting in loss of spatial and spectral information. The other is that the dictionary construction process depends on training samples that are not ground truth. These problems have limited the application of CS-based pan-sharpening algorithms. To solve these two problems, we propose a pan-sharpening algorithm via compressed superresolution reconstruction and multidictionary learning. Through a two-stage implementation, the compressed superresolution reconstruction model effectively reduces the error between the HRP and linearly weighted HRM images. Meanwhile, a multidictionary with ridgelet and curvelet components is learned for both stages of the superresolution reconstruction process. Since ridgelets and curvelets can better capture structural and directional characteristics, a better reconstruction result can be obtained. Experiments are done on QuickBird and IKONOS satellite images. The results indicate that the proposed algorithm is competitive with recent CS-based pan-sharpening methods and other well-known methods.

  18. Diffraction Correlation to Reconstruct Highly Strained Particles

    NASA Astrophysics Data System (ADS)

    Brown, Douglas; Harder, Ross; Clark, Jesse; Kim, J. W.; Kiefer, Boris; Fullerton, Eric; Shpyrko, Oleg; Fohtung, Edwin

    2015-03-01

    Through the use of coherent x-ray diffraction, a three-dimensional diffraction pattern of a highly strained nano-crystal can be recorded in reciprocal space by a detector. Only the intensities are recorded, resulting in a loss of the complex phase. The recorded diffraction pattern therefore requires computational processing to reconstruct the density and complex phase distribution of the diffracting nano-crystal. For highly strained crystals, standard methods using the HIO and ER algorithms are no longer sufficient to reconstruct the diffraction pattern. Our solution is to correlate the symmetry in reciprocal space to generate an a priori shape constraint that guides the computational reconstruction. This approach has improved the ability to accurately reconstruct highly strained nano-crystals.
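
    The Error Reduction (ER) baseline that the correlation-derived support constraint improves on alternates between imposing the measured Fourier magnitudes and zeroing density outside a support mask. A minimal 1-D sketch with an invented toy object (our own illustration, not the authors' code):

```python
import numpy as np

# One ER cycle: enforce measured diffraction moduli in reciprocal space,
# then enforce the support (and reality) constraint in object space.
rng = np.random.default_rng(0)
support = np.zeros(64, dtype=bool)
support[20:30] = True
obj = np.where(support, rng.uniform(0.5, 1.0, 64), 0.0)
measured_mag = np.abs(np.fft.fft(obj))           # "recorded" diffraction moduli

def data_error(g):
    return np.linalg.norm(np.abs(np.fft.fft(g)) - measured_mag)

guess = support.astype(float)                     # flat start inside support
err_start = data_error(guess)
for _ in range(200):
    F = np.fft.fft(guess)
    F = measured_mag * np.exp(1j * np.angle(F))   # impose measured modulus
    guess = np.fft.ifft(F).real                   # back to object space, real
    guess[~support] = 0.0                         # impose support constraint

# ER never increases this residual from one iteration to the next.
print(data_error(guess) <= err_start)  # True
```

    For highly strained crystals the phase of the object varies strongly inside the support, and iterations like this stagnate; deriving a tighter a priori support from the symmetry of the measured pattern is the remedy the abstract proposes.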

  19. Modeling the Non-Equilibrium Process of the Chemical Adsorption of Ammonia on GaN(0001) Reconstructed Surfaces Based on Steepest-Entropy-Ascent Quantum Thermodynamics.

    PubMed

    Kusaba, Akira; Li, Guanchen; von Spakovsky, Michael R; Kangawa, Yoshihiro; Kakimoto, Koichi

    2017-08-15

    Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and Nad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict the behavior of non-equilibrium processes, even those far from equilibrium where the state evolution is a combination of reversible and irreversible dynamics. SEAQT is an ideal choice to handle this problem on a first-principles basis since the chemical adsorption process starts from a highly non-equilibrium state. A result of the analysis shows that the probability of adsorption on 3Ga-H is significantly higher than that on Nad-H + Ga-H. Additionally, the growth temperature dependence of these adsorption probabilities and the temperature increase due to the heat of reaction is determined. The non-equilibrium thermodynamic modeling applied can lead to better control of the MOVPE process through the selection of preferable reconstructed surfaces. The modeling also demonstrates the efficacy of DFT-SEAQT coupling for determining detailed non-equilibrium process characteristics with a much smaller computational burden than would be entailed with mechanics-based, microscopic-mesoscopic approaches.

  20. Modeling the Non-Equilibrium Process of the Chemical Adsorption of Ammonia on GaN(0001) Reconstructed Surfaces Based on Steepest-Entropy-Ascent Quantum Thermodynamics

    PubMed Central

    Kusaba, Akira; von Spakovsky, Michael R.; Kangawa, Yoshihiro; Kakimoto, Koichi

    2017-01-01

    Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and Nad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict the behavior of non-equilibrium processes, even those far from equilibrium where the state evolution is a combination of reversible and irreversible dynamics. SEAQT is an ideal choice to handle this problem on a first-principles basis since the chemical adsorption process starts from a highly non-equilibrium state. A result of the analysis shows that the probability of adsorption on 3Ga-H is significantly higher than that on Nad-H + Ga-H. Additionally, the growth temperature dependence of these adsorption probabilities and the temperature increase due to the heat of reaction is determined. The non-equilibrium thermodynamic modeling applied can lead to better control of the MOVPE process through the selection of preferable reconstructed surfaces. The modeling also demonstrates the efficacy of DFT-SEAQT coupling for determining detailed non-equilibrium process characteristics with a much smaller computational burden than would be entailed with mechanics-based, microscopic-mesoscopic approaches. PMID:28809816

  1. Dense soft tissue 3D reconstruction refined with super-pixel segmentation for robotic abdominal surgery.

    PubMed

    Penza, Veronica; Ortiz, Jesús; Mattos, Leonardo S; Forgione, Antonello; De Momi, Elena

    2016-02-01

    Single-incision laparoscopic surgery decreases postoperative infections, but introduces limitations in the surgeon's maneuverability and in the surgical field of view. This work aims at enhancing intra-operative surgical visualization by exploiting the 3D information about the surgical site. An interactive guidance system is proposed wherein the pose of preoperative tissue models is updated online. A critical process involves the intra-operative acquisition of tissue surfaces. It can be achieved using stereoscopic imaging and 3D reconstruction techniques. This work contributes to this process by proposing new methods for improved dense 3D reconstruction of soft tissues, which allows more accurate deformation identification and facilitates the registration process. Two methods for soft tissue 3D reconstruction are proposed: Method 1 follows the traditional approach of the block matching algorithm. Method 2 performs a nonparametric modified census transform to be more robust to illumination variation. The simple linear iterative clustering (SLIC) super-pixel algorithm is exploited for disparity refinement by filling holes in the disparity images. The methods were validated using two video datasets from the Hamlyn Centre, achieving an accuracy of 2.95 and 1.66 mm, respectively. A comparison with ground-truth data demonstrated that the disparity refinement procedure: (1) increases the number of reconstructed points by up to 43 % and (2) does not affect the accuracy of the 3D reconstructions significantly. Both methods give results that compare favorably with the state-of-the-art methods. The computational time constrains their real-time applicability, but it can be greatly reduced by using a GPU implementation.
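    The "nonparametric modified census transform" of Method 2 builds on the plain census transform, which encodes only the local intensity ordering around each pixel and is therefore invariant to monotonic illumination changes. A small numpy sketch of a basic census transform with its Hamming matching cost (an illustration of the principle, not the authors' exact variant):

```python
import numpy as np

def census_transform(img, radius=1):
    """Census transform: encode each pixel as a bit string of comparisons
    between its neighbourhood and the centre value. Returns an integer
    image with the comparison bits packed into uint64 values."""
    H, W = img.shape
    out = np.zeros((H, W), dtype=np.uint64)
    bit = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out |= (shifted < img).astype(np.uint64) << np.uint64(bit)
            bit += 1
    return out

def hamming_cost(census_a, census_b):
    """Matching cost between two census images: per-pixel Hamming distance."""
    x = census_a ^ census_b
    cost = np.zeros(x.shape, dtype=np.uint8)
    while np.any(x):                       # portable popcount loop
        cost += (x & np.uint64(1)).astype(np.uint8)
        x = x >> np.uint64(1)
    return cost
```

    Because only orderings are encoded, a brightness change such as `2*img + 10` leaves the census codes, and hence the matching cost, untouched; this is the robustness to illumination variation the record mentions.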

  2. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Pan, X; Stayman, J

    2014-06-15

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater: for example, improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how the imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.

  3. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure for manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to the lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.
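    The confidence intervals on fitting parameters mentioned above can be sketched for the simplest geometric feature, a line fitted to extracted edge points. A hypothetical numpy example (our own illustration of the general residual-based error model, not the paper's code):

```python
import numpy as np

def line_fit_with_uncertainty(xs, ys):
    """Least-squares line fit y = a*x + b with parameter standard errors
    estimated from the residuals of the extracted edge points."""
    A = np.column_stack([xs, np.ones_like(xs)])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    resid = ys - A @ coef
    dof = len(xs) - 2                       # two fitted parameters
    sigma2 = resid @ resid / dof            # residual variance of edge localization
    cov = sigma2 * np.linalg.inv(A.T @ A)   # parameter covariance matrix
    return coef, np.sqrt(np.diag(cov))
```

    The standard errors shrink as edge localization improves, which is the quantity that is then propagated through calibration, matching, and reconstruction to bound the final 3D position uncertainty.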

  4. Birth-death models and coalescent point processes: the shape and probability of reconstructed phylogenies.

    PubMed

    Lambert, Amaury; Stadler, Tanja

    2013-12-01

    Forward-in-time models of diversification (i.e., speciation and extinction) produce phylogenetic trees that grow "vertically" as time goes by. Pruning the extinct lineages out of such trees leads to natural models for reconstructed trees (i.e., phylogenies of extant species). Alternatively, reconstructed trees can be modelled by coalescent point processes (CPPs), where trees grow "horizontally" by the sequential addition of vertical edges. Each new edge starts at some random speciation time and ends at the present time; speciation times are drawn from the same distribution independently. CPPs lead to extremely fast computation of tree likelihoods and simulation of reconstructed trees. Their topology always follows the uniform distribution on ranked tree shapes (URT). We characterize which forward-in-time models lead to URT reconstructed trees and among these, which lead to CPP reconstructed trees. We show that for any "asymmetric" diversification model in which speciation rates only depend on time and extinction rates only depend on time and on a non-heritable trait (e.g., age), the reconstructed tree is CPP, even if extant species are incompletely sampled. If rates additionally depend on the number of species, the reconstructed tree is (only) URT (but not CPP). We characterize the common distribution of speciation times in the CPP description, and discuss incomplete species sampling as well as three special model cases in detail: (1) the extinction rate does not depend on a trait; (2) rates do not depend on time; (3) mass extinctions may happen additionally at certain points in the past. Copyright © 2013 Elsevier Inc. All rights reserved.
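    The CPP construction described above is simple enough to state in a few lines: node depths are drawn i.i.d., and the tree terminates at the first draw exceeding the stem age. A hypothetical numpy sketch (illustrative, not the authors' code; the geometric tip-count property asserted below is a standard consequence of this construction):

```python
import numpy as np

def simulate_cpp_node_depths(stem_age, depth_sampler, rng):
    """Simulate the node depths of one reconstructed tree under a coalescent
    point process: i.i.d. speciation depths are added sequentially until a
    draw exceeds the stem age, which terminates the tree.

    Returns the list of node depths; the tree has len(depths) + 1 tips."""
    depths = []
    while True:
        h = depth_sampler(rng)
        if h >= stem_age:
            break
        depths.append(h)
    return depths
```

    Because each edge is added independently, the number of tips is geometric with success probability P(H >= stem_age), which is what makes likelihood computation and simulation of reconstructed trees so fast.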

  5. An Application of Business Process Management to Health Care Facilities.

    PubMed

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, together with actions from operations management, are presented to implement it. Business process management offers managers of health care facilities a more comprehensive approach to improving their facilities than Lean, Six Sigma, business process reengineering, and ad hoc approaches, and it does not conflict with them because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide them, relieve them from having to select among these approaches, and provide them with specific steps and actions to follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities, with specific steps and actions for implementation.

  6. Reconstruction and Analysis for the DUNE 35-ton Liquid Argon Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallbank, Michael James

    Neutrino physics is approaching the precision era, with current and future experiments aiming to perform highly accurate measurements of the parameters which govern the phenomenon of neutrino oscillations. The ultimate ambition with these results is to search for evidence of CP-violation in the lepton sector, currently hinted at in the world-leading analyses from present experiments, which may explain the dominance of matter over antimatter in the Universe. The Deep Underground Neutrino Experiment (DUNE) is a future long-baseline experiment based at Fermi National Accelerator Laboratory (FNAL), with a far detector at the Sanford Underground Research Facility (SURF) and a baseline of 1300 km. In order to make the required precision measurements, the far detector will consist of 40 kton of liquid argon and an embedded time projection chamber. This promising technology is still in development and, since each detector module is around a factor of 15 larger than any previous experiment employing this design, prototyping the detector and design choices is critical to the success of the experiment. The 35-ton experiment was constructed for this purpose and is described in detail in this thesis. The outcomes of the 35-ton prototype are already influencing DUNE and, following the successes and lessons learned from the experiment, confidence can be taken forward to the next stage of the DUNE programme. The main oscillation signal at DUNE will be electron neutrino appearance from the muon neutrino beam. High-precision studies of these νe interactions require advanced processing and event reconstruction techniques, particularly in the handling of showering particles such as electrons and photons. Novel methods developed for the purposes of shower reconstruction in liquid argon are presented, with the aim of developing a selection for use in a νe charged-current analysis, and a first-generation selection using the new techniques is presented.

  7. Biomimetic Composite Scaffold for Breast Reconstruction Following Tumor Resection

    DTIC Science & Technology

    2005-09-01

    developing an innovative biomimetic scaffold material by combining two natural polymers: silk fibroin (from the Bombyx mori silkworm) and chitosan (from...M.D. Anderson Cancer Center (UTMDACC) is a component of the University of Texas System. It is one of the nation’s original three Comprehensive Cancer...follow protocols • Properly collect, measure, and document data from the lab, clinic, and/or animal facilities 22 Table 2. Origin of Country and

  8. Review of digital holography reconstruction methods

    NASA Astrophysics Data System (ADS)

    Dovhaliuk, Rostyslav Yu.

    2018-01-01

    The development of digital holography has opened new ways for the non-destructive study of both transparent and opaque objects. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed. The details of a software implementation of digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations about possible use cases for each of them are given.
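    Among the common wave propagation methods such a review covers, the angular spectrum method is the standard FFT-based workhorse for numerically refocusing a recorded hologram. A minimal numpy sketch (our illustration; the parameter names are ours):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field by distance z using the angular
    spectrum method, a standard digital hologram reconstruction step.

    field: (ny, nx) complex field sampled on a grid with pixel pitch dx."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)    # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

    When no evanescent components are clipped, the transfer function is unitary, so propagating forward and then backward by the same distance recovers the original field.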

  9. [Applicability of Pedicled Coronoid Process and Temporal Muscle (Fascial) Combined (PCPTM) Flap for Reconstruction of Orbital Floor Defect Following Hemi-Maxillectomy for Advanced Maxillary Cancer - A Report of Two Cases].

    PubMed

    Karino, Masaaki; Kanno, Takahiro; Kaneko, Ichiro; Ide, Taichi; Yoshino, Aya; Sekine, Joji

    2017-11-01

    We usually perform surgery for resectable oral and maxillofacial carcinomas. Following complete cancer resection, reconstruction of soft and hard tissues using various types of local flaps and/or vascularized free flaps is usually performed. The maxilla is composed of various anatomical structures. In particular, reconstruction of the orbit is one of the most important and challenging procedures for prevention of functional and esthetic complications. Here we report 2 cases of orbital floor defect reconstruction following advanced maxillary cancer resection using a pedicled coronoid process and temporal muscle (fascial) combined (PCPTM) flap. Case 1: A 69-year-old Japanese man with squamous cell carcinoma of the left maxilla (cT4aN2bM0, Stage IVA). Case 2: An 86-year-old Japanese woman with recurrence of myoepithelial carcinoma of the left maxilla. In both cases, the orbital floor defect was reconstructed following hemi-maxillectomy using a PCPTM flap. Minor infection and/or partial necrosis were observed postoperatively, and a maxillofacial prosthesis was used in one case. A PCPTM flap was feasible for reconstruction of surgical defects of the orbital floor following maxillectomy for cancer.

  10. Statistical image reconstruction from correlated data with applications to PET

    PubMed Central

    Alessio, Adam; Sauer, Ken; Kinahan, Paul

    2008-01-01

    Most statistical reconstruction methods for emission tomography are designed for data modeled as conditionally independent Poisson variates. In reality, due to scanner detectors, electronics and data processing, correlations are introduced into the data resulting in dependent variates. In general, these correlations are ignored because they are difficult to measure and lead to computationally challenging statistical reconstruction algorithms. This work addresses the second concern, seeking to simplify the reconstruction of correlated data and provide a more precise image estimate than the conventional independent methods. In general, correlated variates have a large non-diagonal covariance matrix that is computationally challenging to use as a weighting term in a reconstruction algorithm. This work proposes two methods to simplify the use of a non-diagonal covariance matrix as the weighting term by (a) limiting the number of dimensions in which the correlations are modeled and (b) adopting flexible, yet computationally tractable, models for correlation structure. We apply and test these methods with simple simulated PET data and data processed with the Fourier rebinning algorithm which include the one-dimensional correlations in the axial direction and the two-dimensional correlations in the transaxial directions. The methods are incorporated into a penalized weighted least-squares 2D reconstruction and compared with a conventional maximum a posteriori approach. PMID:17921576
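    The penalized weighted least-squares (PWLS) criterion with a full covariance weighting, which this paper works to simplify, can be written in closed form for a small problem. A toy numpy sketch (illustrative only; forming and inverting the full non-diagonal covariance is exactly the computational burden the paper's limited-dimension correlation models are designed to avoid):

```python
import numpy as np

def pwls_reconstruct(A, y, C, beta=0.0):
    """Penalized weighted least-squares estimate
        x* = argmin_x (y - A x)^T C^{-1} (y - A x) + beta * ||x||^2
    with a full (possibly non-diagonal) data covariance C."""
    W = np.linalg.inv(C)                       # inverse covariance as weighting
    H = A.T @ W @ A + beta * np.eye(A.shape[1])
    return np.linalg.solve(H, A.T @ W @ y)
```

    Setting `C` to a diagonal matrix recovers the conventional independent-variates weighting; the paper's contribution is making the off-diagonal (inter-sinogram-row) correlations tractable rather than ignoring them.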

  11. Photogrammetry for rapid prototyping: development of noncontact 3D reconstruction technologies

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.

    2002-04-01

    An important stage of rapid prototyping technology is generating computer 3D model of an object to be reproduced. Wide variety of techniques for 3D model generation exists beginning with manual 3D models generation and finishing with full-automated reverse engineering system. The progress in CCD sensors and computers provides the background for integration of photogrammetry as an accurate 3D data source with CAD/CAM. The paper presents the results of developing photogrammetric methods for non-contact spatial coordinates measurements and generation of computer 3D model of real objects. The technology is based on object convergent images processing for calculating its 3D coordinates and surface reconstruction. The hardware used for spatial coordinates measurements is based on PC as central processing unit and video camera as image acquisition device. The original software for Windows 9X realizes the complete technology of 3D reconstruction for rapid input of geometry data in CAD/CAM systems. Technical characteristics of developed systems are given along with the results of applying for various tasks of 3D reconstruction. The paper describes the techniques used for non-contact measurements and the methods providing metric characteristics of reconstructed 3D model. Also the results of system application for 3D reconstruction of complex industrial objects are presented.

  12. A defocus-information-free autostereoscopic three-dimensional (3D) digital reconstruction method using direct extraction of disparity information (DEDI)

    NASA Astrophysics Data System (ADS)

    Li, Da; Cheung, Chifai; Zhao, Xing; Ren, Mingjun; Zhang, Juan; Zhou, Liqiu

    2016-10-01

    Autostereoscopy based three-dimensional (3D) digital reconstruction has been widely applied in the field of medical science, entertainment, design, industrial manufacture, precision measurement and many other areas. The 3D digital model of the target can be reconstructed based on the series of two-dimensional (2D) information acquired by the autostereoscopic system, which consists multiple lens and can provide information of the target from multiple angles. This paper presents a generalized and precise autostereoscopic three-dimensional (3D) digital reconstruction method based on Direct Extraction of Disparity Information (DEDI) which can be used to any transform autostereoscopic systems and provides accurate 3D reconstruction results through error elimination process based on statistical analysis. The feasibility of DEDI method has been successfully verified through a series of optical 3D digital reconstruction experiments on different autostereoscopic systems which is highly efficient to perform the direct full 3D digital model construction based on tomography-like operation upon every depth plane with the exclusion of the defocused information. With the absolute focused information processed by DEDI method, the 3D digital model of the target can be directly and precisely formed along the axial direction with the depth information.

  13. The Paris to Lexington Road Reconstruction Project.

    DOT National Transportation Integrated Search

    2001-09-01

    This report summarizes the effort to provide the Kentucky Transportation Cabinet with an evaluation of the results obtained for the Paris to Lexington Road Reconstruction Project from 1997 to 2001. A unique pre-qualification process was used for the ...

  14. Assessing the quality of restored images in optical long-baseline interferometry

    NASA Astrophysics Data System (ADS)

    Gomes, Nuno; Garcia, Paulo J. V.; Thiébaut, Éric

    2017-03-01

    Assessing the quality of aperture synthesis maps is relevant for benchmarking image reconstruction algorithms, for the scientific exploitation of data from optical long-baseline interferometers, and for the design or upgrade of new or existing interferometric imaging facilities. Although metrics have been proposed in these contexts, no systematic study has been conducted on the selection of a robust metric for quality assessment. This article addresses the question: what is the best metric to assess the quality of a reconstructed image? It starts by considering several metrics and selecting a few based on general properties. Then, a variety of image reconstruction cases are considered. The observational scenarios are phase closure and phase referencing at the Very Large Telescope Interferometer (VLTI), for a combination of two, three, four and six telescopes. End-to-end image reconstruction is accomplished with the MIRA software, and several merit functions are put to the test. It is found that convolution by an effective point spread function is required for proper image quality assessment. The effective angular resolution of the images is superior to the naive expectation based on the maximum frequency sampled by the array. This is due to the prior information used in the aperture synthesis algorithm and to the nature of the objects considered. The ℓ1-norm is the most robust of all the considered metrics because, being linear, it is less sensitive to image smoothing by high regularization levels. For the cases considered, this metric allows the implementation of automatic quality assessment of reconstructed images, with a performance similar to human selection.
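    The article's conclusion, that the ground truth should be convolved with an effective point spread function before an ℓ1 distance is taken, can be sketched directly. A hypothetical numpy implementation of such a metric (the circular-convolution shortcut and the normalisation choice are ours):

```python
import numpy as np

def l1_quality(reconstruction, truth, psf):
    """Image-quality metric: convolve the ground-truth object with an
    effective PSF, then take the normalised l1 distance to the
    reconstruction. Lower is better; 0 means a perfect match."""
    # Circular convolution via FFT; the PSF is assumed centred at pixel (0, 0).
    blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(psf)))
    return np.sum(np.abs(reconstruction - blurred)) / np.sum(np.abs(blurred))
```

    With a delta-function PSF this degenerates to a plain normalised ℓ1 distance; a realistic effective PSF prevents the metric from penalising reconstructions for resolution the array never measured.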

  15. Astronomical data analysis software and systems I; Proceedings of the 1st Annual Conference, Tucson, AZ, Nov. 6-8, 1991

    NASA Technical Reports Server (NTRS)

    Worrall, Diana M. (Editor); Biemesderfer, Chris (Editor); Barnes, Jeannette (Editor)

    1992-01-01

    Consideration is given to the definition of a distribution format for X-ray data, the Einstein on-line system, the NASA/IPAC extragalactic database, the COBE (Cosmic Background Explorer) astronomical databases, the ADAM software environment, the Groningen Image Processing System, the search for a common data model for astronomical data analysis systems, deconvolution for real and synthetic apertures, pitfalls in image reconstruction, a direct method for spectral and image restoration, and a description of a Poisson imagery super-resolution algorithm. Also discussed are multivariate statistics on HI and IRAS images, faint object classification using neural networks, a matched filter for improving the SNR of radio maps, automated aperture photometry of CCD images, an interactive graphics interpreter, the ROSAT extreme ultraviolet sky survey, a quantitative study of optimal extraction, automated analysis of spectra, applications of synthetic photometry, an algorithm for extra-solar planet system detection, and data reduction facilities for the William Herschel telescope.

  16. Facile preparation of gold nanocages and hollow gold nanospheres via solvent thermal treatment and their surface plasmon resonance and photothermal properties.

    PubMed

    Wang, Haifei; Han, Jing; Lu, Wensheng; Zhang, Jianping; Li, Jinru; Jiang, Long

    2015-02-15

    Although template etching is one of the most common methods for preparing hollow gold nanostructures, the approach still requires improvements to avoid the collapse of the gold shells after the cores are removed. In this work, an improved template etching method is demonstrated, in which hollow gold nanostructures are fabricated by etching polystyrene (PS) cores from PS@Au core-shell nanospheres with a solvent thermal treatment in N,N-dimethylformamide (DMF). When the PS cores are removed by the thermal treatment process, the gold nanoshells reconstruct and their collapse is avoided. Gold nanocages and hollow gold nanospheres are easily obtained from the various structures of PS@Au core-shell nanospheres. These hollow nanostructures exhibit distinctive near-infrared (NIR) optical and photothermal properties. Compared with hollow gold nanospheres, the gold nanocages show a higher temperature increase at the same particle concentration. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Jini service to reconstruct tomographic data

    NASA Astrophysics Data System (ADS)

    Knoll, Peter; Mirzaei, S.; Koriska, K.; Koehn, H.

    2002-06-01

    A number of imaging systems rely on the reconstruction of a 3-dimensional model from its projections through the process of computed tomography (CT). In medical imaging, for example, magnetic resonance imaging (MRI), positron emission tomography (PET), and single photon emission computed tomography (SPECT) acquire two-dimensional projections of a three-dimensional object. In order to calculate the 3-dimensional representation of the object, i.e. its voxel distribution, several reconstruction algorithms have been developed. Currently, mainly two reconstruction approaches are in use: filtered back projection (FBP) and iterative methods. Although the quality of iteratively reconstructed SPECT slices is better than that of FBP slices, such iterative algorithms are rarely used for clinical routine studies because of their low availability and increased reconstruction time. We used Jini and a self-developed iterative reconstruction algorithm to design and implement a Jini reconstruction service. With this service, the physician selects the patient study from a database and a Jini client automatically discovers the registered Jini reconstruction services in the department's Intranet. After downloading the proxy object of this Jini service, the SPECT acquisition data are reconstructed. The resulting transaxial slices are visualized using a Jini slice viewer, which can be used for various imaging modalities.
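    The iterative alternatives to FBP mentioned here are typified by MLEM (maximum-likelihood expectation-maximisation), whose multiplicative update is short enough to sketch. An illustrative numpy version for a tiny system matrix (a generic sketch; the record does not specify the service's actual algorithm):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """MLEM for emission tomography with system matrix A
    (projections = A @ activity) and measured counts y. Update:
        x <- x / (A^T 1) * A^T (y / (A x))
    The multiplicative form keeps the activity estimate non-negative."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x = x / sens * (A.T @ ratio)
    return x
```

    The per-iteration forward and back projections are what make iterative reconstruction so much slower than FBP, which is the availability/runtime trade-off the record describes the Jini service addressing.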

  18. Investigation of undersampling and reconstruction algorithm dependence on respiratory correlated 4D-MRI for online MR-guided radiation therapy

    NASA Astrophysics Data System (ADS)

    Mickevicius, Nikolai J.; Paulson, Eric S.

    2017-04-01

    The purpose of this work is to investigate the effects of undersampling and reconstruction algorithm on the total processing time and image quality of respiratory phase-resolved 4D-MRI data. Specifically, the goal is to obtain quality 4D-MRI data with a combined acquisition and reconstruction time of five minutes or less, which we reasoned would be satisfactory for pre-treatment 4D-MRI in online MR-guided radiation therapy. A 3D stack-of-stars (SoS), self-navigated 4D-MRI acquisition was used to scan three healthy volunteers at three image resolutions and two scan durations. The NUFFT, CG-SENSE, SPIRiT, and XD-GRASP reconstruction algorithms were used to reconstruct each dataset on a high-performance reconstruction computer. The overall image quality, reconstruction time, artifact prevalence, and motion estimates were compared. The CG-SENSE and XD-GRASP reconstructions provided superior image quality over the other algorithms. The combination of a 3D SoS sequence and parallelized reconstruction algorithms, run on computing hardware more advanced than that typically seen on product MRI scanners, can result in the acquisition and reconstruction of high-quality respiratory-correlated 4D-MRI images in less than five minutes.

  19. Semi-automated Image Processing for Preclinical Bioluminescent Imaging.

    PubMed

    Slavine, Nikolai V; McColl, Roderick W

    Bioluminescent imaging is a valuable noninvasive technique for investigating tumor dynamics and specific biological molecular events in living animals, to better understand the effects of human disease in animal models. The purpose of this study was to develop and test a strategy of automated methods for bioluminescence image processing, from data acquisition to obtaining 3D images. In order to optimize this procedure, a semi-automated image processing approach with a multi-modality image handling environment was developed. To identify the location and strength of a bioluminescent source, we used the light flux detected on the surface of the imaged object by CCD cameras. For phantom calibration tests and object surface reconstruction we used the MLEM algorithm. For internal bioluminescent sources we used the diffusion approximation, balancing the internal and external intensities on the boundary of the medium; after determining a first-order approximation for the photon fluence, we applied a novel iterative deconvolution method to obtain the final reconstruction result. We find that the reconstruction techniques successfully used the depth-dependent light transport approach and semi-automated image processing to provide a realistic 3D model of the lung tumor. Our image processing software can optimize and decrease the time of volumetric imaging and quantitative assessment. The data obtained from light phantom and mouse lung tumor images demonstrate the utility of the image reconstruction algorithms and of the semi-automated approach to the bioluminescent image processing procedure. We suggest that the developed image processing approach can be applied to preclinical imaging studies: characterizing tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment.

  20. The NUMEN project: NUclear Matrix Elements for Neutrinoless double beta decay

    NASA Astrophysics Data System (ADS)

    Cappuzzello, F.; Agodi, C.; Cavallaro, M.; Carbone, D.; Tudisco, S.; Lo Presti, D.; Oliveira, J. R. B.; Finocchiaro, P.; Colonna, M.; Rifuggiato, D.; Calabretta, L.; Calvo, D.; Pandola, L.; Acosta, L.; Auerbach, N.; Bellone, J.; Bijker, R.; Bonanno, D.; Bongiovanni, D.; Borello-Lewin, T.; Boztosun, I.; Brunasso, O.; Burrello, S.; Calabrese, S.; Calanna, A.; Chávez Lomelí, E. R.; D'Agostino, G.; De Faria, P. N.; De Geronimo, G.; Delaunay, F.; Deshmukh, N.; Ferreira, J. L.; Fisichella, M.; Foti, A.; Gallo, G.; Garcia-Tecocoatzi, H.; Greco, V.; Hacisalihoglu, A.; Iazzi, F.; Introzzi, R.; Lanzalone, G.; Lay, J. A.; La Via, F.; Lenske, H.; Linares, R.; Litrico, G.; Longhitano, F.; Lubian, J.; Medina, N. H.; Mendes, D. R.; Moralles, M.; Muoio, A.; Pakou, A.; Petrascu, H.; Pinna, F.; Reito, S.; Russo, A. D.; Russo, G.; Santagati, G.; Santopinto, E.; Santos, R. B. B.; Sgouros, O.; da Silveira, M. A. G.; Solakci, S. O.; Souliotis, G.; Soukeras, V.; Spatafora, A.; Torresi, D.; Magana Vsevolodovna, R.; Yildirim, A.; Zagatto, V. A. B.

    2018-05-01

    The article describes the main achievements of the NUMEN project, together with an updated and detailed overview of the related R&D activities and theoretical developments. NUMEN proposes an innovative technique to access the nuclear matrix elements entering the expression for the lifetime of double beta decay through cross section measurements of heavy-ion induced Double Charge Exchange (DCE) reactions. Although the two processes, neutrinoless double beta decay and DCE reactions, are triggered by the weak and strong interaction respectively, important analogies are suggested. The basic point is the coincidence of the initial- and final-state many-body wave functions in the two types of processes and the formal similarity of the transition operators. First experimental results obtained at the INFN-LNS laboratory for the 40Ca(18O,18Ne)40Ar reaction at 270 MeV give an encouraging indication of the capability of the proposed technique to access relevant quantitative information. The main experimental tools for this project are the K800 Superconducting Cyclotron and the MAGNEX spectrometer. The former is used to accelerate the required high-resolution, low-emittance heavy-ion beams, and the latter is the large-acceptance magnetic spectrometer used to detect the ejectiles. The high-order trajectory reconstruction technique implemented in MAGNEX makes it possible to reach the experimental resolution and sensitivity required for accurate measurement of the DCE cross sections at forward angles. However, the tiny values of these cross sections and the resolution requirements demand beam intensities much larger than those manageable with the present facility. The ongoing upgrade of the INFN-LNS facilities toward this goal is part of the NUMEN project and is discussed in the article.

Top