Sample records for reconstructed process facility

  1. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    NASA Astrophysics Data System (ADS)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image-processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used for active learning: they help students understand how the physical processes unfold and which kinds of observations can be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view setup of four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position plus time) reconstruction of the dynamic scene is obtained by composing several 3D models derived from dense image matching. The final textured 4D model allows a completed experiment to be revisited interactively at any time. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence from the camera facing the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information such as local point displacements and velocities, which can be related to the physical processes and to other observations. All the hardware and software adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.
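    At its core, the DIC step amounts to matching a small template patch around each surface point between consecutive frames. A minimal pure-NumPy sketch of this idea, using normalized cross-correlation over an integer search window (this is illustrative, not the authors' implementation; function and parameter names are hypothetical):

```python
import numpy as np

def track_point(ref, cur, pt, tmpl=5, search=8):
    """Track one surface point from frame `ref` to frame `cur` by
    normalized cross-correlation of a (2*tmpl+1)^2 template patch
    over all integer offsets within +/- `search` pixels."""
    r, c = pt
    t = ref[r - tmpl:r + tmpl + 1, c - tmpl:c + tmpl + 1].astype(float)
    t = t - t.mean()
    best, best_rc = -np.inf, pt
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            p = cur[rr - tmpl:rr + tmpl + 1, cc - tmpl:cc + tmpl + 1].astype(float)
            p = p - p.mean()
            denom = np.sqrt((t * t).sum() * (p * p).sum())
            if denom == 0:
                continue
            score = (t * p).sum() / denom  # normalized cross-correlation
            if score > best:
                best, best_rc = score, (rr, cc)
    return best_rc
```

    Dividing the displacement between matched positions by the inter-frame time then gives the surface velocity at that point.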

  2. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze, and evaluate clinical processes and to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At its heart is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independent of any particular healthcare information system and to execute them in a workflow engine. Furthermore, clinical processes are reconstructed by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to help hospital operators and clinical process managers detect discrepancies between defined and actual clinical processes and to identify the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.

  3. Reconstruction of 3d Objects of Assets and Facilities by Using Benchmark Points

    NASA Astrophysics Data System (ADS)

    Baig, S. U.; Rahman, A. A.

    2013-08-01

    Acquiring and modeling 3D geo-data of building assets and facility objects remains a challenge. A number of methods and technologies are used for this purpose; total stations, GPS, photogrammetry, and terrestrial laser scanning are a few of them. In this paper, points commonly shared by the facades of assets and facilities modeled from point clouds are identified. These points support the modeling process used to reconstruct 3D models of assets and facilities, which are stored for management purposes. The models are segmented along different planes to produce accurate 2D plans. This novel method improves the efficiency and quality of model construction, with the aim of using the models in 3D management projects such as building maintenance or identifying groups of items that need to be replaced or renovated for new services.

  4. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    NASA Astrophysics Data System (ADS)

    Guo, J.; Bücherl, T.; Zou, Y.; Guo, Z.

    2011-09-01

    Investigations of the fast neutron beam geometry of the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane, assuming the area source to be made up of a large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, implemented, and verified with both simulated and measured projection data. The feasibility of improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.
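    Iterative algebraic reconstruction of the kind mentioned here can be illustrated with a Kaczmarz-style sweep: the image estimate is corrected row by row so that each simulated ray sum matches its measured projection. A minimal sketch (not the NECTAR code; the function name and the toy 2×2-pixel system in the usage below are illustrative):

```python
import numpy as np

def art_reconstruct(A, p, n_iter=50, relax=1.0):
    """Algebraic reconstruction technique (Kaczmarz sweeps).
    A : (rays x pixels) system matrix of ray/pixel intersection weights
    p : measured projections, one value per ray
    Returns the reconstructed pixel vector x."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            # project x onto the hyperplane A[i] . x = p[i]
            x += relax * (p[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```

    For a 2×2 image probed by its two row sums and two column sums, the sweeps converge to the image consistent with all four projections.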

  5. 17 CFR 37.406 - Trade reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 1 2014-04-01 2014-04-01 false Trade reconstruction. 37.406 Section 37.406 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION SWAP EXECUTION FACILITIES Monitoring of Trading and Trade Processing § 37.406 Trade reconstruction. The swap execution...

  6. Non-rigid Reconstruction of Casting Process with Temperature Feature

    NASA Astrophysics Data System (ADS)

    Lin, Jinhua; Wang, Yanjie; Li, Xin; Wang, Ying; Wang, Lu

    2017-09-01

    Off-line reconstruction of rigid scenes has made great progress in the past decade; on-line reconstruction of non-rigid scenes, however, remains very challenging. The casting process is a non-rigid reconstruction problem: it is a highly dynamic molding process lacking geometric features. To reconstruct the casting process robustly, an on-line fusion strategy is proposed for its dynamic reconstruction. First, the geometric and flow features of the casting are parameterized as a TSDF (truncated signed distance field), a volumetric block representation; the parameterized casting guarantees real-time tracking and optimal deformation of the casting process. Second, the volume grid's data structure is extended to hold a temperature value, and a temperature interpolation function is built to generate the temperature of each voxel. This data structure allows the temperature of the casting to be tracked dynamically during the deformation stages. Then, sparse RGB features are extracted from the casting scene to search for correspondences between the geometric representation and the depth constraint; the extracted color data guarantees robust tracking of the casting's flowing motion. Finally, the optimal deformation of the target space is cast as a nonlinear regularized variational optimization problem, which yields a smooth and optimal deformation of the casting process. Experimental results show that the proposed method reconstructs the casting process robustly and reduces drift during non-rigid reconstruction.
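    The temperature-extended voxel grid described here can be sketched as a TSDF volume whose voxels also carry a temperature, with trilinear interpolation supplying a temperature at arbitrary points. A hypothetical minimal layout (the paper's actual data structure is not specified; class and method names are illustrative):

```python
import numpy as np

class ThermalVoxelGrid:
    """TSDF volume extended with a per-voxel temperature channel."""
    def __init__(self, shape, voxel_size=1.0):
        self.voxel_size = voxel_size
        self.tsdf = np.ones(shape)   # truncated signed distance per voxel
        self.temp = np.zeros(shape)  # temperature per voxel

    def temperature_at(self, point):
        """Trilinear interpolation of temperature at a world-space point."""
        g = np.asarray(point, float) / self.voxel_size
        i0 = np.floor(g).astype(int)  # base voxel index
        f = g - i0                    # fractional position inside the cell
        acc = 0.0
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    w = ((f[0] if dx else 1 - f[0]) *
                         (f[1] if dy else 1 - f[1]) *
                         (f[2] if dz else 1 - f[2]))
                    acc += w * self.temp[i0[0] + dx, i0[1] + dy, i0[2] + dz]
        return acc
```

    With temperatures stored alongside the signed distances, each fused frame can update both channels, so the deforming surface and its temperature are tracked together.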

  7. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The capabilities of the Spacelab Data Processing Facility (SLDPF) are highlighted. The capturing, quality monitoring, processing, accounting, and forwarding of vital Spacelab data to various user facilities around the world are described.

  8. 40 CFR 60.560 - Applicability and designation of affected facilities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... section in a polypropylene or polyethylene production process is a potential affected facility for both... constructed, modified, or reconstructed and, in some instances, on the type of production process. (i) The... reconstructed after January 10, 1989, regardless of the type of production process being used, is January 10...

  9. 40 CFR 60.560 - Applicability and designation of affected facilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... section in a polypropylene or polyethylene production process is a potential affected facility for both... constructed, modified, or reconstructed and, in some instances, on the type of production process. (i) The... reconstructed after January 10, 1989, regardless of the type of production process being used, is January 10...

  10. Defense Waste Processing Facility Process Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bricker, Jonathan

    2010-11-01

    Jonathan Bricker provides an overview of process enhancements currently underway at the Defense Waste Processing Facility (DWPF) at SRS. These enhancements include melter bubblers, reduced water use, and an alternate reductant.

  11. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  12. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  13. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 3 2011-07-01 2011-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities... emissions from food processing facilities without any accompanying analyses demonstrating that these...

  14. 40 CFR 52.279 - Food processing facilities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... emissions from food processing facilities without any accompanying analyses demonstrating that these... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Food processing facilities. 52.279... (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS California § 52.279 Food processing facilities...

  15. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS demultiplexes, synchronizes, time-tags, quality-checks, accounts for, and formats the data onto tapes. The SOPS further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high-density, experiment-data, input/output, quality-control and accounting, and experiment-channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.

  16. Craniofacial Reconstruction by a Cost-Efficient Template-Based Process Using 3D Printing

    PubMed Central

    Beiglboeck, Fabian; Honigmann, Philipp; Jaquiéry, Claude; Thieringer, Florian

    2017-01-01

    Summary: Craniofacial defects often result in aesthetic and functional deficits, which affect the patient’s psyche and wellbeing. Patient-specific implants remain the optimal solution, but their use is limited or impractical due to their high costs. This article describes a fast and cost-efficient workflow of in-house manufactured patient-specific implants for craniofacial reconstruction and cranioplasty. As a proof of concept, we present a case of reconstruction of a craniofacial defect with involvement of the supraorbital rim. The following hybrid manufacturing process combines additive manufacturing with silicone molding and an intraoperative, manual fabrication process. A computer-aided design template is 3D printed from thermoplastics by a fused deposition modeling 3D printer and then silicone molded manually. After sterilization of the patient-specific mold, it is used intraoperatively to produce an implant from polymethylmethacrylate. Due to the combination of these 2 straightforward processes, the procedure can be kept very simple, and no advanced equipment is needed, resulting in minimal financial expenses. The whole fabrication of the mold is performed within approximately 2 hours depending on the template’s size and volume. This reliable technique is easy to adopt and suitable for every health facility, especially those with limited financial resources in less privileged countries, enabling many more patients to profit from patient-specific treatment. PMID:29263977

  17. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson, and blind deconvolution, this approach is new: for the first time, the coded-aperture processing is made independent of the point spread function of the imaging diagnostic system. In this way, the technical obstacle in traditional coded-pinhole image processing caused by the uncertainty of the system's point spread function is overcome. Building on the theoretical study, simulations of penumbral imaging and image reconstruction were carried out and gave fairly good results. In a visible-light experiment, a point light source irradiated a 5 mm × 5 mm object after diffuse and volume scattering, and penumbral images were acquired with an aperture size of about 20 mm. Finally, the CT image reconstruction technique was applied and produced a fairly good reconstruction result.

  18. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

    The objective of this study is to determine whether a safeguards approach can be developed for “black box” processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be prohibited access to the entire facility. Whether a black box process or facility is safeguardable depends on the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed on the design information that can be provided to the IAEA. This analysis identified the necessary conditions for the safeguardability of black box processes and facilities.

  19. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  20. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  21. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  22. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  23. 15 CFR 923.13 - Energy facility planning process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Energy facility planning process. 923... RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS Uses Subject to Management § 923.13 Energy facility planning process. The management program must contain a planning process for energy facilities...

  24. Laser materials processing facility

    NASA Technical Reports Server (NTRS)

    Haggerty, J. S.

    1982-01-01

    The laser materials processing facility and its capabilities are described. A CO2 laser with continuous wave, repetitive pulse, and shaped power-time cycles is employed. The laser heated crystal growth station was used to produce metal and metal oxide single crystals and for cutting and shaping experiments using Si3N4 to displace diamond shaping processes.

  25. Synthetic Fiber Production Facilities: New Source Performance Standards (NSPS)

    EPA Pesticide Factsheets

    These standards limit emissions of volatile organic compounds (VOC) from new and reconstructed synthetic fiber production facilities that use solvent-spinning processes. Includes rule history and summary.

  26. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  27. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  28. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  29. 40 CFR 63.1158 - Emission standards for new or reconstructed sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Process Facilities and Hydrochloric Acid Regeneration Plants § 63.1158 Emission standards for new or... percent. (b) Hydrochloric acid regeneration plants. (1) No owner or operator of a new or reconstructed...

  30. The Facilities Audit. A Process for Improving Facilities Conditions.

    ERIC Educational Resources Information Center

    Kaiser, Harvey H.

    The problems of deferred maintenance and decaying campus infrastructure have troubled higher education for the past two decades. This book, designed to be a tool for facilities managers, describes a process for inspecting and reporting conditions of buildings and infrastructure. The audit process is meant to be a routine part of maintenance…

  31. Region of interest processing for iterative reconstruction in x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Kopp, Felix K.; Nasirudin, Radin A.; Mei, Kai; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Noël, Peter B.

    2015-03-01

    Recent advancements in graphics card technology have raised the performance of parallel computing and contributed to the introduction of iterative reconstruction methods for x-ray computed tomography in clinical CT scanners. Iterative maximum likelihood (ML) based reconstruction methods are known to reduce image noise and to improve the diagnostic quality of low-dose CT. However, iterative reconstruction of a region of interest (ROI), especially ML based, is challenging, even though for some clinical procedures, such as cardiac CT, only a ROI is needed for diagnosis. A high-resolution reconstruction of the full field of view (FOV) consumes unnecessary computational effort and is slower than clinically acceptable. In this work, we present an extension and evaluation of an existing ROI processing algorithm, in particular improvements to the equalization between regions inside and outside a ROI. The evaluation was done on data collected from a clinical CT scanner, and the performance of the different algorithms was assessed qualitatively and quantitatively. Our solution to the ROI problem increases the signal-to-noise ratio and yields visibly less noise in the final reconstruction, at a reconstruction speed comparable with previously proposed techniques. The development of ROI processing algorithms in combination with iterative reconstruction will provide higher diagnostic quality in the near future.

  32. Liquid Argon TPC Signal Formation, Signal Processing and Hit Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baller, Bruce

    2017-03-11

    This document describes the early stage of the reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions requires knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise.

  33. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain, and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
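    The general form of such a wire-signal deconvolution is a regularized division in the frequency domain: the recorded waveform is divided by the electronics response, with a damping term to keep noise from blowing up where the response is small. A minimal Wiener-style sketch (illustrative only, not the LArSoft implementation; the scalar `noise` constant stands in for a measured noise-to-signal spectrum):

```python
import numpy as np

def deconvolve_wire(signal, response, noise=1e-3):
    """Recover the underlying current waveform from a wire signal,
    given the (shorter) electronics response kernel, via a
    Wiener-regularized FFT deconvolution."""
    n = len(signal)
    S = np.fft.rfft(signal)
    R = np.fft.rfft(response, n)          # zero-pad response to signal length
    filt = np.conj(R) / (np.abs(R) ** 2 + noise)  # regularized inverse filter
    return np.fft.irfft(S * filt, n)
```

    Applied to a waveform built from a sharp current pulse convolved with an exponential response, this recovers the pulse at its original sample position.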

  34. 10 CFR 1016.9 - Processing security facility approval.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Processing security facility approval. 1016.9 Section 1016... § 1016.9 Processing security facility approval. The following receipt of an acceptable request for... granted pursuant to § 1016.6 of this part. ...

  35. 10 CFR 1016.9 - Processing security facility approval.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing security facility approval. 1016.9 Section 1016... § 1016.9 Processing security facility approval. The following receipt of an acceptable request for... granted pursuant to § 1016.6 of this part. ...

  36. [Application of joint reconstruction with autogenous coronoid process graft to treat temporomandibular joint ankylosis].

    PubMed

    Xie, Qing-tiao; Huang, Xuan-ping; Jiang, Xian-fang; Yang, Yuan-yuan; Li, Hua; Lin, Xi

    2013-08-01

    To evaluate the clinical effect of joint reconstruction using an autogenous coronoid process graft to treat temporomandibular joint (TMJ) ankylosis, nine cases of TMJ ankylosis were surgically treated between September 2008 and September 2010 by joint reconstruction with an autogenous coronoid process graft, using the autogenous articular disc or a prosthodontic membrane as interpositional material. Mouth opening, occlusion, and cone beam CT (CBCT) were used for evaluation before and after surgery. Satisfactory mouth opening was achieved in all patients, and none developed occlusal changes or reankylosis during follow-up. CBCT showed that the coronoid process graft reached bony union with the ramus and became rounded. Joint reconstruction with an autogenous coronoid process graft is an effective treatment for TMJ ankylosis.

  37. TEMPUS: A facility for containerless electromagnetic processing onboard spacelab

    NASA Technical Reports Server (NTRS)

    Lenski, H.; Willnecker, R.

    1990-01-01

    The electromagnetic containerless processing facility TEMPUS was recently assigned to a flight on the IML-2 mission. Compared with the TEMPUS facility already flown on a sounding rocket, several improvements had to be implemented, in particular related to safety, resource management, and the ability to process different samples with different requirements in one mission. The basic design of the facility as well as its expected processing capabilities are presented. Two operational aspects turned out to strongly influence the facility design: control of sample motion (first experimental results indicate that crew or ground interaction will be necessary to minimize residual sample motions during processing) and exchange of RF coils (during processing in vacuum, evaporated sample material condenses on the cold surface and may force a coil exchange when a critical thickness is exceeded).

  38. An Application of Business Process Management to Health Care Facilities.

    PubMed

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify the significant elements of their facilities to address, and the steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, together with actions from operations management, are presented to implement it. Managers of health care facilities can find in business process management a more comprehensive approach to improving their facilities than Lean, Six Sigma, business process reengineering, and ad hoc approaches, one that does not conflict with them because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide managers and relieve them from having to choose among these approaches, as well as provide them with specific steps and actions to follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities, with specific steps and actions for implementation.

  39. Integration Process for Payloads in the Fluids and Combustion Facility

    NASA Technical Reports Server (NTRS)

    Free, James M.; Nall, Marsha M.

    2001-01-01

    The Fluids and Combustion Facility (FCF) is an ISS research facility located in the United States Laboratory (US Lab), Destiny. The FCF is a multi-discipline facility that performs microgravity research, primarily in fluid physics and combustion science. The facility remains on orbit and accommodates multi-user and Principal Investigator (PI) unique hardware. The FCF is designed to accommodate 15 PIs per year. To support this number of payloads, the FCF has developed an end-to-end analytical and physical integration process that includes integration tools, products, and interface management throughout the life of the payload. Each payload is given a single point of contact in the facility and works with that interface from PI selection through post-flight processing. The process uses electronic tools to create interface documents and agreements, store payload data, and roll up facility submittals to ISS. Additionally, the process provides integration with, and testing against, flight-like simulators prior to payload delivery to KSC; these simulators allow the payload to be tested in the flight configuration and the final facility interface and science verifications to be performed. The process also supports the payload through the Payload Safety Review Panel (PSRP). Finally, it includes support in developing operational products and in operating the payload on orbit.

  40. 77 FR 823 - Guidance for Fuel Cycle Facility Change Processes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-06

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0262] Guidance for Fuel Cycle Facility Change Processes... Fuel Cycle Facility Change Processes.'' This regulatory guide describes the types of changes for which fuel cycle facility licensees should seek prior approval from the NRC and discusses how licensees can...

  41. Reconstruction of dynamical systems from resampled point processes produced by neuron models

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Pavlov, Alexey N.

    2018-04-01

    The characterization of dynamical features of chaotic oscillations from point processes is based on embedding theorems for non-uniformly sampled signals such as sequences of interspike intervals (ISIs). This theoretical background confirms the possibility of attractor reconstruction from ISIs generated by chaotically driven neuron models. The quality of such a reconstruction depends on the length of the available dataset. We discuss how data resampling improves the reconstruction for short datasets and show that this effect is observed for different types of spike-generation mechanisms.
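    The delay embedding underlying such ISI-based reconstruction can be sketched in a few lines: each reconstructed state vector is a window of consecutive interspike intervals (a minimal illustration of the standard technique; names are ours):

```python
import numpy as np

def embed_isi(isi, dim=3, delay=1):
    """Delay-coordinate embedding of an interspike-interval sequence.
    Returns an (n_vectors x dim) array whose rows are reconstructed
    state vectors [isi[t], isi[t+delay], ..., isi[t+(dim-1)*delay]]."""
    isi = np.asarray(isi, float)
    n = len(isi) - (dim - 1) * delay
    return np.column_stack([isi[i * delay : i * delay + n] for i in range(dim)])
```

    The cloud of rows approximates the attractor of the driving dynamics; resampling the ISI sequence before embedding is the step the paper examines for short datasets.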

  42. A Review of the Aging Process and Facilities Topic.

    PubMed

    Jornitz, Maik W

    2015-01-01

    Aging facilities have become such a concern in the pharmaceutical and biopharmaceutical manufacturing industry that trade organizations have formed task forces to address the topic. Too often, examples of aging or obsolete equipment, unit operations, processes, or entire facilities are encountered. Major contributors to this outcome are the failure to invest in new equipment, disregard for appropriate maintenance activities, and neglect of modern technologies. In some cases, a production process is insufficiently modified to manufacture a new product in an existing process that was used to produce a phased-out product. In other instances, manufacturers expanded the facility or processes to meet increasing demand, and the scaling occurred in a non-uniform manner, which led to non-optimal results. Regulatory hurdles around post-approval process changes may thwart companies' efforts to implement new technologies; some changes, for example, have required four years to gain global approval. This paper addresses cases of aging processes and facilities as well as options for modernization. © PDA, Inc. 2015.

  3. Free energy reconstruction from steered dynamics without post-processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athenes, Manuel, E-mail: Manuel.Athenes@cea.f; Condensed Matter and Materials Division, Physics and Life Sciences Directorate, LLNL, Livermore, CA 94551; Marinica, Mihai-Cosmin

    2010-09-20

    Various methods achieving importance sampling in ensembles of nonequilibrium trajectories enable one to estimate free energy differences and, by maximum-likelihood post-processing, to reconstruct free energy landscapes. Here, based on Bayes' theorem, we propose a more direct method in which a posterior likelihood function is used both to construct the steered dynamics and to infer the contribution to equilibrium of all the sampled states. The method is implemented with two steering schedules. First, using non-autonomous steering, we calculate the migration barrier of the vacancy in α-Fe. Second, using an autonomous scheduling related to metadynamics and equivalent to temperature-accelerated molecular dynamics, we accurately reconstruct the two-dimensional free energy landscape of the 38-atom Lennard-Jones cluster as a function of an orientational bond-order parameter and energy, down to the solid-solid structural transition temperature of the cluster and without maximum-likelihood post-processing.

  4. [Clinical application of artificial condylar process for reconstructing temporomandibular joint].

    PubMed

    Huang, Xiangdao; Shao, Zhanying; Wang, Fasheng; Duan, Yi

    2012-01-01

    To assess the feasibility and clinical outcomes of artificial condylar process in reconstruction of the temporomandibular joint. Between January 2005 and January 2010, reconstructions of the temporomandibular joint with an artificial condylar process were performed in 10 cases (11 sides, including 7 left sides and 4 right sides). There were 7 males and 3 females with an average age of 50 years (range, 40-68 years). Mandibular condyle defects were caused by mandible tumor in 7 patients with a mean disease duration of 15 months (range, 9-24 months) and by bilateral condylar fractures in 3 patients with disease durations of 2, 3, and 2 days, respectively. According to the Neff classification, there were types M and A in 1 case, types M and B in 1 case, and type M in one side with a subcondylar fracture in the other side in 1 case. Incisions in all patients healed by first intention, and no complication occurred. All cases were followed up for 1 to 4 years, showed facial symmetry and good occluding relation, and the mouth opening was 22-38 mm (mean, 30 mm). No temporomandibular joint clicking or pain and no recurrence of tumor were observed. Most of the artificial condylar processes were in good position, except 1 that deviated slightly from the correct angle. All the patients could eat normally. The results of temporomandibular joint reconstruction after tumor resection with an artificial condylar process are good, but the clinical outcome for intracapsular condylar fracture remains to be further verified.

  5. Material Processing Facility - Skylab Experiment M512

    NASA Technical Reports Server (NTRS)

    1972-01-01

    This chart details Skylab's Materials Processing Facility experiment (M512). This facility, located in the Multiple Docking Adapter, was developed for Skylab and accommodated 14 different experiments that were carried out during the three manned missions. The abilities to melt and mix without the contaminating effects of containers, to suppress thermal convection and buoyancy in fluids, and to take advantage of electrostatic and magnetic forces otherwise masked by gravitation opened the way to new knowledge of material properties and processes. This beginning would ultimately lead to the production of valuable new materials for use on Earth.

  6. Insect pest management decisions in food processing facilities

    USDA-ARS?s Scientific Manuscript database

    Pest management decision making in food processing facilities such as flour mills, rice mills, human and pet food manufacturing facilities, distribution centers and warehouses, and retail stores is a challenging undertaking. Insect pest management programs require an understanding of the food facili...

  7. Skylab materials processing facility experiment developer's report

    NASA Technical Reports Server (NTRS)

    Parks, P. G.

    1975-01-01

    The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.

  8. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but also play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to efficiently simulate reconstructed phylogenetic trees when conditioning on the number of species, the time of the process, or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
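
    The i.i.d. property of speciation times is what makes direct simulation cheap. As a sketch, assuming the special case of a constant-rate pure-birth (Yule) process conditioned on tree age t, each speciation time measured backward from the present follows a truncated exponential, so the n-1 times can be drawn by inverse-CDF sampling (parameters are illustrative; this is not the TESS implementation):

```python
import math
import random

def sample_speciation_times(n, t, lam, seed=1):
    """Draw the n-1 i.i.d. speciation times (measured backward from the
    present) of a reconstructed pure-birth tree of age t, by inverse-CDF
    sampling from the truncated exponential density
    lam*exp(-lam*s) / (1 - exp(-lam*t)) on [0, t]."""
    rng = random.Random(seed)
    z = 1.0 - math.exp(-lam * t)
    times = [-math.log(1.0 - rng.random() * z) / lam for _ in range(n - 1)]
    return sorted(times)

times = sample_speciation_times(n=10, t=5.0, lam=0.8)
```

    Because each draw is independent, conditioning on n, on t, or on both only changes how many draws are taken and which density is inverted.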

  9. Exploration, Sampling, And Reconstruction of Free Energy Surfaces with Gaussian Process Regression.

    PubMed

    Mones, Letif; Bernstein, Noam; Csányi, Gábor

    2016-10-11

    Practical free energy reconstruction algorithms involve three separate tasks: biasing, measuring some observable, and finally reconstructing the free energy surface from those measurements. In more than one dimension, adaptive schemes make it possible to explore only relatively low lying regions of the landscape by progressively building up the bias toward the negative of the free energy surface so that free energy barriers are eliminated. Most schemes use the final bias as their best estimate of the free energy surface. We show that large gains in computational efficiency, as measured by the reduction of time to solution, can be obtained by separating the bias used for dynamics from the final free energy reconstruction itself. We find that biasing with metadynamics, measuring a free energy gradient estimator, and reconstructing using Gaussian process regression can give an order of magnitude reduction in computational cost.
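
    A minimal sketch of the final reconstruction step, assuming numpy and a squared-exponential kernel. For brevity it regresses on noisy free-energy values rather than on the gradient estimator used in the paper, and the toy surface F(x) = x² and all numbers are invented:

```python
import numpy as np

def rbf(x1, x2, ell=0.5, sig=1.0):
    """Squared-exponential covariance between two 1-D point sets."""
    d = x1[:, None] - x2[None, :]
    return sig ** 2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = np.linspace(-1.5, 1.5, 12)                   # biased-sampling "measurement" points
y = X ** 2 + 0.05 * rng.standard_normal(X.size)  # noisy toy free energy F(x) = x^2

K = rbf(X, X) + 0.05 ** 2 * np.eye(X.size)       # kernel matrix plus noise variance
Xs = np.linspace(-1.5, 1.5, 101)                 # dense grid for the reconstruction
F_rec = rbf(Xs, X) @ np.linalg.solve(K, y)       # GP posterior mean
```

    The separation the paper advocates is visible even here: the biased dynamics only has to visit the landscape, while the smooth surface comes from the regression.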

  10. SPIDER image processing for single-particle reconstruction of biological macromolecules from electron micrographs

    PubMed Central

    Shaikh, Tanvir R; Gao, Haixiao; Baxter, William T; Asturias, Francisco J; Boisset, Nicolas; Leith, Ardean; Frank, Joachim

    2009-01-01

    This protocol describes the reconstruction of biological molecules from the electron micrographs of single particles. Computation here is performed using the image-processing software SPIDER and can be managed using a graphical user interface, termed the SPIDER Reconstruction Engine. Two approaches are described to obtain an initial reconstruction: random-conical tilt and common lines. Once an existing model is available, reference-based alignment can be used, a procedure that can be iterated. Also described is supervised classification, a method to look for homogeneous subsets when multiple known conformations of the molecule may coexist. PMID:19180078

  11. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  12. Intermediate view reconstruction using adaptive disparity search algorithm for real-time 3D processing

    NASA Astrophysics Data System (ADS)

    Bae, Kyung-hoon; Park, Changhan; Kim, Eun-soo

    2008-03-01

    In this paper, intermediate view reconstruction (IVR) using an adaptive disparity search algorithm (ADSA) is proposed for real-time 3-dimensional (3D) processing. The proposed algorithm can reduce the processing time of disparity estimation by adaptively selecting the disparity search range, and it can also increase the quality of the 3D imaging. That is, by adaptively predicting the mutual correlation between a stereo image pair with the proposed algorithm, the bandwidth of the stereo input image pair can be compressed to the level of a conventional 2D image, and a predicted image can be effectively reconstructed using a reference image and disparity vectors. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm improves the PSNR of a reconstructed image by about 4.8 dB compared with conventional algorithms, and reduces the synthesis time of a reconstructed image to about 7.02 s.

  13. Reconstructing the interaction between dark energy and dark matter using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Guo, Zong-Kuan; Cai, Rong-Gen

    2015-06-01

    We present a nonparametric approach to reconstruct the interaction between dark energy and dark matter directly from SNIa Union 2.1 data using Gaussian processes, a fully Bayesian approach for smoothing data. In this method, once the equation of state (w) of dark energy is specified, the interaction can be reconstructed as a function of redshift. For the decaying vacuum energy case with w = -1, the reconstructed interaction is consistent with the standard ΛCDM model; namely, there is no evidence for the interaction. This also holds for the constant w cases from -0.9 to -1.1 and for the Chevallier-Polarski-Linder (CPL) parametrization case. If the equation of state deviates significantly from -1, the reconstructed interaction is nonzero at the 95% confidence level. This shows the degeneracy between the interaction and the equation of state of dark energy when they are constrained by the observational data.

  14. 76 FR 44049 - Guidance for Fuel Cycle Facility Change Processes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0262] Guidance for Fuel Cycle Facility Change Processes...-issued Draft Regulatory Guide, DG- 3037, ``Guidance for Fuel Cycle Facility Change Processes'' in the...-3037 from August 12, 2011 to September 16, 2011. DG-3037 describes the types of changes for fuel cycle...

  15. Development of digital reconstructed radiography software at new treatment facility for carbon-ion beam scanning of National Institute of Radiological Sciences.

    PubMed

    Mori, Shinichiro; Inaniwa, Taku; Kumagai, Motoki; Kuwae, Tsunekazu; Matsuzaki, Yuka; Furukawa, Takuji; Shirai, Toshiyuki; Noda, Koji

    2012-06-01

    To increase the accuracy of carbon-ion beam scanning therapy, we have developed a graphical user interface-based digitally reconstructed radiograph (DRR) software system for use in routine clinical practice at our center. The DRR software is used in particular scenarios in the new treatment facility to achieve the same geometrical accuracy at treatment as at the imaging session. DRR calculation is implemented simply as the summation of CT image voxel values along the X-ray projection ray. Because we implemented graphics-processing-unit-based computation, the DRR images are calculated with a speed sufficient for clinical practice requirements. Because high-spatial-resolution flat panel detector (FPD) images must be registered to the reference DRR images during patient setup in any scenario, the DRR images also need a spatial resolution close to that of the FPD images. To overcome the limitation imposed by the CT voxel size, we applied image processing to improve the calculated DRR spatial resolution. The DRR software introduced here enabled patient positioning with sufficient accuracy for the implementation of carbon-ion beam scanning therapy at our center.
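
    The ray-sum computation described above reduces, in the simplest parallel-beam approximation, to summing voxel values along one axis. A minimal numpy sketch with an invented toy volume (the actual software sums along diverging X-ray projection rays and runs on the GPU):

```python
import numpy as np

# Toy CT volume, axes (z, y, x); a small dense block embedded in air
ct = np.zeros((4, 5, 6))
ct[1:3, 2:4, 2:5] = 100.0

# Parallel rays along z: one ray sum per (y, x) detector pixel
drr = ct.sum(axis=0)
```

    The resolution limit the abstract mentions follows directly from this picture: the DRR pixel grid inherits the CT voxel grid, which is why extra image processing is needed to approach the FPD resolution.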

  16. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    PubMed

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line use) and for a new reconstruction of past archived data at user's home institution where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.

  17. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    At the Rotation, Processing and Surge Facility (RPSF) at NASA’s Kennedy Space Center in Florida, members of the news media photograph the process as cranes are used to lift one of two pathfinders, or test versions, of solid rocket booster segments for NASA’s Space Launch System rocket. The Ground Systems Development and Operations Program and Jacobs Engineering, on the Test and Operations Support Contract, are preparing the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  18. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  19. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  20. AOF LTAO mode: reconstruction strategy and first test results

    NASA Astrophysics Data System (ADS)

    Oberti, Sylvain; Kolb, Johann; Le Louarn, Miska; La Penna, Paolo; Madec, Pierre-Yves; Neichel, Benoit; Sauvage, Jean-François; Fusco, Thierry; Donaldson, Robert; Soenke, Christian; Suárez Valles, Marcos; Arsenault, Robin

    2016-07-01

    GALACSI is the Adaptive Optics (AO) system serving the instrument MUSE in the framework of the Adaptive Optics Facility (AOF) project. Its Narrow Field Mode (NFM) is a Laser Tomography AO (LTAO) mode delivering high resolution in the visible across a small Field of View (FoV) of 7.5" diameter around the optical axis. From a reconstruction standpoint, GALACSI NFM intends to optimize the correction on axis by estimating the turbulence in volume via a tomographic process, then projecting the turbulence profile onto one single Deformable Mirror (DM) located in the pupil, close to the ground. In this paper, the laser tomographic reconstruction process is described. Several methods (virtual DM, virtual layer projection) are studied, under the constraint of a single matrix vector multiplication. The pseudo-synthetic interaction matrix model and the LTAO reconstructor design are analysed. Moreover, the reconstruction parameter space is explored, in particular the regularization terms. Furthermore, we present here the strategy to define the modal control basis and split the reconstruction between the Low Order (LO) loop and the High Order (HO) loop. Finally, closed loop performance obtained with a 3D turbulence generator will be analysed with respect to the most relevant system parameters to be tuned.
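
    As a hedged sketch of the "single matrix vector multiplication" constraint mentioned above: a reconstructor can be precomputed offline as a regularized least-squares inverse of an interaction matrix, leaving one matrix-vector multiply per frame in the real-time loop. The matrix sizes and Tikhonov weight below are invented and this is not the AOF tomographic pipeline:

```python
import numpy as np

# Sketch: slopes = D @ commands, with D an invented interaction matrix
rng = np.random.default_rng(0)
D = rng.standard_normal((40, 12))     # 40 WFS slopes, 12 DM modes (illustrative)
alpha = 0.1                           # Tikhonov regularization weight

# Offline: regularized least-squares reconstructor R = (D^T D + alpha I)^-1 D^T
R = np.linalg.solve(D.T @ D + alpha * np.eye(12), D.T)

true_cmd = rng.standard_normal(12)
slopes = D @ true_cmd                 # noiseless synthetic measurement
est = R @ slopes                      # real-time step: one mat-vec per frame
```

    Tomographic projection onto a single DM, as described in the abstract, changes what goes into building R, not the per-frame cost.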

  1. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications

    PubMed Central

    Sechopoulos, Ioannis

    2013-01-01

    Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms and computer-aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two-part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. The companion paper, the first part of this review, examines the research relevant to the image acquisition process. This second part reviews the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  2. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    The right-hand aft skirt, one part of the aft booster assembly for NASA’s Space Launch System solid rocket boosters, is in view in a processing cell inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS rocket boosters. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft for deep-space missions and the journey to Mars.

  3. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  4. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes have been developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  5. A high-throughput system for high-quality tomographic reconstruction of large datasets at Diamond Light Source

    PubMed Central

    Atwood, Robert C.; Bodey, Andrew J.; Price, Stephen W. T.; Basham, Mark; Drakopoulos, Michael

    2015-01-01

    Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an ‘orthogonal’ fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and ‘facility-independent’: it can run on standard cluster infrastructure at any institution. PMID:25939626
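
    The "orthogonal" projection/sinogram conversion described above is, at its core, an axis regrouping of one 3-D array. A minimal numpy sketch of the two layouts (shape conventions are assumed for illustration; this is not Savu's actual API):

```python
import numpy as np

# Projection space: one 2-D radiograph per rotation angle
proj = np.zeros((180, 64, 96))      # (angle, detector row, detector column)
proj[7, 3, 5] = 1.0                 # mark one measurement

# Sinogram space regroups by detector row: one (angle x column) slab per
# row, which is the unit a 2-D reconstruction step consumes
sino = np.swapaxes(proj, 0, 1)      # (detector row, angle, detector column)
```

    Because the swap is just a view, a pipeline can hand plugins whichever layout they expect without copying terabyte-scale data.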

  6. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    NASA Astrophysics Data System (ADS)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    With the incorporation of digital beamforming (DBF) processing, azimuth multichannel synthetic aperture radar (SAR) systems show great promise for high-resolution wide-swath imaging, but conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method that first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to acquire the multichannel SAR processing filter. The proposed method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
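
    A toy numpy illustration of the two steps named above, Capon spatial-spectrum estimation followed by definition-based covariance reconstruction, for a generic uniform linear array. The array size, angles, and powers are invented, and this is not the paper's multichannel-SAR geometry:

```python
import numpy as np

def steering(theta, m):
    """Half-wavelength ULA steering vector, unit norm."""
    return np.exp(1j * np.pi * np.arange(m) * np.sin(theta)) / np.sqrt(m)

rng = np.random.default_rng(0)
m = 8
# Sample covariance from snapshots: one strong source at 30 deg plus noise
a_i = steering(np.deg2rad(30.0), m)
sig = 10 * (rng.standard_normal(200) + 1j * rng.standard_normal(200)) / np.sqrt(2)
noise = (rng.standard_normal((m, 200)) + 1j * rng.standard_normal((m, 200))) / np.sqrt(2)
snaps = a_i[:, None] * sig[None, :] + noise
R = snaps @ snaps.conj().T / 200
Rinv = np.linalg.inv(R)

# Capon spatial spectrum over a sector of candidate directions
grid = np.deg2rad(np.linspace(20.0, 40.0, 41))
A = np.stack([steering(t, m) for t in grid], axis=1)            # (m, 41)
p = 1.0 / np.real(np.einsum('im,ij,jm->m', A.conj(), Rinv, A))  # Capon power

# Reconstruct the covariance from its definition: sum p(theta) a a^H + noise
R_rec = (A * p) @ A.conj().T + np.eye(m)
```

    The reconstructed matrix depends only on the estimated spectrum over the sector, which is what makes the resulting filter robust to errors in the nominal array response.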

  7. Reconstructing Space- and Energy-Dependent Exciton Generation in Solution-Processed Inverted Organic Solar Cells.

    PubMed

    Wang, Yuheng; Zhang, Yajie; Lu, Guanghao; Feng, Xiaoshan; Xiao, Tong; Xie, Jing; Liu, Xiaoyan; Ji, Jiahui; Wei, Zhixiang; Bu, Laju

    2018-04-25

    Photon-absorption-induced exciton generation plays an important role in determining the photovoltaic properties of donor/acceptor organic solar cells with an inverted architecture. However, the reconstruction of light harvesting, and thus of exciton generation, at different locations within an inverted organic device is still not well resolved. Here, we investigate the film-depth-dependent light absorption spectra in a small-molecule donor/acceptor film. Including depth-dependent spectra in an optical transfer matrix method allows us to reconstruct both film-depth- and energy-dependent exciton generation profiles, from which the short-circuit current and external quantum efficiency of the inverted device are simulated and compared with experimental measurements. The film-depth-dependent spectroscopy, from which we are able to simultaneously reconstruct the light harvesting profile, the depth-dependent composition distribution, and the vertical energy level variations, provides insight into the photovoltaic process. In combination with appropriate material processing methods and device architecture, the proposed method will help optimize film-depth-dependent optical/electronic properties for high-performance solar cells.
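
    The depth dependence at the heart of the method can be illustrated, in the crudest single-pass approximation, with the Beer-Lambert law. The paper uses a full optical transfer matrix treatment with interference and measured depth-dependent spectra; the coefficients below are invented:

```python
import math

# Single-pass Beer-Lambert sketch: generation rate decays with depth
alpha = 1.0e5                     # absorption coefficient, 1/cm (illustrative)
thickness = 100e-7                # 100 nm active layer, in cm
xs = [i * thickness / 50 for i in range(51)]          # depth grid
gen = [alpha * math.exp(-alpha * x) for x in xs]      # generation per incident photon
absorbed = 1.0 - math.exp(-alpha * thickness)         # total absorbed fraction
```

    In a real inverted device, interference between incident and reflected waves makes the generation profile non-monotonic in depth, which is exactly why the transfer matrix treatment is needed.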

  8. Proposal for a new categorization of aseptic processing facilities based on risk assessment scores.

    PubMed

    Katayama, Hirohito; Toda, Atsushi; Tokunaga, Yuji; Katoh, Shigeo

    2008-01-01

    Risk assessment of aseptic processing facilities was performed using two published risk assessment tools. Calculated risk scores were compared with experimental test results, including environmental monitoring and media fill run results, in three different types of facilities. The two risk assessment tools used gave a generally similar outcome. However, depending on the tool used, variations were observed in the relative scores between the facilities. For the facility yielding the lowest risk scores, the corresponding experimental test results showed no contamination, indicating that these ordinary testing methods are insufficient to evaluate this kind of facility. A conventional facility having acceptable aseptic processing lines gave relatively high risk scores. The facility showing a rather high risk score demonstrated the usefulness of conventional microbiological test methods. Considering the significant gaps observed in calculated risk scores and in the ordinary microbiological test results between advanced and conventional facilities, we propose a facility categorization based on risk assessment. The most important risk factor in aseptic processing is human intervention. When human intervention is eliminated from the process by advanced hardware design, the aseptic processing facility can be classified into a new risk category that is better suited for assuring sterility based on a new set of criteria rather than on currently used microbiological analysis. To fully benefit from advanced technologies, we propose three risk categories for these aseptic facilities.
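
    A scoring-and-categorization scheme of this kind can be sketched as weighted risk factors mapped to category cutoffs. Everything below is invented for illustration; the paper's central point, that human intervention dominates, is reflected only in the relative weights:

```python
# Hypothetical weights: human intervention dominates aseptic-processing risk
WEIGHTS = {"human_intervention": 5, "open_process": 3, "manual_setup": 2}

def risk_score(factors):
    """Sum weighted risk factors (factor name -> count of occurrences)."""
    return sum(WEIGHTS[k] * v for k, v in factors.items())

def category(score):
    """Map a score to one of three risk categories (cutoffs invented)."""
    if score < 5:
        return 1   # advanced: interventions engineered out of the process
    return 2 if score < 15 else 3

s = risk_score({"human_intervention": 2, "open_process": 1, "manual_setup": 0})
```

    Under this toy scheme, a facility with no routine interventions lands in category 1, where the paper argues conventional microbiological testing adds little assurance.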

  9. Disposition of elderly patients after head and neck reconstruction.

    PubMed

    Hatcher, Jeanne L; Bell, Elizabeth Bradford; Browne, J Dale; Waltonen, Joshua D

    2013-11-01

    A patient's needs at discharge, particularly the need for nursing facility placement, may affect hospital length of stay and health care costs. The association between age and disposition after microvascular reconstruction of the head and neck has yet to be reported in the literature. The objective was to determine whether elderly patients are more likely to be discharged to a nursing or other care facility, as opposed to returning home, after microvascular reconstruction of the head and neck. From January 1, 2001, through December 31, 2010, patients undergoing microvascular reconstruction at an academic medical center were identified and their medical records systematically reviewed. During the study period, 457 patients were identified by Current Procedural Terminology codes for microvascular free tissue transfer for a head and neck defect regardless of cause. Seven patients were excluded for inadequate data on the postoperative disposition or American Society of Anesthesiologists (ASA) score, leaving 450 patients for analysis. Demographic and surgical data were collected, including patient age, ASA score, and postoperative length of stay. These variables were then compared between groups of patients discharged to different posthospitalization care facilities. The mean age of participants was 59.1 years. Most patients (n = 386 [85.8%]) were discharged home with or without home health services. The mean age of those discharged home was 57.5 years; discharge to home was the reference for comparison and odds ratio (OR) calculation. For those discharged to a skilled nursing facility, the mean age was 67.1 years (OR, 1.055; P < .001). The mean age of those discharged to a long-term acute care facility was 71.5 years (OR, 1.092; P = .002). Length of stay also affected disposition to a skilled nursing facility (OR, 1.098), as did the ASA score (OR, 2.988). Elderly patients are less likely to be discharged home after free flap reconstruction; age, ASA score, and length of stay are associated with discharge to a facility other than home.
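
    The per-year odds ratios reported above (e.g., OR 1.055 per year of age for skilled nursing facility placement) come from logistic regression, where the OR for a one-unit increase is exp(beta) and compounds multiplicatively over larger gaps. A minimal sketch of that arithmetic; the per-year OR is taken from the abstract, while the 20-year comparison is a hypothetical example:

```python
import math

def compounded_or(per_unit_or: float, units: float) -> float:
    """Compound a per-unit odds ratio over `units` increments
    (exp(beta * units), where beta = log of the per-unit OR)."""
    beta = math.log(per_unit_or)  # the underlying logistic-regression coefficient
    return math.exp(beta * units)

or_snf_per_year = 1.055  # per-year OR for skilled nursing facility (from the abstract)
print(round(compounded_or(or_snf_per_year, 20), 2))  # → 2.92, the OR for a 20-year age gap
```

    Under this reading, a patient 20 years older has roughly three times the odds of skilled nursing facility discharge.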

  10. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    At the Rotation, Processing and Surge Facility (RPSF) at NASA’s Kennedy Space Center in Florida, members of the news media watch as cranes are used to lift one of two pathfinders, or test versions, of solid rocket booster segments for NASA’s Space Launch System rocket. The Ground Systems Development and Operations Program and Jacobs Engineering, on the Test and Operations Support Contract, are preparing the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  11. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage the computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, a secure virtual private network, and subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats that arise when SCADA is integrated with the unified enterprise information system.

  12. High Vacuum Creep Facility in the Materials Processing Laboratory

    NASA Image and Video Library

    1973-01-21

    Technicians at work in the Materials Processing Laboratory’s Creep Facility at the National Aeronautics and Space Administration (NASA) Lewis Research Center. The technicians supported the engineers’ studies of refractory materials, metals, and advanced superalloys. The Materials Processing Laboratory contained laboratories and test areas equipped to prepare and develop these metals and materials. The ultra-high vacuum lab, seen in this photograph, contained creep and tensile test equipment. Creep testing is used to study a material’s ability to withstand constant loads and temperatures over long durations; the equipment measured the strain over a long period of time. Tensile test equipment subjects the test material to strain until the material fails. The two tests were used to determine the strength and durability of different materials. The Materials Processing Laboratory also housed arc and electron beam melting furnaces, a hydraulic vertical extrusion press, compaction and forging equipment, and rolling mills and swagers. There were cryogenic and gas storage facilities and mechanical and oil diffusion vacuum pumps. The facility contained both instrumental and analytical chemistry laboratories for work on radioactive or toxic materials, as well as the only shop in the Midwest for machining toxic materials.

  13. Manufacturing Demonstration Facility: Roll-to-Roll Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datskos, Panos G; Joshi, Pooran C; List III, Frederick Alyious

    The Manufacturing Demonstration Facility (MDF) roll-to-roll processing effort described in this report provided an excellent opportunity to investigate a number of advanced manufacturing approaches to achieve a path toward low-cost devices and sensors. Critical to this effort is the ability to deposit thin films at low temperatures using nanomaterials derived from nanofermentation. The overarching goal of this project was to develop roll-to-roll manufacturing processes for thin film deposition on low-cost flexible substrates for electronics and sensor applications. This project utilized ORNL's unique Pulse Thermal Processing (PTP) technologies coupled with non-vacuum low-temperature deposition techniques, ORNL's clean room facility, slot die coating, drop casting, spin coating, screen printing, and other equipment including a Dimatix ink jet printer and a large-scale Kyocera ink jet printer. The roll-to-roll processing project had three main tasks: 1) develop and demonstrate Zn-based opto-electronic sensors using low-cost nanoparticulate structures manufactured in a related MDF project using nanofermentation techniques, 2) evaluate the use of silver-based conductive inks developed by project partner NovaCentrix for electronic device fabrication, and 3) demonstrate a suite of low-cost printed sensors developed using non-vacuum deposition techniques, which involved the integration of metal and semiconductor layers to establish a diverse sensor platform technology.

  14. Development and Validation of Pathogen Environmental Monitoring Programs for Small Cheese Processing Facilities.

    PubMed

    Beno, Sarah M; Stasiewicz, Matthew J; Andrus, Alexis D; Ralyea, Robert D; Kent, David J; Martin, Nicole H; Wiedmann, Martin; Boor, Kathryn J

    2016-12-01

    Pathogen environmental monitoring programs (EMPs) are essential for food processing facilities of all sizes that produce ready-to-eat food products exposed to the processing environment. We developed, implemented, and evaluated EMPs targeting Listeria spp. and Salmonella in nine small cheese processing facilities, including seven farmstead facilities. Individual EMPs with monthly sample collection protocols were designed specifically for each facility. Salmonella was detected in only one facility, with likely introduction from the adjacent farm indicated by pulsed-field gel electrophoresis data. Listeria spp. were isolated from all nine facilities during routine sampling. The overall Listeria spp. (other than Listeria monocytogenes) and L. monocytogenes prevalences in the 4,430 environmental samples collected were 6.03% and 1.35%, respectively. Molecular characterization and subtyping data suggested persistence of a given Listeria spp. strain in seven facilities and persistence of L. monocytogenes in four facilities. To assess routine sampling plans, validation sampling for Listeria spp. was performed in seven facilities after at least 6 months of routine sampling. This validation sampling was performed by independent individuals and included collection of 50 to 150 samples per facility, based on statistical sample size calculations. Two of the facilities had a significantly higher frequency of detection of Listeria spp. during the validation sampling than during routine sampling, whereas two other facilities had significantly lower frequencies of detection. This study provides a model for a science- and statistics-based approach to developing and validating pathogen EMPs.
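
    The abstract does not specify which "statistical sample size calculations" were used for the validation sampling. One common approach for detection-style environmental sampling is to choose the smallest n such that at least one positive is expected with a given confidence when the true per-sample prevalence is p, i.e. n ≥ ln(1 − confidence)/ln(1 − p). A sketch under that assumption (the formula choice is ours, not the paper's):

```python
import math

def detection_sample_size(prevalence: float, confidence: float = 0.95) -> int:
    """Smallest n such that P(at least one positive sample) >= confidence,
    assuming independent samples, each positive with probability `prevalence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# At the ~6% Listeria spp. prevalence seen during routine sampling, 95%
# confidence of at least one detection needs about 49 samples:
print(detection_sample_size(0.06, 0.95))  # → 49
```

    A result on this order is consistent with the 50-to-150-sample range reported for the validation sampling.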

  15. Evaluation Aspects of Building Structures Reconstructed After a Failure or Catastrophe

    NASA Astrophysics Data System (ADS)

    Krentowski, Janusz R.; Knyziak, Piotr

    2017-10-01

    The article presents the characteristics of several steel structures, among others a modernized industrial dye house, a school sports hall, and a truck repair workshop, that have been rebuilt after a failure or catastrophe. The structures were analyzed in detail, and the evaluation and reconstruction processes are described. The emergencies that occurred during exploitation of the buildings were the result of multiple mistakes: incorrectly defined intervals between inspections, errors during periodic inspections, and incorrect repair work recommendations. The concepts of reinforcement work implemented by the authors, enabling long-term failure-free operation of the structures in the future, are presented. Recommendations for monitoring of the facilities, applied after reinforcement or reconstruction, have been formulated. The methodology for the implementation of specialized investigations, such as geodetic, optical, geological, and chemical surveys and strength tests, both destructive and non-destructive, has been defined. The need to determine limit values of deformations, deflections, damage, and other faults of structural elements and of entire rebuilt facilities, as well as to define conditions for an object’s withdrawal from operation in subsequent exceptional situations, is indicated.

  16. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  17. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  18. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  19. 42 CFR 82.32 - How will NIOSH make changes in scientific elements underlying the dose reconstruction process...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false How will NIOSH make changes in scientific elements underlying the dose reconstruction process, based on scientific progress? 82.32 Section 82.32 Public Health... AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE ENERGY EMPLOYEES...

  20. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media watch as a crane is used to move one of two pathfinders, or test versions, of solid rocket booster segments for NASA’s Space Launch System rocket to a test stand in the Rotation, Processing and Surge Facility at NASA’s Kennedy Space Center in Florida. Inside the RPSF, the Ground Systems Development and Operations Program and Jacobs Engineering, on the Test and Operations Support Contract, will prepare the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  1. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media view forward booster segments (painted green) for NASA’s Space Launch System rocket boosters inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS rocket boosters. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft for deep-space missions and the journey to Mars.

  2. 40 CFR 63.1346 - Standards for new or reconstructed raw material dryers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Standards for new or reconstructed raw... Industry Emission Standards and Operating Limits § 63.1346 Standards for new or reconstructed raw material dryers. (a) New or reconstructed raw material dryers located at facilities that are major sources can not...

  3. Development of experimental facilities for processing metallic crystals in orbit

    NASA Technical Reports Server (NTRS)

    Duncan, Bill J.

    1990-01-01

    This paper discusses the evolution, current status, and planning for facilities to exploit the microgravity environment of earth orbit in applied metallic materials science. Space-Shuttle based facilities and some precursor flight programs are reviewed. Current facility development programs and planned Space Station furnace capabilities are described. The reduced gravity levels available in earth orbit allow the processing of metallic materials without the disturbing influence of gravitationally induced thermal convection, stratification due to density differences in sample components, or the effects of hydrostatic pressure.

  4. Birth-death models and coalescent point processes: the shape and probability of reconstructed phylogenies.

    PubMed

    Lambert, Amaury; Stadler, Tanja

    2013-12-01

    Forward-in-time models of diversification (i.e., speciation and extinction) produce phylogenetic trees that grow "vertically" as time goes by. Pruning the extinct lineages out of such trees leads to natural models for reconstructed trees (i.e., phylogenies of extant species). Alternatively, reconstructed trees can be modelled by coalescent point processes (CPPs), where trees grow "horizontally" by the sequential addition of vertical edges. Each new edge starts at some random speciation time and ends at the present time; speciation times are drawn independently from the same distribution. CPPs lead to extremely fast computation of tree likelihoods and simulation of reconstructed trees. Their topology always follows the uniform distribution on ranked tree shapes (URT). We characterize which forward-in-time models lead to URT reconstructed trees and, among these, which lead to CPP reconstructed trees. We show that for any "asymmetric" diversification model in which speciation rates only depend on time and extinction rates only depend on time and on a non-heritable trait (e.g., age), the reconstructed tree is CPP, even if extant species are incompletely sampled. If rates additionally depend on the number of species, the reconstructed tree is (only) URT (but not CPP). We characterize the common distribution of speciation times in the CPP description, and discuss incomplete species sampling as well as three special model cases in detail: (1) the extinction rate does not depend on a trait; (2) rates do not depend on time; (3) mass extinctions may happen additionally at certain points in the past. Copyright © 2013 Elsevier Inc. All rights reserved.
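
    The "horizontal" construction above means a CPP reconstructed tree of n extant tips is fully specified by n − 1 i.i.d. node depths, so both simulation and likelihood evaluation are linear in n. A minimal sketch; the truncated exponential used as the depth distribution is chosen purely for illustration and is not from the paper:

```python
import math
import random

def simulate_cpp_depths(n_tips: int, rate: float, present: float, rng: random.Random):
    """Draw the n_tips - 1 i.i.d. node depths that define a CPP reconstructed
    tree, each conditioned to be smaller than the tree age `present`
    (inverse-CDF sampling from an exponential truncated to (0, present))."""
    trunc = 1.0 - math.exp(-rate * present)
    return [-math.log(1.0 - rng.random() * trunc) / rate for _ in range(n_tips - 1)]

def cpp_log_likelihood(depths, rate, present):
    """The CPP likelihood factorizes: one truncated-density term per node depth."""
    log_trunc = math.log(1.0 - math.exp(-rate * present))
    return sum(math.log(rate) - rate * d - log_trunc for d in depths)

rng = random.Random(42)
depths = simulate_cpp_depths(10, rate=0.5, present=8.0, rng=rng)
print(len(depths), all(0.0 < d < 8.0 for d in depths))  # → 9 True
```

    Because each depth is drawn independently, the likelihood is a simple product over nodes, which is what makes CPP computations so fast compared with integrating a forward-in-time birth-death process.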

  5. Sustainable data policy for a data production facility: a work in (continual) progress

    NASA Astrophysics Data System (ADS)

    Ketcham, R. A.

    2017-12-01

    The University of Texas High-Resolution X-Ray Computed Tomography Facility (UTCT) has been producing volumetric data and data products of geological and other scientific specimens and engineering materials for over 20 years. Data volumes, both in terms of the size of individual data sets and overall facility production, have progressively grown and fluctuated near the upper boundary of what can be managed by contemporary workstations and lab-scale servers and network infrastructure, making data policy a preoccupation for our entire history. Although all projects have been archived since our first day of operation, policies on which data to keep (raw, reconstructed after corrections, processed) have varied, and been periodically revisited in consideration of the cost of curation and the likelihood of revisiting and reprocessing data when better techniques become available, such as improved artifact corrections or iterative tomographic reconstruction. Advances in instrumentation regularly make old data obsolete and more advantageous to reacquire, but the simple act of getting a sample to a scanning facility is a practical barrier that cannot be overlooked. In our experience, the main times that raw data have been revisited using improved processing to improve image quality were predictable, high-impact charismatic projects (e.g., Archaeopteryx, A. afarensis "Lucy"). These cases actually provided the impetus for development of the new techniques (ring and beam hardening artifact reduction), which were subsequently incorporated into our data processing pipeline going forward but were rarely if ever retroactively applied to earlier data sets. The only other times raw data have been reprocessed were when reconstruction parameters were inappropriate, due to unnoticed sample features or human error, which are usually recognized fairly quickly. The optimal data retention policy thus remains an open question, although erring on the side of caution remains the default.

  6. Nuclear Waste: Defense Waste Processing Facility-Cost, Schedule, and Technical Issues.

    DTIC Science & Technology

    1992-06-17

    gallons of high-level radioactive waste stored in underground tanks at the Savannah River Site. The major facility involved is the Defense Waste Processing Facility (DWPF). As a result of concerns about potential problems with the DWPF and delays in its scheduled start-up, the Chairman of the Environment, Energy, and Natural Resources Subcommittee, House Committee on Government Operations, asked GAO to review the status of the DWPF and other facilities. This report

  7. Meaning Reconstruction Process After Suicide: Life-Story of a Japanese Woman Who Lost Her Son to Suicide.

    PubMed

    Kawashima, Daisuke; Kawano, Kenji

    2017-09-01

    Although Japan has a high suicide rate, there is insufficient research on the experiences of suicide-bereaved individuals. We investigated the qualitative aspects of the meaning reconstruction process after a loss to suicide. We conducted a life-story interview using open-ended questions with one middle-aged Japanese woman who lost her son to suicide. We used a narrative approach to transcribe and code the participant's narratives for analysis. The analysis revealed three meaning groups that structured the participant's reactions to the suicide: making sense of her son's death and life, relationships with other people, and reconstruction of a bond with the deceased. The belief that death is not an eternal split and that there is a connection between the living and the deceased reduced the pain felt by our participant. Furthermore, the narratives worked as scaffolds in the meaning reconstruction process. We discuss our results in the light of cross-cultural differences in the grieving process.

  8. Occupancy mapping and surface reconstruction using local Gaussian processes with Kinect sensors.

    PubMed

    Kim, Soohwan; Kim, Jonghyuk

    2013-10-01

    Although RGB-D sensors have been successfully applied to visual SLAM and surface reconstruction, most of the applications aim at visualization. In this paper, we propose a novel method of building continuous occupancy maps and reconstructing surfaces in a single framework for both navigation and visualization. Particularly, we apply a Bayesian nonparametric approach, Gaussian process classification, to occupancy mapping. However, it suffers from high computational complexity of O(n^3) + O(n^2 m), where n and m are the numbers of training and test data, respectively, limiting its use for large-scale mapping with huge training data, which is common with high-resolution RGB-D sensors. Therefore, we partition both training and test data with a coarse-to-fine clustering method and apply Gaussian processes to each local cluster. In addition, we consider Gaussian processes as implicit functions, and thus extract iso-surfaces from the scalar fields (continuous occupancy maps) using marching cubes. By doing that, we are able to build two types of map representations within a single framework of Gaussian processes. Experimental results with 2-D simulated data show that the accuracy of our approximated method is comparable to previous work, while the computational time is dramatically reduced. We also demonstrate our method with 3-D real data to show its feasibility in large-scale environments.
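
    The core idea above — partition the data into local clusters and fit an independent Gaussian process per cluster so no single O(n^3) solve ever sees all n points — can be sketched in pure Python. This is a deliberately simplified stand-in for the paper's method: GP regression rather than classification, 1-D inputs, and a naive coordinate split instead of coarse-to-fine clustering:

```python
import math

def sq_exp(a, b, ell=0.5):
    """Squared-exponential kernel on 1-D inputs."""
    return math.exp(-(a - b) ** 2 / (2.0 * ell * ell))

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(train, test_x, noise=1e-4):
    """Posterior mean of a zero-mean GP regressor: k_* K^{-1} y."""
    xs = [x for x, _ in train]
    ys = [y for _, y in train]
    K = [[sq_exp(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)  # K^{-1} y, reused for every test point
    return [sum(sq_exp(tx, x) * a for x, a in zip(xs, alpha)) for tx in test_x]

def local_gp_predict(train, test_x, split=0.5):
    """Crude local-GP scheme: one independent GP per side of a coordinate split."""
    out = []
    for tx in test_x:
        cluster = [(x, y) for x, y in train if (x < split) == (tx < split)]
        out.append(gp_predict(cluster, [tx])[0])
    return out

# Noise-free samples of sin(x); each prediction only solves a 5x5 system.
train = [(x / 10.0, math.sin(x / 10.0)) for x in range(10)]
preds = local_gp_predict(train, [0.25, 0.75])
```

    With c clusters of roughly n/c points each, the cubic solve cost drops from O(n^3) to O(c (n/c)^3), which is the source of the speedup reported in the paper.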

  9. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media view the high bay inside the Rotation, Processing and Surge Facility (RPSF) at NASA’s Kennedy Space Center in Florida. Inside the RPSF, engineers and technicians with Jacobs Engineering on the Test and Operations Support Contract, explain the various test stands. In the far corner is one of two pathfinders, or test versions, of solid rocket booster segments for NASA’s Space Launch System rocket. The Ground Systems Development and Operations Program and Jacobs are preparing the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  10. Critical Protection Item classification for a waste processing facility at Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ades, M.J.; Garrett, R.J.

    1993-10-01

    This paper describes the methodology for Critical Protection Item (CPI) classification and its application to the Structures, Systems and Components (SSC) of a waste processing facility at the Savannah River Site (SRS). The WSRC methodology for CPI classification includes the evaluation of the radiological and non-radiological consequences resulting from postulated accidents at the waste processing facility and comparison of these consequences with allowable limits. The types of accidents considered include explosions and fire in the facility and postulated accidents due to natural phenomena, including earthquakes, tornadoes, and high velocity straight winds. The radiological analysis results indicate that CPIs are not required at the waste processing facility to mitigate the consequences of radiological release. The non-radiological analysis, however, shows that the Waste Storage Tank (WST) and the dike spill containment structures around the formic acid tanks in the cold chemical feed area and waste treatment area of the facility should be identified as CPIs. Accident mitigation options are provided and discussed.

  11. 10 CFR 1016.8 - Approval for processing access permittees for security facility approval.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Approval for processing access permittees for security facility approval. 1016.8 Section 1016.8 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Physical Security § 1016.8 Approval for processing access permittees for security facility...

  12. 10 CFR 1016.8 - Approval for processing access permittees for security facility approval.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Approval for processing access permittees for security facility approval. 1016.8 Section 1016.8 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA Physical Security § 1016.8 Approval for processing access permittees for security facility...

  13. Scientific approach and practical experience for reconstruction of waste water treatment plants in Russia

    NASA Astrophysics Data System (ADS)

    Makisha, Nikolay; Gogina, Elena

    2017-11-01

    Protection of water bodies strictly depends on the reliable operation of engineering systems and facilities for water supply and sewage. The majority of these plants and stations were constructed in the 1970s and 1980s in accordance with the rules and regulations of that time, so most of them now require reconstruction due to serious physical and/or technological wear. The current condition of water supply and sewage systems and facilities is frequently a hidden source of serious danger to normal life support and the ecological safety of cities and towns. The article describes the experience obtained and modern approaches to the reconstruction of wastewater and sludge treatment plants that have proved their efficiency even when applied under constraints such as limited area or limited investment. The main directions of reconstruction are: overhaul repair and partial modernization of existing facilities on the basis of the initial design; restoration and modernization of existing systems on the basis of current regulations and their current condition; upgrading the performance of wastewater treatment plants (WWTPs) with modern technologies and methods; and reconstruction of sewage systems and facilities with improvement of treatment quality.

  14. Blob-enhanced reconstruction technique

    NASA Astrophysics Data System (ADS)

    Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso

    2016-09-01

    A method to enhance the quality of the tomographic reconstruction and, consequently, the 3D velocity measurement accuracy is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first-guess intensity distribution is produced with a standard algebraic method; the distribution is then rebuilt as a sum of Gaussian blobs, based on the location, intensity and size of agglomerates of light intensity surrounding local maxima. The blob substitution regularizes the particle shape, allowing a reduction of the particle discretization errors and of their elongation in the depth direction. The performance of the blob-enhanced reconstruction technique (BERT) is assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes have been tested, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as a predictor for the following iterations). The results confirm the enhancement in the velocity measurement accuracy, demonstrating a reduction of the bias error due to ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blob distributions as a predictor enables further improvement of the convergence of the reconstruction algorithm, with the improvement being more considerable when the blobs are substituted more than once during the process. The BERT process is also applied to multi-resolution (MR) CSMART reconstructions, simultaneously achieving remarkable improvements in the flow field measurements and benefiting from the reduction in computational time due to the MR approach. Finally, BERT is also tested on experimental data, obtaining an increase in the accuracy of the velocity measurements.
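
    The blob-substitution step — replace each agglomerate of intensity around a local maximum with a Gaussian blob of matching location and amplitude — can be illustrated in 1-D. This toy stand-in for the paper's 3-D voxel fields uses a fixed blob width for simplicity (the actual method also estimates the agglomerate size):

```python
import math

def find_local_maxima(field, threshold=0.0):
    """Indices of interior samples strictly larger than both neighbours."""
    return [i for i in range(1, len(field) - 1)
            if field[i] > threshold and field[i] > field[i - 1] and field[i] > field[i + 1]]

def blob_enhance(field, sigma=1.0):
    """Rebuild the field as a sum of Gaussian blobs centred on local maxima,
    with each blob's amplitude taken from the peak intensity."""
    rebuilt = [0.0] * len(field)
    for peak in find_local_maxima(field):
        amp = field[peak]
        for i in range(len(field)):
            rebuilt[i] += amp * math.exp(-(i - peak) ** 2 / (2.0 * sigma * sigma))
    return rebuilt

# A lumpy two-particle intensity profile: blob substitution regularizes the shapes.
field = [0.0, 0.1, 0.9, 1.0, 0.8, 0.1, 0.0, 0.2, 0.7, 0.6, 0.1, 0.0]
out = blob_enhance(field)
peaks = find_local_maxima(out)
print(peaks)  # → [3, 8]: the rebuilt field keeps its maxima at the original peaks
```

    The rebuilt field is smooth and symmetric around each peak, which is what suppresses the discretization errors and depth-direction elongation mentioned above.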

  15. Monte Carlo-based fluorescence molecular tomography reconstruction method accelerated by a cluster of graphic processing units.

    PubMed

    Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming

    2011-02-01

    High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and different Green's functions representing the flux distribution in media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes using the calculated Green's functions. Phantom experiments have shown that only 10 min are required to get reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the advantages of high accuracy and suitability for 3-D heterogeneous media with refractive-index-unmatched boundaries from the MC simulation, the GPU cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.
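
    The Jacobian assembly described above follows the standard adjoint pattern for diffuse fluorescence imaging: by reciprocity, the sensitivity of detector d to the fluorochrome at voxel v is the product of the excitation Green's function from the source to v and the emission Green's function from v to d. A schematic sketch of that assembly; the Green's-function values here are made up for illustration (in the paper they come from the MC simulation):

```python
def build_jacobian(g_excitation, g_emission):
    """J[d][v] = G_x(source -> voxel v) * G_m(voxel v -> detector d).

    g_excitation: list over voxels (excitation flux from the source)
    g_emission:   list over detectors, each a list over voxels
    """
    return [[g_emission[d][v] * g_excitation[v]
             for v in range(len(g_excitation))]
            for d in range(len(g_emission))]

# Toy problem: 3 voxels, 2 detectors (values are illustrative, not physical).
g_x = [0.5, 0.2, 0.1]
g_m = [[0.4, 0.3, 0.2],
       [0.1, 0.2, 0.4]]
J = build_jacobian(g_x, g_m)
print([[round(x, 3) for x in row] for row in J])  # → [[0.2, 0.06, 0.02], [0.05, 0.04, 0.04]]
```

    Because each row only needs the Green's functions for one detector, the rows can be computed independently, which is what makes the per-GPU distribution of Green's-function calculations in the cluster effective.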

  16. Reduction of Environmental Listeria Using Gaseous Ozone in a Cheese Processing Facility.

    PubMed

    Eglezos, Sofroni; Dykes, Gary A

    2018-05-01

    A cheese processing facility seeking to reduce environmental Listeria colonization initiated a regime of ozonation across all production areas as an adjunct to its sanitation regimes. A total of 360 environmental samples from the facility were tested for Listeria over a 12-month period. A total of 15 areas before and 15 areas after ozonation were tested. Listeria isolations were significantly (P < 0.001) reduced from 15.0% in the preozonation samples to 1.67% in the postozonation samples in all areas. No deleterious effects of ozonation were noted on the wall paneling, seals, synthetic floors, or cheese processing equipment. The ozonation regime was readily incorporated by sanitation staff into the existing good manufacturing practice program. The application of ozone may result in a significant reduction in the prevalence of Listeria in food processing facilities.
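
    The abstract does not name the significance test behind the reported reduction (15.0% positive before ozonation vs 1.67% after, P < 0.001). A Fisher exact test on the implied 2×2 table is one plausible reconstruction, under the assumption (ours, not the paper's) that the 360 samples split evenly into 180 before and 180 after ozonation:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """P(observing >= a positives in group 1) under the hypergeometric null
    for the 2x2 table [[a, b], [c, d]] with fixed margins."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# 15.0% of 180 pre-ozonation samples = 27 positive; 1.67% of 180 post = 3.
p = fisher_exact_one_sided(27, 153, 3, 177)
print(p < 0.001)  # → True, consistent with the reported significance level
```

    `math.comb` requires Python 3.8+; for larger tables a chi-square test would be the conventional alternative.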

  17. Evaluation of the antipsychotic medication review process at four long-term facilities in Alberta.

    PubMed

    Birney, Arden; Charland, Paola; Cole, Mollie; Aslam Arain, Mubashir

    2016-01-01

    The goal of this evaluation was to understand how four long-term care (LTC) facilities in Alberta have implemented medication reviews for the Appropriate Use of Antipsychotics (AUA) initiative. We aimed to determine how interprofessional (IP) collaboration was incorporated in the antipsychotic medication reviews and how the reviews had been sustained. Four LTC facilities in Alberta participated in this evaluation. We conducted semistructured interviews with 18 facility staff and observed one antipsychotic medication review at each facility. We analyzed data according to the following key components that we identified as relevant to the antipsychotic medication reviews: the structure of the reviews, IP interactions between the staff members, and strategies for sustaining the reviews. The duration of antipsychotic medication reviews ranged from 1 to 1.5 hours. The number of professions in attendance ranged from 3 to 9; a pharmacist led the review at two sites, while a registered nurse led the review at one site and a nurse practitioner at the remaining site. The number of residents discussed during the review ranged from 6 to 20. The process at some facilities was highly IP, demonstrating each of the six IP competencies. Other facilities conducted the review in a less IP manner due to challenges of physician involvement and staff workload, particularly of health care aides. Facilities that had a nurse practitioner on site were more efficient in implementing the recommendations resulting from the medication reviews. The LTC facilities were successful in implementing the medication review process, and the process seemed to be sustainable. A few challenges were observed in the implementation process at two facilities. IP practice moved forward the goals of the AUA initiative to reduce the inappropriate use of antipsychotics.

  18. Dialysis Facility Safety: Processes and Opportunities.

    PubMed

    Garrick, Renee; Morey, Rishikesh

    2015-01-01

Unintentional human errors are the source of most safety breaches in complex, high-risk environments. The environment of dialysis care is extremely complex. Dialysis patients have unique and changing physiology, and the processes required for their routine care involve numerous open-ended interfaces between providers and an assortment of technologically advanced equipment. Communication errors, both within the dialysis facility and during care transitions, and lapses in compliance with policies and procedures are frequent areas of safety risk. Some events, such as air emboli and needle dislodgments, occur infrequently but are serious risks. Other adverse events include medication errors, patient falls, catheter- and access-related infections, access infiltrations, and prolonged bleeding. A robust safety system should evaluate how multiple, sequential errors might align to cause harm. Systems of care can be improved by sharing the results of root cause analyses and "good catches." Failure mode and effects analyses can be used to proactively identify and mitigate the areas of highest risk, and methods drawn from cognitive psychology, simulation training, and human factors engineering can be used to advance facility safety. © 2015 Wiley Periodicals, Inc.

  19. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media view the high bay inside the Rotation, Processing and Surge Facility (RPSF) at NASA’s Kennedy Space Center in Florida. Kerry Chreist, with Jacobs Engineering on the Test and Operations Support Contract, talks with a reporter about the booster segments for NASA’s Space Launch System (SLS) rocket. In the far corner, in the vertical position, is one of two pathfinders, or test versions, of solid rocket booster segments for the SLS rocket. The Ground Systems Development and Operations Program and Jacobs are preparing the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  20. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

Inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida, members of the news media photograph a frustum that will be stacked atop a forward skirt for one of NASA’s Space Launch System (SLS) solid rocket boosters. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS solid rocket boosters. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft on deep-space missions and the journey to Mars.

  1. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida, Jeff Cook, a thermal protection system specialist with Orbital ATK, displays a sample of the painted thermal protection system that is being applied to booster segments. Members of the news media toured the BFF. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS rocket boosters. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft for deep-space missions and the journey to Mars.

  2. Process control and dosimetry in a multipurpose irradiation facility

    NASA Astrophysics Data System (ADS)

    Cabalfin, E. G.; Lanuza, L. G.; Solomon, H. M.

    1999-08-01

    Availability of the multipurpose irradiation facility at the Philippine Nuclear Research Institute has encouraged several local industries to use gamma radiation for sterilization or decontamination of various products. Prior to routine processing, dose distribution studies are undertaken for each product and product geometry. During routine irradiation, dosimeters are placed at the minimum and maximum dose positions of a process load.

  3. Stepwise training for reconstructive microsurgery: the journey to becoming a confident microsurgeon in Singapore.

    PubMed

    Ramachandran, Savitha; Ong, Yee-Siang; Chin, Andrew Yh; Song, In-Chin; Ogden, Bryan; Tan, Bien-Keem

    2014-05-01

    Microsurgery training in Singapore began in 1980 with the opening of the Experimental Surgical Unit. Since then, the unit has continued to grow and has held microsurgical training courses biannually. The road to becoming a full-fledged reconstructive surgeon requires the mastery of both microvascular and flap-raising techniques, along with time, patience, and good training facilities. In Singapore, over the past two decades, we have had the opportunity to develop good training facilities and to refine our surgical education programmes in reconstructive microsurgery. In this article, we share our experience with training in reconstructive microsurgery.

  4. Stepwise Training for Reconstructive Microsurgery: The Journey to Becoming a Confident Microsurgeon in Singapore

    PubMed Central

    Ong, Yee-Siang; Chin, Andrew YH; Song, In-Chin; Ogden, Bryan; Tan, Bien-Keem

    2014-01-01

    Microsurgery training in Singapore began in 1980 with the opening of the Experimental Surgical Unit. Since then, the unit has continued to grow and has held microsurgical training courses biannually. The road to becoming a full-fledged reconstructive surgeon requires the mastery of both microvascular and flap-raising techniques, along with time, patience, and good training facilities. In Singapore, over the past two decades, we have had the opportunity to develop good training facilities and to refine our surgical education programmes in reconstructive microsurgery. In this article, we share our experience with training in reconstructive microsurgery. PMID:24883269

  5. STAR Data Reconstruction at NERSC/Cori, an adaptable Docker container approach for HPC

    NASA Astrophysics Data System (ADS)

    Mustafa, Mustafa; Balewski, Jan; Lauret, Jérôme; Porter, Jefferson; Canon, Shane; Gerhardt, Lisa; Hajdu, Levente; Lukascsyk, Mark

    2017-10-01

    As HPC facilities grow their resources, adaptation of classic HEP/NP workflows becomes a need. Linux containers may well lower the bar to exploiting such resources and, at the same time, help collaborations reach the vast elastic resources of such facilities to address their massive current and future data processing challenges. In this proceeding, we showcase the STAR data reconstruction workflow on the Cori HPC system at NERSC. STAR software is packaged in a Docker image and runs on Cori in Shifter containers. We highlight two of the typical end-to-end optimization challenges for such pipelines: 1) the data transfer rate over ESnet, improved by optimizing the end points, and 2) scalable deployment of a conditions database in an HPC environment. Our tests demonstrate data processing workflows on Cori that are as efficient as those on standard Linux clusters.
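As a rough sketch of the container approach this abstract describes (the image name, tag, and script path below are hypothetical; `shifterimg pull` and the Slurm `--image` directive follow NERSC's documented Shifter usage, though site details may differ):

```shell
# Package the experiment software as a Docker image and publish it
# (the image name "star/reco:sl6" is hypothetical)
docker build -t star/reco:sl6 .
docker push star/reco:sl6

# On the HPC login node, make the image available to Shifter
shifterimg pull star/reco:sl6

# Slurm batch script fragment: run the reconstruction inside a Shifter container
#SBATCH --image=docker:star/reco:sl6
srun shifter /opt/star/bin/reco.sh input.daq   # hypothetical entry point
```

The point of the pattern is that the same Docker image used for development runs unmodified on the HPC system, with Shifter providing the container runtime.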

  6. The Generic Data Capture Facility

    NASA Technical Reports Server (NTRS)

    Connell, Edward B.; Barnes, William P.; Stallings, William H.

    1987-01-01

    The Generic Data Capture Facility, which can provide data capture support for a variety of different types of spacecraft while enabling operations costs to be carefully controlled, is discussed. The data capture functions, data protection, isolation of users from data acquisition problems, data reconstruction, and quality and accounting are addressed. The TDM and packet data formats utilized by the system are described, and the development of generic facilities is considered.

  7. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida, members of the news media view a forward skirt that will be used on a solid rocket booster for NASA’s Space Launch System (SLS) rocket. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS solid rocket boosters. Rick Serfozo, Orbital ATK Florida site director, talks to the media. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft for deep-space missions and the journey to Mars.

  8. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media view the high bay inside the Rotation, Processing and Surge Facility (RPSF) at NASA’s Kennedy Space Center in Florida. Kerry Chreist, with Jacobs Engineering on the Test and Operations Support Contract, explains the various test stands and how they will be used to prepare booster segments for NASA’s Space Launch System (SLS) rocket. In the far corner, in the vertical position, is one of two pathfinders, or test versions, of solid rocket booster segments for the SLS rocket. The Ground Systems Development and Operations Program and Jacobs are preparing the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  9. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Members of the news media watch as two cranes are used to lift one of two pathfinders, or test versions, of solid rocket booster segments for NASA’s Space Launch System (SLS) rocket into the vertical position inside the Rotation, Processing and Surge Facility at NASA’s Kennedy Space Center in Florida. The pathfinder booster segment will be moved to the other end of the RPSF and secured on a test stand. The Ground Systems Development and Operations Program and Jacobs Engineering, on the Test and Operations Support Contract, will prepare the booster segments, which are inert, for a series of lifts, moves and stacking operations to prepare for Exploration Mission-1, deep-space missions and the journey to Mars.

  10. IRVE-II Post-Flight Trajectory Reconstruction

    NASA Technical Reports Server (NTRS)

    O'Keefe, Stephen A.; Bose, David M.

    2010-01-01

    NASA's Inflatable Re-entry Vehicle Experiment (IRVE) II successfully demonstrated an inflatable aerodynamic decelerator after being launched aboard a sounding rocket from Wallops Flight Facility (WFF). Preliminary day-of-flight data compared well with pre-flight Monte Carlo analysis, and a more complete trajectory reconstruction performed with an Extended Kalman Filter (EKF) approach followed. The reconstructed trajectory and comparisons to an attitude solution provided by NASA Sounding Rocket Operations Contract (NSROC) personnel at WFF are presented. Additional comparisons are made between the reconstructed trajectory and pre- and post-flight Monte Carlo trajectory predictions. Alternative observations of the trajectory are summarized which leverage flight accelerometer measurements, the pre-flight aerodynamic database, and on-board flight video. Finally, analysis of the payload separation and aeroshell deployment events is presented. The flight trajectory is reconstructed to a fidelity sufficient to assess overall project objectives related to flight dynamics and, overall, IRVE-II flight dynamics are in line with expectations.

  11. Overview of NORM and activities by a NORM licensed permanent decontamination and waste processing facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirro, G.A.

    1997-02-01

    This paper presents an overview of issues related to handling NORM materials, and provides a description of a facility designed for the processing of NORM contaminated equipment. With regard to handling NORM materials the author discusses sources of NORM, problems, regulations and disposal options, potential hazards, safety equipment, and issues related to personnel protection. For the facility, the author discusses: description of the permanent facility; the operations of the facility; the license it has for handling specific radioactive material; operating and safety procedures; decontamination facilities on site; NORM waste processing capabilities; and offsite NORM services which are available.

  12. Lessons learned from the Siting Process of an Interim Storage Facility in Spain - 12024

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamolla, Meritxell Martell

    2012-07-01

    On 29 December 2009, the Spanish government launched a site selection process to host a centralised interim storage facility for spent fuel and high-level radioactive waste. It was an unprecedented call for voluntarism among Spanish municipalities to site a controversial facility. Two nuclear municipalities, among a total of thirteen municipalities from five different regions, presented their candidatures to host the facility in their territories. For two years the government did not make a decision. Only on 30 November 2011 did the new government, elected on 20 November 2011, officially select a non-nuclear municipality, Villar de Canas, to host this facility. This paper focuses on analysing the factors facilitating and hindering the siting of controversial facilities, in particular the interim storage facility in Spain. It demonstrates that the importance of involving all stakeholders in the decision-making process should not be underestimated. In the case of Spain, all the regional governments with candidate municipalities willing to host the centralised interim storage facility publicly opposed the siting of the facility. (author)

  13. A compound reconstructed prediction model for nonstationary climate processes

    NASA Astrophysics Data System (ADS)

    Wang, Geli; Yang, Peicai

    2005-07-01

    Based on the idea of climate hierarchy and the theory of state-space reconstruction, a local approximation prediction model with a compound structure is built for predicting nonstationary climate processes. Using this model and data sets consisting of north Indian Ocean sea-surface temperature, the Asian zonal circulation index, and monthly mean precipitation anomalies from 37 observation stations in the Inner Mongolia area of China (IMC), a regional prediction experiment for the winter precipitation of IMC is carried out. Using the same-sign ratio R between the prediction field and the actual field to measure prediction accuracy, an average R of 63% over 10 prediction samples is reached.
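The state-space reconstruction underlying such models is usually a time-delay embedding followed by a local (nearest-neighbor) forecast. The sketch below is purely illustrative, with made-up parameter choices, and is not the paper's model:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state space from a scalar series via time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def local_predict(x, dim=3, tau=1, k=5):
    """Zeroth-order local approximation: average the one-step evolution
    of the k nearest embedded neighbors of the current state."""
    states = delay_embed(x, dim, tau)
    query = states[-1]
    # exclude temporally close states so neighbors come from other cycles
    past = states[: -(dim * tau)]
    dists = np.linalg.norm(past - query, axis=1)
    nbrs = np.argsort(dists)[:k]
    # each neighbor row j starts at x[j]; its next observation is
    # one step after the state's last component x[j + (dim-1)*tau]
    targets = x[nbrs + (dim - 1) * tau + 1]
    return targets.mean()

# usage: one-step forecast of a noisy periodic series
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.01 * np.random.default_rng(0).normal(size=t.size)
pred = local_predict(series, dim=3, tau=10, k=5)
```

For a periodic signal the neighbors come from earlier cycles at the same phase, so the forecast tracks the next value of the oscillation.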

  14. Impact of Salt Waste Processing Facility Streams on the Nitric-Glycolic Flowsheet in the Chemical Processing Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, C.

    An evaluation of previous Chemical Processing Cell (CPC) testing was performed to determine whether the planned concurrent, or "coupled," operation of the Defense Waste Processing Facility (DWPF) with the Salt Waste Processing Facility (SWPF) has been adequately covered. Tests with the nitric-glycolic acid flowsheet, both coupled and uncoupled with salt waste streams, included several tests that required extended boiling times. This report provides the evaluation of previous testing and the testing recommendation requested by Savannah River Remediation. The focus of the evaluation was the impact on flammability in CPC vessels (i.e., hydrogen generation rate, SWPF solvent components, antifoam degradation products) and processing impacts (i.e., acid window, melter feed target, rheological properties, antifoam requirements, and chemical composition).

  15. Technology Readiness Assessment of Department of Energy Waste Processing Facilities

    DTIC Science & Technology

    2007-09-11

    Facilities must be reliable, robust, flexible, and durable. EM is piloting the TRA/AD2 process; the Hanford Waste Treatment Plant (WTP) is the initial pilot project. Evaluation: the WTP can only treat ~½ of the LAW in the time it will take to treat all the HLW. There is a need for tank space, which will become more urgent with time, and for a facility before the WTP Pretreatment and High-Level Waste (HLW) Vitrification Facilities are available (requires tank farm pretreatment capability). TRAs

  16. Facility siting as a decision process at the Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wike, L.D.

    1995-12-31

    Site selection for new facilities at the Savannah River Site (SRS) historically has been a process dependent only upon the specific requirements of the facility. While this approach is normally well suited to engineering and operational concerns, it can have serious deficiencies in the modern era of regulatory oversight and compliance requirements. Many issues related to the site selection for a facility are not directly related to engineering or operational requirements; such environmental concerns can cause large schedule delays and budget impacts, thereby slowing or stopping the progress of a project. Some of the many concerns in locating a facility include: waste site avoidance, National Environmental Policy Act requirements, the Clean Water Act, the Clean Air Act, wetlands conservation, US Army Corps of Engineers considerations, US Fish and Wildlife Service statutes including threatened and endangered species issues, and State of South Carolina regulations, especially those of the Department of Health and Environmental Control. In addition, there are SRS restrictions on research areas set aside for the National Environmental Research Park (NERP), Savannah River Ecology Laboratory, Savannah River Forest Station, University of South Carolina Institute of Archaeology and Anthropology, Southeastern Forest Experimental Station, and Savannah River Technology Center (SRTC) programs. As with facility operational needs, not all of these siting considerations have equal importance. The purpose of this document is to review recent site selection exercises conducted for a variety of proposed facilities, develop the logic and basis for the methods employed, and standardize the process and terminology for future site selection efforts.

  17. 18 CFR 153.13 - Emergency reconstruction.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    … loss of gas supply or capacity are applicable to facilities subject to section 3 of the Natural Gas Act. 18 CFR 153.13, Conservation of Power and Water Resources, Federal Energy Regulatory Commission: Emergency reconstruction.

  18. SRB Processing Facilities Media Event

    NASA Image and Video Library

    2016-03-01

    Inside the Booster Fabrication Facility (BFF) at NASA’s Kennedy Space Center in Florida, members of the news media view the right-hand aft skirt that will be used on a solid rocket booster for NASA’s Space Launch System (SLS) rocket. Orbital ATK is a contractor for NASA’s Marshall Space Flight Center in Alabama, and operates the BFF to prepare aft booster segments and hardware for the SLS solid rocket boosters. At far right, in the royal blue shirt, Rick Serfozo, Orbital ATK Florida site director, talks to the media. The SLS rocket and Orion spacecraft will launch on Exploration Mission-1 in 2018. The Ground Systems Development and Operations Program is preparing the infrastructure to process and launch spacecraft for deep-space missions and the journey to Mars.

  19. NASA Construction of Facilities Validation Processes - Total Building Commissioning (TBCx)

    NASA Technical Reports Server (NTRS)

    Hoover, Jay C.

    2004-01-01

    Key attributes include: a Total Quality Management (TQM) system that looks at all phases of a project; a team process that spans boundaries; a Commissioning Authority to lead the process; commissioning requirements in contracts; independent design review to verify compliance with the Facility Project Requirements (FPR); a formal written commissioning plan with documented results; and functional performance testing (FPT) against the requirements document.

  20. Materials, Processes, and Facile Manufacturing for Bioresorbable Electronics: A Review.

    PubMed

    Yu, Xiaowei; Shou, Wan; Mahajan, Bikram K; Huang, Xian; Pan, Heng

    2018-05-07

    Bioresorbable electronics refer to a new class of advanced electronics that can completely dissolve or disintegrate into environmentally and biologically benign byproducts in water and biofluids. They offer a solution to the growing electronic-waste problem, with applications in temporary-use electronics such as implantable devices and environmental sensors. Bioresorbable materials such as biodegradable polymers, dissolvable conductors, semiconductors, and dielectrics are extensively studied, enabling massive progress in bioresorbable electronic devices. Processing and patterning of these materials have so far relied predominantly on vacuum-based fabrication methods. For the purpose of commercialization, however, nonvacuum, low-cost, and facile manufacturing/printing approaches are needed. Bioresorbable electronic materials are generally more chemically reactive than conventional electronic materials, which requires particular attention in developing low-cost manufacturing processes in ambient environments. This review focuses on material reactivity, ink availability, printability, and process compatibility for facile manufacturing of bioresorbable electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the quality assurance of processed telemetry data from Spacelab and/or Attached Shuttle Payloads. Two expert systems, for the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  2. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    NASA Astrophysics Data System (ADS)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even though simple in theory, many details are difficult for a practitioner to master, so that implementing a 3D reconstruction pipeline is generally considered a complex task. For instance, camera calibration, reliable stereo feature matching, and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely open-source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale), so that no delicate calibration has to be performed in the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure, so that an accurate 3D point cloud can be computed from each stereo pair; we rely on the well-consolidated OpenCV library both for image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, applied to both the disparity map and the produced point cloud, is implemented to remove the vast majority of the erroneous points that naturally arise when analyzing the optically complex water surface (examples are sun glares, large white-capped areas, fog and water aerosol, etc.).
Developed to be as fast as possible, WASS
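The dense-stereo step at the heart of such pipelines matches each pixel of the rectified left image against horizontally shifted candidates in the right image. The toy matcher below (plain NumPy, illustrative only, not WASS code; WASS itself uses OpenCV) shows the idea with a sum-of-absolute-differences cost and winner-take-all disparity selection:

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, win=3):
    """Toy dense stereo: for each left-image pixel, pick the horizontal
    shift into the right image minimizing the windowed SAD cost."""
    h, w = left.shape
    pad = win // 2
    L = np.pad(left.astype(float), pad, mode="edge")
    R = np.pad(right.astype(float), pad, mode="edge")
    costs = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # align right-image pixel (x - d) with left-image pixel x
        diff = np.abs(L - np.roll(R, d, axis=1))
        # accumulate the absolute differences over a win x win window
        sad = np.zeros((h, w))
        for dy in range(win):
            for dx in range(win):
                sad += diff[dy:dy + h, dx:dx + w]
        costs[d] = sad
    return costs.argmin(axis=0)  # winner-take-all disparity map

# usage: a synthetic pair where the right view is the left shifted by 4 px
rng = np.random.default_rng(1)
left = rng.random((32, 48))
right = np.roll(left, -4, axis=1)
disp = sad_disparity(left, right, max_disp=8, win=5)  # ~4 away from borders
```

Real pipelines add rectification, sub-pixel refinement, and the outlier filtering the abstract describes; this sketch only shows the matching cost itself.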

  3. Access to breast reconstruction after mastectomy and patient perspectives on reconstruction decision making.

    PubMed

    Morrow, Monica; Li, Yun; Alderman, Amy K; Jagsi, Reshma; Hamilton, Ann S; Graff, John J; Hawley, Sarah T; Katz, Steven J

    2014-10-01

    Most women undergoing mastectomy for breast cancer do not undergo breast reconstruction. To examine correlates of breast reconstruction after mastectomy and to determine if a significant unmet need for reconstruction exists. We used Surveillance, Epidemiology, and End Results registries from Los Angeles, California, and Detroit, Michigan, for rapid case ascertainment to identify a sample of women aged 20 to 79 years diagnosed as having ductal carcinoma in situ or stages I to III invasive breast cancer. Black and Latina women were oversampled to ensure adequate representation of racial/ethnic minorities. Eligible participants were able to complete a survey in English or Spanish. Of 3252 women sent the initial survey a median of 9 months after diagnosis, 2290 completed it. Those who remained disease free were surveyed 4 years later to determine the frequency of immediate and delayed reconstruction and patient attitudes toward the procedure; 1536 completed the follow-up survey. The 485 who remained disease free at follow-up underwent analysis. Disease-free survival of breast cancer. Breast reconstruction at any time after mastectomy and patient satisfaction with different aspects of the reconstruction decision-making process. Response rates in the initial and follow-up surveys were 73.1% and 67.7%, respectively (overall, 49.4%). Of 485 patients reporting mastectomy at the initial survey and remaining disease free, 24.8% underwent immediate and 16.8% underwent delayed reconstruction (total, 41.6%). Factors significantly associated with not undergoing reconstruction were black race (adjusted odds ratio [AOR], 2.16 [95% CI, 1.11-4.20]; P = .004), lower educational level (AOR, 4.49 [95% CI, 2.31-8.72]; P < .001), increased age (AOR in 10-year increments, 2.53 [95% CI, 1.77-3.61]; P < .001), major comorbidity (AOR, 2.27 [95% CI, 1.01-5.11]; P = .048), and chemotherapy (AOR, 1.82 [95% CI, 0.99-3.31]; P = .05). Only 13.3% of women were dissatisfied with the

  4. Analysis of an Optimized MLOS Tomographic Reconstruction Algorithm and Comparison to the MART Reconstruction Algorithm

    NASA Astrophysics Data System (ADS)

    La Foy, Roderick; Vlachos, Pavlos

    2011-11-01

    An optimally designed MLOS tomographic reconstruction algorithm for use in 3D PIV and PTV applications is analyzed. Using a set of optimized reconstruction parameters, the reconstructions produced by the MLOS algorithm are shown to be comparable to those produced by the MART algorithm for a range of camera geometries, camera numbers, and particle seeding densities. The resulting velocity-field error, calculated using PIV and PTV algorithms, is further minimized by applying both pre- and post-processing to the reconstructed data sets.
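For context, MART (multiplicative algebraic reconstruction technique) updates each voxel multiplicatively along every camera ray, which keeps intensities non-negative. A minimal sketch on a toy linear system (illustrative only, not the authors' implementation):

```python
import numpy as np

def mart(A, b, n_iter=50, relax=1.0):
    """MART: for each ray i, scale the voxels it touches by the ratio of
    the measured projection b[i] to the current projection A[i] @ x,
    raised to the (relaxed) ray weights. x stays non-negative throughout."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            proj = A[i] @ x
            if proj > 0:
                x *= (b[i] / proj) ** (relax * A[i])
    return x

# toy system: 3 "rays" through 3 "voxels"
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([2.0, 1.0, 3.0])
b = A @ x_true
x_rec = mart(A, b)  # converges toward x_true on this consistent system
```

Tomographic PIV applies the same update with A built from the camera line-of-sight weights; MLOS differs in using the (faster) multiplied line-of-sight back-projection as the reconstruction.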

  5. Evaluation of mercury in the liquid waste processing facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Vijay; Shah, Hasmukh; Occhipinti, John E.

    2015-08-13

    This report provides a summary of Phase I activities conducted to support an Integrated Evaluation of Mercury in Liquid Waste System (LWS) Processing Facilities. Phase I activities included a review and assessment of the liquid waste inventory and the chemical processing behavior of mercury, using a system-by-system review methodology. Gaps in the understanding of mercury behavior, as well as action items from the structured reviews, are being tracked; 64% of the gaps and actions have been resolved.

  6. Cutting the Cost of New Community College Facilities: Streamlining the Facilities Approval Process. Commission on Innovation Policy Discussion Paper Number 3.

    ERIC Educational Resources Information Center

    BW Associates, Berkeley, CA.

    Intended to provide background information and preliminary options for the California Community Colleges' Commission on Innovation, this document proposes that approval processes for new facilities be simplified and that restrictions on the lease or purchase of off-campus facilities be eased. Following introductory materials detailing the…

  7. Building Information Modeling (BIM) Primer. Report 1: Facility Life-Cycle Process and Technology Innovation

    DTIC Science & Technology

    2012-08-01

    Building Information Modeling (BIM) Primer. Report 1: Facility Life-cycle Process and Technology Innovation. ERDC/ITL TR-12-2, August 2012. Distribution is unlimited. BIM is used to enhance the quality of projects through the design, construction, and handover phases.

  8. Reconstruction of p̄p events in PANDA

    NASA Astrophysics Data System (ADS)

    Spataro, S.

    2012-08-01

    The PANDA experiment will study antiproton-proton and antiproton-nucleus collisions in the HESR complex of the FAIR facility, in a beam momentum range from 2 GeV/c up to 15 GeV/c. In preparation for the experiment, a software framework based on ROOT (PandaRoot) is being developed for the simulation, reconstruction, and analysis of physics events, running also on a GRID infrastructure. Detailed geometry descriptions and different realistic reconstruction algorithms are implemented and are currently used for the preparation of the Technical Design Reports. This contribution reports on the reconstruction capabilities of the PANDA spectrometer, focusing mainly on the performance of the tracking system and on results for the analysis of physics benchmark channels.

  9. A Web simulation of medical image reconstruction and processing as an educational tool.

    PubMed

    Papamichail, Dimitrios; Pantelis, Evaggelos; Papagiannis, Panagiotis; Karaiskos, Pantelis; Georgiou, Evangelos

    2015-02-01

    Web educational resources integrating interactive simulation tools provide students with an in-depth understanding of the medical imaging process. The aim of this work was the development of a purely Web-based, open access, interactive application, as an ancillary learning tool in graduate and postgraduate medical imaging education, including a systematic evaluation of learning effectiveness. The pedagogic content of the educational Web portal was designed to cover the basic concepts of medical imaging reconstruction and processing, through the use of active learning and motivation, including learning simulations that closely resemble actual tomographic imaging systems. The user can implement image reconstruction and processing algorithms under a single user interface and manipulate various factors to understand the impact on image appearance. A questionnaire for pre- and post-training self-assessment was developed and integrated in the online application. The developed Web-based educational application introduces the trainee in the basic concepts of imaging through textual and graphical information and proceeds with a learning-by-doing approach. Trainees are encouraged to participate in a pre- and post-training questionnaire to assess their knowledge gain. An initial feedback from a group of graduate medical students showed that the developed course was considered as effective and well structured. An e-learning application on medical imaging integrating interactive simulation tools was developed and assessed in our institution.

  10. Image reconstruction: an overview for clinicians.

    PubMed

    Hansen, Michael S; Kellman, Peter

    2015-03-01

    Image reconstruction plays a critical role in the clinical use of magnetic resonance imaging (MRI). The MRI raw data is not acquired in image space and the role of the image reconstruction process is to transform the acquired raw data into images that can be interpreted clinically. This process involves multiple signal processing steps that each have an impact on the image quality. This review explains the basic terminology used for describing and quantifying image quality in terms of signal-to-noise ratio and point spread function. In this context, several commonly used image reconstruction components are discussed. The image reconstruction components covered include noise prewhitening for phased array data acquisition, interpolation needed to reconstruct square pixels, raw data filtering for reducing Gibbs ringing artifacts, Fourier transforms connecting the raw data with image space, and phased array coil combination. The treatment of phased array coils includes a general explanation of parallel imaging as a coil combination technique. The review is aimed at readers with no signal processing experience and should enable them to understand what role basic image reconstruction steps play in the formation of clinical images and how the resulting image quality is described. © 2014 Wiley Periodicals, Inc.
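
The Fourier-transform step the review describes, connecting raw k-space data with image space, can be sketched as follows. This is a minimal illustration assuming fully sampled Cartesian k-space (not the authors' code); the `fftshift`/`ifftshift` calls handle the usual centering convention.

```python
import numpy as np

def reconstruct(kspace):
    """Inverse 2D FFT with the usual fftshift bookkeeping."""
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace)))

# Round trip on a synthetic "phantom": image -> k-space -> image.
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
kspace = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(phantom)))
image = reconstruct(kspace)
assert np.allclose(np.abs(image), phantom, atol=1e-10)
```

In a real reconstruction chain this transform would sit after noise prewhitening and raw-data filtering and before coil combination, as listed in the abstract.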

  11. Unity connecting module in the Space Station Processing Facility

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Unity connecting module, part of the International Space Station, awaits processing in the Space Station Processing Facility (SSPF). On the end at the right can be seen the Pressurized Mating Adapter 2, which provides entry into the module. The Unity, scheduled to be launched on STS-88 in December 1998, will be mated to the Russian-built Zarya control module which will already be in orbit. STS-88 will be the first Space Shuttle launch for the International Space Station.

  12. AERIAL SHOWING COMPLETED REMOTE ANALYTICAL FACILITY (CPP627) ADJOINING FUEL PROCESSING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    AERIAL SHOWING COMPLETED REMOTE ANALYTICAL FACILITY (CPP-627) ADJOINING FUEL PROCESSING BUILDING AND EXCAVATION FOR HOT PILOT PLANT TO RIGHT (CPP-640). INL PHOTO NUMBER NRTS-60-1221. J. Anderson, Photographer, 3/22/1960 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  13. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    PubMed

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
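
The variance test described above can be sketched in a few lines. This is a simplified serial illustration, not the authors' GPU code, and the threshold value is a hypothetical choice: a focus point sees nearly the same surface intensity in every elemental image, while an off-focus point mixes unrelated intensities and therefore shows high variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples of one reconstructed 3D point across 25 elemental images.
focus_samples = 0.8 + 0.01 * rng.standard_normal(25)      # consistent views
off_focus_samples = rng.uniform(0.0, 1.0, 25)             # unrelated views

def classify(samples, threshold=0.01):
    """Label a point 'focus' when its sample variance is below threshold."""
    return "focus" if np.var(samples) < threshold else "off-focus"

assert classify(focus_samples) == "focus"
assert classify(off_focus_samples) == "off-focus"
```

On a GPU the same variance computation would run in parallel over all pixels of all depth images, which is where the reported speedup comes from.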

  14. Quantitative fractography by digital image processing: NIH Image macro tools for stereo pair analysis and 3-D reconstruction.

    PubMed

    Hein, L R

    2001-10-01

    A set of NIH Image macro programs was developed to make qualitative and quantitative analyses from digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, for the quantitative approach, surface area and roughness calculations. Limitations on time processing, scanning techniques and programming concepts are also discussed.

  15. Process auditing in long term care facilities.

    PubMed

    Hewitt, S M; LeSage, J; Roberts, K L; Ellor, J R

    1985-01-01

    The ECC tool development and audit experiences indicated that there is promise in developing a process audit tool to monitor quality of care in nursing homes; moreover, the tool selected required only one hour per resident. Focusing on the care process and resident needs provided useful information for care providers at the unit level as well as for administrative personnel. Besides incorporating a more interdisciplinary focus, the revised tool needs to define the support services most appropriate for nursing homes, include items related to discharge planning, and increase measurement of significant others' involvement in the care process. Future emphasis at the ECC will focus on developing intervention plans to maintain strengths and correct deficiencies identified in the audits. Various strategies to bring about desired changes in the quality of care will be evaluated through regular, periodic monitoring. Having a valid and reliable measure of quality of care as a tool will be an important step forward for LTC facilities.

  16. How work context affects operating room processes: using data mining and computer simulation to analyze facility and process design.

    PubMed

    Baumgart, André; Denz, Christof; Bender, Hans-Joachim; Schleppers, Alexander

    2009-01-01

    The complexity of the operating room (OR) requires that both structural (eg, department layout) and behavioral (eg, staff interactions) patterns of work be considered when developing quality improvement strategies. In our study, we investigated how these contextual factors influence outpatient OR processes and the quality of care delivered. The study setting was a German university-affiliated hospital performing approximately 6000 outpatient surgeries annually. During the 3-year-study period, the hospital significantly changed its outpatient OR facility layout from a decentralized (ie, ORs in adjacent areas of the building) to a centralized (ie, ORs in immediate vicinity of each other) design. To study the impact of the facility change on OR processes, we used a mixed methods approach, including process analysis, process modeling, and social network analysis of staff interactions. The change in facility layout was seen to influence OR processes in ways that could substantially affect patient outcomes. For example, we found a potential for more errors during handovers in the new centralized design due to greater interdependency between tasks and staff. Utilization of the mixed methods approach in our analysis, as compared with that of a single assessment method, enabled a deeper understanding of the OR work context and its influence on outpatient OR processes.

  17. Real-time implementing wavefront reconstruction for adaptive optics

    NASA Astrophysics Data System (ADS)

    Wang, Caixia; Li, Mei; Wang, Chunhong; Zhou, Luchun; Jiang, Wenhan

    2004-12-01

    The capability of real-time wavefront reconstruction is important for an adaptive optics (AO) system. The bandwidth of the system and the real-time processing ability of the wavefront processor are mainly limited by the speed of calculation. The system requires a large number of subapertures and a high sampling frequency to compensate for atmospheric turbulence, so the number of reconstruction operations increases accordingly. Since the performance of an AO system improves as calculation latency decreases, it is necessary to study how to increase the speed of wavefront reconstruction. There are two methods to improve the real-time performance of the reconstruction. One is to transform the wavefront reconstruction matrix, for example by wavelets or the FFT. The other is to enhance the performance of the processing element. Analysis shows that the former method cuts latency at the cost of reconstruction precision. In this article, the latter method is adopted. Based on the characteristics of the wavefront reconstruction algorithm, a systolic array implemented in an FPGA is designed to perform real-time wavefront reconstruction. The system delay is greatly reduced by the use of pipelining and parallel processing. The minimum latency of reconstruction is the reconstruction calculation of one subaperture.
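
The computation being accelerated is, at its core, a matrix-vector product. The sketch below shows one common formulation (modal reconstruction with a precomputed pseudoinverse; an assumption for illustration, not necessarily the paper's exact algorithm): measured subaperture slopes s relate to mode coefficients c through a calibrated interaction matrix A, and the real-time step is a single mat-vec with the reconstruction matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subaps, n_modes = 32, 8
A = rng.standard_normal((n_subaps, n_modes))   # interaction matrix (calibrated offline)
R = np.linalg.pinv(A)                          # reconstruction matrix (precomputed)

c_true = rng.standard_normal(n_modes)          # "true" mode coefficients
s = A @ c_true                                 # noiseless slope measurements
c_hat = R @ s                                  # real-time step: one matrix-vector product
assert np.allclose(c_hat, c_true, atol=1e-8)
```

It is this per-frame mat-vec that a pipelined systolic array can stream through FPGA multipliers, one subaperture at a time.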

  18. Simulation of mass storage systems operating in a large data processing facility

    NASA Technical Reports Server (NTRS)

    Holmes, R.

    1972-01-01

    A mass storage simulation program was written to aid system designers in the design of a data processing facility. It acts as a tool for measuring the overall effect on the facility of on-line mass storage systems, and it provides the means of measuring and comparing the performance of competing mass storage systems. The performance of the simulation program is demonstrated.

  19. "Proprietary Processed" Allografts: Clinical Outcomes and Biomechanical Properties in Anterior Cruciate Ligament Reconstruction.

    PubMed

    Roberson, Troy A; Abildgaard, Jeffrey T; Wyland, Douglas J; Siffri, Paul C; Geary, Stephen P; Hawkins, Richard J; Tokish, John M

    2017-11-01

    The processing of allograft tissues in anterior cruciate ligament (ACL) reconstruction continues to be controversial. While high-dose irradiation of grafts has received scrutiny for high failure rates, lower dose irradiation and "proprietary-based" nonirradiated sterilization techniques have become increasingly popular, with little in the literature to evaluate their outcomes. Recent studies have suggested that the specifics of allograft processing techniques may be a risk factor for higher failure rates. To assess these proprietary processes and their clinical outcomes and biomechanical properties. Systematic review. A systematic review was performed using searches of PubMed, EMBASE, Google Scholar, and Cochrane databases. English-language studies were identified with the following search terms: "allograft ACL reconstruction" (title/abstract), "novel allograft processing" (title/abstract), "allograft anterior cruciate ligament" (title/abstract), "anterior cruciate ligament allograft processing" (title/abstract), or "biomechanical properties anterior cruciate ligament allograft" (title/abstract). Duplicate studies, studies not providing the allograft processing technique, and those not containing the outcomes of interest were excluded. Outcomes of interest included outcome scores, complication and failure rates, and biomechanical properties of the processed allografts. Twenty-four studies (13 clinical, 11 biomechanical) met inclusion criteria for review. No demonstrable difference in patient-reported outcomes was appreciated between the processing techniques, with the exception of the Tutoplast process. The clinical failure rate of the Tutoplast process was unacceptably high (45% at 6 years), but no other difference was found between other processing techniques (BioCleanse: 5.4%; AlloTrue: 5.7%; MTF: 6.7%). Several studies did show an increased failure rate, but these studies either combined processing techniques or failed to delineate enough detail to allow a

  20. Bit error rate performance of Image Processing Facility high density tape recorders

    NASA Technical Reports Server (NTRS)

    Heffner, P.

    1981-01-01

    The Image Processing Facility at the NASA/Goddard Space Flight Center uses High Density Tape Recorders (HDTR's) to transfer high volume image data and ancillary information from one system to another. For ancillary information, it is required that very low bit error rates (BER's) accompany the transfers. The facility processes about 10^11 bits of image data per day from many sensors, involving 15 independent processing systems requiring the use of HDTR's. When acquired, the 16 HDTR's offered state-of-the-art performance of 1 x 10^-6 BER as specified. The BER requirement was later upgraded in two steps: (1) incorporating data randomizing circuitry to yield a BER of 2 x 10^-7 and (2) further modifying to include a bit error correction capability to attain a BER of 2 x 10^-9. The total improvement factor was 500 to 1. Attention is given here to the background, technical approach, and final results of these modifications. Also discussed are the format of the data recorded by the HDTR, the magnetic tape format, the magnetic tape dropout characteristics as experienced in the Image Processing Facility, the head life history, and the reliability of the HDTR's.

  1. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device... regenerator under paragraph (b) of this section which commences construction, reconstruction, or modification...

  2. 40 CFR 60.100 - Applicability, designation of affected facility, and reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... petroleum refineries: fluid catalytic cracking unit catalyst regenerators, fuel gas combustion devices, and... petroleum refinery. (b) Any fluid catalytic cracking unit catalyst regenerator or fuel gas combustion device... regenerator under paragraph (b) of this section which commences construction, reconstruction, or modification...

  3. Process cost and facility considerations in the selection of primary cell culture clarification technology.

    PubMed

    Felo, Michael; Christensen, Brandon; Higgins, John

    2013-01-01

    The bioreactor volume delineating the selection of primary clarification technology is not always easily defined. Development of a commercial scale process for the manufacture of therapeutic proteins requires scale-up from a few liters to thousands of liters. While the separation techniques used for protein purification are largely conserved across scales, the separation techniques for primary cell culture clarification vary with scale. Process models were developed to compare monoclonal antibody production costs using two cell culture clarification technologies. One process model was created for cell culture clarification by disc stack centrifugation with depth filtration. A second process model was created for clarification by multi-stage depth filtration. Analyses were performed to examine the influence of bioreactor volume, product titer, depth filter capacity, and facility utilization on overall operating costs. At bioreactor volumes <1,000 L, clarification using multi-stage depth filtration offers cost savings compared to clarification using centrifugation. For bioreactor volumes >5,000 L, clarification using centrifugation followed by depth filtration offers significant cost savings. For bioreactor volumes of ∼ 2,000 L, clarification costs are similar between depth filtration and centrifugation. At this scale, factors including facility utilization, available capital, ease of process development, implementation timelines, and process performance characterization play an important role in clarification technology selection. In the case study presented, a multi-product facility selected multi-stage depth filtration for cell culture clarification at the 500 and 2,000 L scales of operation. Facility implementation timelines, process development activities, equipment commissioning and validation, scale-up effects, and process robustness are examined. © 2013 American Institute of Chemical Engineers.
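
The qualitative shape of such a process model can be sketched as below. All numbers are hypothetical, chosen only to reproduce the crossover behavior the abstract reports (filtration cheaper at small scale, centrifugation cheaper at large scale, roughly even near 2,000 L); they are not the study's cost data.

```python
# Toy cost model: depth filtration cost scales with volume through filter
# area, while centrifugation carries a large fixed equipment cost that is
# amortized at large scale.
def depth_filtration_cost(volume_l, capacity_l_per_m2=100.0, cost_per_m2=500.0):
    return (volume_l / capacity_l_per_m2) * cost_per_m2

def centrifugation_cost(volume_l, fixed=8000.0, variable_per_l=1.0):
    return fixed + variable_per_l * volume_l

# Filtration wins at small scale, centrifugation at large scale,
# and the two are comparable near the 2,000 L crossover.
assert depth_filtration_cost(500) < centrifugation_cost(500)
assert depth_filtration_cost(5000) > centrifugation_cost(5000)
```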

  4. PROCESS AND EQUIPMENT CHANGES FOR CLEANER PRODUCTION IN FEDERAL FACILITIES

    EPA Science Inventory

    The paper discusses process and equipment changes for cleaner production in federal facilities. During the 1990s, DoD and EPA conducted joint research and development, aimed at reducing the discharge of hazardous and toxic pollutants from military production and maintenance faci...

  5. Kinematic reconstruction in cardiovascular imaging.

    PubMed

    Bastarrika, G; Huebra Rodríguez, I J González de la; Calvo-Imirizaldu, M; Suárez Vega, V M; Alonso-Burgos, A

    2018-05-17

    Advances in clinical applications of computed tomography have been accompanied by improvements in advanced post-processing tools. In addition to multiplanar reconstructions, curved planar reconstructions, maximum intensity projections, and volumetric reconstructions, very recently kinematic reconstruction has been developed. This new technique, based on mathematical models that simulate the propagation of light beams through a volume of data, makes it possible to obtain very realistic three dimensional images. This article illustrates examples of kinematic reconstructions and compares them with classical volumetric reconstructions in patients with cardiovascular disease in a way that makes it easy to establish the differences between the two types of reconstruction. Kinematic reconstruction is a new method for representing three dimensional images that facilitates the explanation and comprehension of the findings. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  6. A spectral image processing algorithm for evaluating the influence of the illuminants on the reconstructed reflectance

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2017-12-01

    A spectral image processing algorithm that allows illumination of the scene with different illuminants, together with reconstruction of the scene's reflectance, is presented. A color checker spectral image is used as the scene, and CIE A (warm light, 2700 K), D65 (cold light, 6500 K) and a Cree TW Series LED T8 (4000 K) are employed for scene illumination. The illuminants used in the simulations have different spectra and, as a result of their illumination, the colors of the scene change. The influence of the illuminants on the reconstruction of the scene's reflectance is estimated. Demonstrative images and reflectances illustrating the operation of the algorithm are shown.
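
The spectral relationship underlying the algorithm can be illustrated in a few lines: per wavelength band, the light reaching the sensor is the product of the illuminant's spectral power distribution and the surface reflectance. The 10-band spectra below are hypothetical stand-ins, not real CIE data.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 10)   # nm
reflectance = np.linspace(0.2, 0.9, 10)   # a surface rising toward red
illum_warm = np.linspace(0.4, 1.0, 10)    # rises toward red (CIE-A-like shape)
illum_cold = np.linspace(1.0, 0.6, 10)    # falls toward red (D65-like shape)

def radiance(illuminant, reflectance):
    return illuminant * reflectance

# The same surface yields different signals under different illuminants,
# which is why the reconstructed reflectance depends on the light source.
warm = radiance(illum_warm, reflectance)
cold = radiance(illum_cold, reflectance)
assert not np.allclose(warm, cold)
# With a known illuminant, dividing it out recovers the reflectance exactly
# in this noiseless toy case.
assert np.allclose(warm / illum_warm, reflectance)
```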

  7. Lander Trajectory Reconstruction computer program

    NASA Technical Reports Server (NTRS)

    Adams, G. L.; Bradt, A. J.; Ferguson, J. B.; Schnelker, H. J.

    1971-01-01

    The Lander Trajectory Reconstruction (LTR) computer program is a tool for analysis of the planetary entry trajectory and atmosphere reconstruction process for a lander or probe. The program can be divided into two parts: (1) the data generator and (2) the reconstructor. The data generator provides the real environment in which the lander or probe is presumed to find itself. The reconstructor reconstructs the entry trajectory and atmosphere using sensor data generated by the data generator and a Kalman-Schmidt consider filter. A wide variety of vehicle and environmental parameters may be either solved-for or considered in the filter process.
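
The reconstructor's core idea, fusing noisy sensor data with a model through a Kalman-type filter, can be sketched at its simplest. This is a hedged scalar illustration (estimating a constant state from noisy measurements), not the LTR program's filter, which handles full trajectory and atmosphere states with considered parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
x_true = 3.0
measurements = x_true + 0.5 * rng.standard_normal(200)  # noisy sensor data

x_hat, p = 0.0, 1e6          # initial estimate and its (large) variance
r = 0.25                     # measurement noise variance (0.5**2)
for z in measurements:
    k = p / (p + r)          # Kalman gain
    x_hat += k * (z - x_hat) # update estimate with the innovation
    p *= (1 - k)             # shrink the estimate variance

assert abs(x_hat - x_true) < 0.15
```

Each measurement tightens the estimate; the same predict-update structure, extended to vector states and a dynamics model, underlies entry trajectory reconstruction.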

  8. Probabilistic Feasibility of the Reconstruction Process of Russian-Orthodox Churches

    NASA Astrophysics Data System (ADS)

    Chizhova, M.; Brunn, A.; Stilla, U.

    2016-06-01

    The cultural heritage of humanity is important for the identity of following generations and has to be preserved in a suitable manner. In the course of time much information about former cultural constructions has been lost, because some objects were strongly damaged by natural erosion or by human activity, or were even destroyed. It is important to capture the still-available building parts of former buildings, mostly ruins; these data could be the basis for a virtual reconstruction. Laser scanning offers in principle the possibility to capture building surfaces extensively in their actual state. In this paper we assume a priori given 3D laser scanner data, a 3D point cloud of the partly destroyed church. There are many well-known algorithms that describe different methods for the extraction and detection of geometric primitives, which are recognized separately in 3D point clouds. In our work we put them into a common probabilistic framework, which guides the complete reconstruction process of complex buildings, in our case Russian-Orthodox churches. Churches are modeled with their functional volumetric components, enriched with a priori known probabilities deduced from a database of Russian-Orthodox churches. Each set of components represents a complete church. The power of the new method is shown for a simulated dataset of 100 Russian-Orthodox churches.

  9. Reconstruction dynamics of recorded holograms in photochromic glass.

    PubMed

    Mihailescu, Mona; Pavel, Eugen; Nicolae, Vasile B

    2011-06-20

    We have investigated the dynamics of the record-erase process of holograms in photochromic glass using continuous-wave Nd:YVO₄ laser radiation (λ=532 nm). A bidimensional microgrid pattern was formed and visualized in photochromic glass, and the decay of its diffraction efficiency versus time (during the reconstruction step) gave us information (D, Δn) about the diffusion process inside the material. The recording and reconstruction processes were carried out in an off-axis setup, and the images of the reconstructed object were recorded by a CCD camera. Measurements on reconstructed object images, using holograms recorded at different incident laser powers, have shown a two-stage process involved in silver atom kinetics.

  10. STS-34 Galileo processing at KSC's SAEF-2 planetary spacecraft facility

    NASA Image and Video Library

    1989-07-21

    At the Kennedy Space Center's (KSC's) Spacecraft and Assembly Encapsulation Facility 2 (SAEF-2), the planetary spacecraft checkout facility, clean-suited technicians work on the Galileo spacecraft prior to moving it to the Vehicle Processing Facility (VPF) for mating with the inertial upper stage (IUS). Galileo is scheduled for launch aboard Atlantis, Orbiter Vehicle (OV) 104, on Space Shuttle Mission STS-34 in October 1989. It will be sent to the planet Jupiter, a journey which will take more than six years to complete. In December 1995, as the two-and-one-half-ton spacecraft orbits Jupiter with its ten scientific instruments, a probe will be released to parachute into the Jovian atmosphere. NASA's Jet Propulsion Laboratory (JPL) manages the Galileo project. View provided by KSC.

  11. Local reconstruction in computed tomography of diffraction enhanced imaging

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Feng; Zhang, Li; Kang, Ke-Jun; Chen, Zhi-Qiang; Zhu, Pei-Ping; Yuan, Qing-Xi; Huang, Wan-Xia

    2007-07-01

    Computed tomography of diffraction enhanced imaging (DEI-CT) based on a synchrotron radiation source has extremely high sensitivity to weakly absorbing low-Z samples in medical and biological fields. The authors propose a modified backprojection filtration (BPF)-type algorithm based on PI-line segments to reconstruct a region of interest from truncated refraction-angle projection data in DEI-CT. The distribution of the refractive index decrement in the sample can be directly estimated from its reconstruction images, which has been proved by experiments at the Beijing Synchrotron Radiation Facility. The algorithm paves the way for local reconstruction of large-size samples by the use of DEI-CT with a small field of view based on a synchrotron radiation source.

  12. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
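
The contrast between the two families can be made concrete with a toy linear system standing in for the imaging process (an illustrative assumption, not the paper's code): the analytic route formally inverts the equations, while the iterative route repeatedly corrects an estimate using a model of the process (here, Landweber updates).

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 6)) + 6 * np.eye(6)   # well-conditioned "imaging model"
x_true = rng.standard_normal(6)
b = A @ x_true                                     # "projection data"

# Analytic: formal inversion of the imaging equations.
x_analytic = np.linalg.solve(A, b)

# Iterative: start from zero, repeatedly improve using the model.
x_iter = np.zeros(6)
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(5000):
    x_iter += step * A.T @ (b - A @ x_iter)

assert np.allclose(x_analytic, x_true, atol=1e-8)
assert np.allclose(x_iter, x_true, atol=1e-3)
```

The iterative route costs many model applications but only needs a forward model, which is why it generalizes to geometries where no analytic inverse is known.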

  13. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    NASA Astrophysics Data System (ADS)

    Kadrmas, Dan J.; Frey, Eric C.; Karimi, Seemeen S.; Tsui, Benjamin M. W.

    1998-04-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with tracer, and also using experimentally acquired data with tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within the realm of clinically realistic times (under 10 minutes for image reconstruction).
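
The coarse-grid idea exploits exactly the low-frequency character of scatter noted above. The sketch below is a simplified 1D illustration (an assumption for exposition, not the authors' implementation): evaluate an expensive smooth model on a coarse grid and interpolate to the fine grid.

```python
import numpy as np

def expensive_scatter_model(x):
    """Stand-in for the costly scatter estimate: a smooth function of position."""
    return np.exp(-((x - 0.5) ** 2) / 0.1)

fine = np.linspace(0.0, 1.0, 256)
coarse = np.linspace(0.0, 1.0, 32)        # 8x fewer model evaluations

exact = expensive_scatter_model(fine)
approx = np.interp(fine, coarse, expensive_scatter_model(coarse))

# Because the model is smooth (low-frequency), the cheap interpolated
# estimate tracks the exact one closely.
assert np.max(np.abs(approx - exact)) < 0.01
```

Intermittent RBSC is complementary: rather than cheapening each scatter estimate, it simply skips the estimate on most iterations.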

  14. Breast reconstruction after mastectomy at a comprehensive cancer center.

    PubMed

    Connors, Shahnjayla K; Goodman, Melody S; Myckatyn, Terence; Margenthaler, Julie; Gehlert, Sarah

    2016-01-01

    Breast reconstruction after mastectomy is an integral part of breast cancer treatment that positively impacts quality of life in breast cancer survivors. Although breast reconstruction rates have increased over time, African American women remain less likely to receive breast reconstruction compared to Caucasian women. National Cancer Institute-designated Comprehensive Cancer Centers, specialized institutions with more standardized models of cancer treatment, report higher breast reconstruction rates than primary healthcare facilities. Whether breast reconstruction disparities are reduced for women treated at comprehensive cancer centers is unclear. The purpose of this study was to further investigate breast reconstruction rates and determinants at a comprehensive cancer center in St. Louis, Missouri. Sociodemographic and clinical data were obtained for women who received mastectomy for definitive surgical treatment for breast cancer between 2000 and 2012. Logistic regression was used to identify factors associated with the receipt of breast reconstruction. We found a breast reconstruction rate of 54 % for the study sample. Women who were aged 55 and older, had public insurance, received unilateral mastectomy, and received adjuvant radiation therapy were significantly less likely to receive breast reconstruction. African American women were 30 % less likely to receive breast reconstruction than Caucasian women. These findings suggest that racial disparities in breast reconstruction persist in comprehensive cancer centers. Future research should further delineate the determinants of breast reconstruction disparities across various types of healthcare institutions. Only then can we develop interventions to ensure all eligible women have access to breast reconstruction and the improved quality of life it affords breast cancer survivors.

  15. Establishment of sequential software processing for a biomechanical model of mandibular reconstruction with custom-made plate.

    PubMed

    Li, Peng; Tang, Youchao; Li, Jia; Shen, Longduo; Tian, Weidong; Tang, Wei

    2013-09-01

    The aim of this study is to describe the sequential software processing of a computed tomography (CT) dataset for reconstructing a finite element analysis (FEA) mandibular model with a custom-made plate, and to provide a theoretical basis for clinical usage of this reconstruction method. A CT scan was done on one patient who had mandibular continuity defects. This CT dataset in DICOM format was imported into Mimics 10.0 software, in which a three-dimensional (3-D) model of the facial skeleton was reconstructed and the mandible was segmented out. With Geomagic Studio 11.0, one custom-made plate and nine virtual screws were designed. All parts of the reconstructed mandible were converted into NURBS and saved in IGES format for importing into Pro/E 4.0. After Boolean operation and assembly, the model was transferred to ANSYS Workbench 12.0. Finally, after applying the boundary conditions and material properties, an analysis was performed. As a result, a 3-D FEA model was successfully developed using the software described above. The stress-strain distribution precisely indicated the biomechanical performance of the reconstructed mandible under a normal occlusion load, without stress-concentration areas. The Von Mises stress in all parts of the model, from the maximum value of 50.9 MPa to the minimum value of 0.1 MPa, was lower than the ultimate tensile strength. In conclusion, the described strategy could quickly and successfully produce a biomechanical model of a reconstructed mandible with a custom-made plate. Using this FEA foundation, the custom-made plate may be improved for an optimal clinical outcome. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Fire Hazard Assessment in Supporting Fire Protection System Design of a Chemical Process Facility

    DTIC Science & Technology

    1996-08-01

    FIRE HAZARD ASSESSMENT IN SUPPORTING FIRE PROTECTION SYSTEM DESIGN OF A CHEMICAL PROCESS FACILITY. Ali Pezeshk, Joseph Chang, Dwight Hunt, and Peter Jahn. Parsons Infrastructure & Technology Group, Inc., Pasadena, California 91124. ABSTRACT: Because fires in a chemical ...

  17. Understanding reconstructed Dante spectra using high resolution spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, M. J., E-mail: may13@llnl.gov; Widmann, K.; Kemp, G. E.

    2016-11-15

    Dante is an 18-channel filtered diode array used at the National Ignition Facility (NIF) to measure the spectrally and temporally resolved radiation flux between 50 eV and 20 keV from various targets. The absolute flux is determined from the radiometric calibration of the X-ray diodes, filters, and mirrors, together with a reconstruction algorithm applied to the voltages recorded from each channel. The reconstructed spectra are of very low resolution, with features consistent with the instrument response that are not necessarily consistent with the spectral emission features of the plasma. Errors may exist between the reconstructed spectra and the actual emission features due to assumptions in the algorithm. Recently, a high-resolution convex-crystal spectrometer, VIRGIL, has been installed at NIF with the same line of sight as Dante. Spectra from L-shell Ag and Xe have been recorded by both VIRGIL and Dante. Comparisons of these two spectroscopic measurements yield insights into the accuracy of the Dante reconstructions.

  18. Kuwaiti reconstruction project unprecedented in size, complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tippee, B.

    1993-03-15

    There had been no challenge like it: a desert emirate ablaze; its main city sacked; the economically crucial oil industry devastated; the countryside shrouded in smoke from oil well fires and littered with unexploded ordnance, disabled military equipment, and unignited crude oil. Like the well-documented effort that brought 749 burning wells under control in less than 7 months, Kuwaiti reconstruction had no precedent. Unlike the firefight, reconstruction is nowhere near complete. It has nevertheless placed two of three refineries back on stream, restored oil production to preinvasion levels, and repaired or rebuilt 17 of 26 oil field gathering stations. Most of the progress has come since the last well fire went out on Nov. 6, 1991. Expatriates in Kuwait since the days of Al-Awda ('the return,' in Arabic) attribute much of the rapid progress under Al-Tameer ('the reconstruction') to decisions and preparations made while the well fires still raged. The article describes the planning for Al-Awda, reentering the country, drilling plans, facilities reconstruction, and special problems.

  19. Preliminary technical data summary No. 3 for the Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landon, L.F.

    1980-05-01

    This document presents an update of the best information presently available for establishing the design basis of a Defense Waste Processing Facility. The objective of this project is to provide a facility that fixes the radionuclides present in Savannah River Plant (SRP) high-level liquid waste in a high-integrity form (glass). Flowsheets and material balances reflect the alternate CAB case, including the incorporation of low-level supernate in concrete. (DLC)

  20. Establishing a cGMP pancreatic islet processing facility: the first experience in Iran.

    PubMed

    Larijani, Bagher; Arjmand, Babak; Amoli, Mahsa M; Ao, Ziliang; Jafarian, Ali; Mahdavi-Mazdah, Mitra; Ghanaati, Hossein; Baradar-Jalili, Reza; Sharghi, Sasan; Norouzi-Javidan, Abbas; Aghayan, Hamid Reza

    2012-12-01

    It has been predicted that one of the greatest increases in the prevalence of diabetes will occur in the Middle East in the coming decades. Standard therapeutic strategies for diabetes aim at better control of complications; in contrast, some newer strategies, such as cell and gene therapy, aim to cure the disease. In recent years, significant progress has been made in beta-cell replacement therapies, with progressive improvement of short-term and long-term outcomes. In 2005, considering the impact of the disease in Iran and the promising results of the Edmonton protocol, funding for establishing a current Good Manufacturing Practice (cGMP) islet processing facility by the Endocrinology and Metabolism Research Center was approved by Tehran University of Medical Sciences. Several islet isolations were performed for process validation and experimental purposes following establishment of the cGMP facility and procurement of all required equipment. Finally, the first successful clinical islet isolation and transplantation was performed in September 2010. Despite the high cost of the procedure, it is considered beneficial, as it may prevent long-term complications and the costs associated with secondary care. In this article we briefly describe our experience in setting up a cGMP islet processing facility, which can provide valuable information for countries in the region interested in establishing similar facilities.

  1. Pre-operative CT angiography and three-dimensional image post processing for deep inferior epigastric perforator flap breast reconstructive surgery.

    PubMed

    Lam, D L; Mitsumori, L M; Neligan, P C; Warren, B H; Shuman, W P; Dubinsky, T J

    2012-12-01

    Autologous breast reconstruction with deep inferior epigastric artery (DIEA) perforator flaps has become the mainstay of breast reconstructive surgery. CT angiography and three-dimensional image post-processing can depict the number, size, course, and location of the DIEA perforating arteries for pre-operative selection of the best artery to use for the tissue flap. Knowledge of the location of the perforating arteries, and selection of the optimal one, shortens operative times and decreases patient morbidity.

  2. An iterative reduced field-of-view reconstruction for periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) MRI.

    PubMed

    Lin, Jyh-Miin; Patterson, Andrew J; Chang, Hing-Chiu; Gillard, Jonathan H; Graves, Martin J

    2015-10-01

    To propose a new reduced field-of-view (rFOV) strategy for iterative reconstructions in a clinical environment. Iterative reconstructions can incorporate regularization terms to improve the image quality of periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) MRI. However, the large amount of computation required for full-FOV iterative reconstruction has posed a huge challenge for clinical use. By subdividing the entire problem into smaller rFOVs, the iterative reconstruction can be accelerated on a desktop computer with a single graphics processing unit (GPU). This rFOV strategy divides the iterative reconstruction into blocks, based on the block-diagonal dominant structure of the problem. A near real-time reconstruction system was developed for the clinical MR unit, and parallel computing was implemented using an object-oriented model. In addition, the Toeplitz method was implemented on the GPU to reduce the time required for full interpolation. Using data acquired with PROPELLER MRI, the reconstructed images were saved in the Digital Imaging and Communications in Medicine (DICOM) format. The proposed rFOV reconstruction reduced the gridding time by 97%: the total iteration time was 3 s even with multiple processes running. A phantom study showed that the structural similarity index for rFOV reconstruction was statistically superior to conventional density compensation (p < 0.001). An in vivo study validated the increased signal-to-noise ratio, which was over four times higher than with density compensation. The image sharpness index was also improved by the regularized reconstruction. The rFOV strategy permits near real-time iterative reconstruction that improves the image quality of PROPELLER images; substantial improvements in image quality metrics were validated in the experiments. The concept of rFOV reconstruction may potentially be applied to other kinds of iterative reconstruction to shorten reconstruction times.
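    The block-splitting idea in the abstract above can be illustrated with a toy linear system. The sketch below is a minimal numpy illustration, not the authors' GPU implementation: it assumes an exactly block-diagonal, well-conditioned system matrix and uses a simple Tikhonov-regularized solve per block; all sizes and the regularization weight are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_blocks, bs, lam = 4, 32, 0.01   # block count, block size, Tikhonov weight (all illustrative)

    # Toy block-diagonal system: one well-conditioned sub-matrix per rFOV block
    blocks = [np.eye(bs) + 0.05 * rng.standard_normal((bs, bs)) for _ in range(n_blocks)]
    x_true = rng.standard_normal(n_blocks * bs)
    y = np.concatenate([A @ x_true[i * bs:(i + 1) * bs] for i, A in enumerate(blocks)])

    def solve_block(A, y_b, lam):
        """Tikhonov-regularized least squares for a single reduced-FOV block."""
        return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y_b)

    # Each block is an independent small problem, so the work parallelizes trivially
    x_hat = np.concatenate([solve_block(A, y[i * bs:(i + 1) * bs], lam)
                            for i, A in enumerate(blocks)])
    rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"relative reconstruction error: {rel_err:.4f}")
    ```

    The design point is that each sub-problem fits comfortably in GPU (or CPU) memory and can be solved concurrently, which is what makes the near real-time reconstruction in the abstract plausible.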

  3. Terrestrial laser scanning for biomass assessment and tree reconstruction: improved processing efficiency

    NASA Astrophysics Data System (ADS)

    Alboabidallah, Ahmed; Martin, John; Lavender, Samantha; Abbott, Victor

    2017-09-01

    Terrestrial Laser Scanning (TLS) processing for biomass mapping involves large data volumes and often includes relatively slow 3D object-fitting steps that increase the processing time. This study aimed to test new features that can speed up the overall processing. A new type of 3D voxel is used in which the horizontal layers are parallel to the Digital Terrain Model. This voxel type allows procedures to extract tree diameters using just one layer, while still giving direct tree-height estimations. Layer intersection is used to emphasize the trunks as upright standing objects, which are detected in the spatially segmented intersection of the breast-height voxels and then extended upwards and downwards. The diameters were calculated by fitting elliptical cylinders to the laser points in the detected trunk segments. Non-trunk segments, used in sub-tree structures, were found using the parent-child relationships between successive layers. The branches were reconstructed by skeletonizing each sub-tree branch, and the biomass was distributed statistically amongst the weighted skeletons. The procedure was applied to nine plots within the UK. The average correlation coefficients between reconstructed and directly measured tree diameters, heights, and branches were R² = 0.92, 0.97, and 0.59, compared with 0.91, 0.95, and 0.63 when cylindrical fitting was used. The average processing time per plot fell from 5 h 18 min with the conventional methods to 2 h 24 min when the same hardware and software libraries were used with the 3D voxels. These results indicate that the 3D voxel method can produce results of similar accuracy much more quickly, which would improve efficiency in projects with large-volume TLS datasets.

  4. 18 CFR 157.21 - Pre-filing procedures and review process for LNG terminal facilities and other natural gas...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the pre-filing review of any pipeline or other natural gas facilities, including facilities not... from the subject LNG terminal facilities to the existing natural gas pipeline infrastructure. (b) Other... and review process for LNG terminal facilities and other natural gas facilities prior to filing of...

  5. Defense Waste Processing Facility Nitric- Glycolic Flowsheet Chemical Process Cell Chemistry: Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zamecnik, J.; Edwards, T.

    The conversions of nitrite to nitrate, the destruction of glycolate, and the conversion of glycolate to formate and oxalate were modeled for the Nitric-Glycolic flowsheet using data from Chemical Process Cell (CPC) simulant runs conducted by the Savannah River National Laboratory (SRNL) from 2011 to 2016. The goal of this work was to develop empirical correlation models that predict these values from measurable process variables, so that the quantities can be predicted a priori from the sludge or simulant composition and measurable processing variables. The need for these predictions arises from the need to predict the REDuction/OXidation (REDOX) state of the glass from the Defense Waste Processing Facility (DWPF) melter. This report summarizes the work on these correlations based on the aforementioned data. Previous work on these correlations was documented in a technical report covering data from 2011-2015; this current report supersedes that report. Further refinement of the models as additional data are collected is recommended.
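    An empirical correlation model of the kind described above amounts to a regression of a response on measurable process variables. The sketch below shows such a fit with ordinary least squares; the variable names, coefficients, and data are invented placeholders, not SRNL measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 40

    # Hypothetical measurable process variables (names and ranges are assumptions)
    acid_stoich = rng.uniform(0.8, 1.3, n)    # acid stoichiometry factor
    temp_c = rng.uniform(90.0, 102.0, n)      # process temperature, deg C

    # Synthetic response with a known linear dependence plus measurement noise
    response = 5.0 + 12.0 * acid_stoich - 0.04 * temp_c + 0.1 * rng.standard_normal(n)

    # Ordinary least squares: design matrix [1, acid_stoich, temp_c]
    X = np.column_stack([np.ones(n), acid_stoich, temp_c])
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    print("fitted coefficients (intercept, acid, temp):", np.round(coef, 3))
    ```

    With such a fit in hand, the correlation can be evaluated a priori for a new composition by plugging the measured process variables into the fitted linear form.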

  6. DOE Coal Gasification Multi-Test Facility: fossil fuel processing technical/professional services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hefferan, J.K.; Lee, G.Y.; Boesch, L.P.

    1979-07-13

    A conceptual design, including process descriptions, heat and material balances, process flow diagrams, utility requirements, schedule, capital and operating cost estimates, and alternative design considerations, is presented for the DOE Coal Gasification Multi-Test Facility (GMTF). The GMTF, an engineering-scale facility, is to provide a complete plant into which different types of gasifiers and conversion/synthesis equipment can be readily integrated for testing in an operational environment at relatively low cost. The design allows for operation of several gasifiers simultaneously at a total coal throughput of 2500 tons/day; individual gasifiers operate at up to 1200 tons/day and 600 psig using air or oxygen. Ten different test gasifiers can be in place at the facility, but only three can be operated at one time. The GMTF can produce a spectrum of saleable products, including low-Btu, synthesis, and pipeline gases; hydrogen (for fuel cells or hydrogasification); methanol; gasoline; diesel and fuel oils; organic chemicals; and (potentially) electrical power. In 1979 dollars, the base facility requires a $288 million capital investment for common-use units, $193 million for four gasification units and four synthesis units, and $305 million for six years of operation. Critical reviews of detailed vendor designs are appended for a methanol synthesis unit, three entrained-flow gasifiers, a fluidized-bed gasifier, and a hydrogasifier/slag-bath gasifier.

  7. Downgrading Nuclear Facilities to Radiological Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarry, Jeffrey F.; Farr, Jesse Oscar; Duran, Leroy

    2015-08-01

    Based on inventory reductions and the use of alternate storage facilities, the Sandia National Laboratories (SNL) downgraded four SNL Hazard Category 3 (HC-3) nuclear facilities to less-than-HC-3 radiological facilities. SNL's Waste Management and Pollution Prevention Department (WMPPD) managed the HC-3 nuclear facilities and implemented the downgrade. This paper will examine the downgrade process.

  8. 40 CFR 372.20 - Process for modifying covered chemicals and facilities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... chemicals and facilities. 372.20 Section 372.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND, EMERGENCY PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS TOXIC CHEMICAL RELEASE REPORTING: COMMUNITY RIGHT-TO-KNOW Reporting Requirements § 372.20 Process for modifying covered chemicals...

  9. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  10. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  11. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  12. 10 CFR 70.64 - Requirements for new facilities or new processes at existing facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... behavior of items relied on for safety. (b) Facility and system design and facility layout must be based on... existing facilities. (a) Baseline design criteria. Each prospective applicant or licensee shall address the following baseline design criteria in the design of new facilities. Each existing licensee shall address the...

  13. Energy determination in industrial X-ray processing facilities

    NASA Astrophysics Data System (ADS)

    Cleland, M. R.; Gregoire, O.; Stichelbaut, F.; Gomola, I.; Galloway, R. A.; Schlecht, J.

    2005-12-01

    In industrial irradiation facilities, the determination of maximum photon or electron energy is important for regulated processes, such as food irradiation, and for assurance of treatment reproducibility. With electron beam irradiators, this has been done by measuring the depth-dose distribution in a homogeneous material. For X-ray irradiators, an analogous method has not yet been recommended. This paper describes a procedure suitable for typical industrial irradiation processes, which is based on common practice in the field of therapeutic X-ray treatment. It utilizes a measurement of the slope of the exponential attenuation curve of X-rays in a thick stack of polyethylene plates. Monte Carlo simulations and experimental tests have been performed to verify the suitability and accuracy of the method between 3 MeV and 8 MeV.
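    The slope-fitting step described above can be sketched numerically: dose readings at increasing depths in the polyethylene stack follow (beyond the build-up region) an exponential D(x) = D0·exp(-μx), so a straight-line fit to log-dose versus depth yields the effective attenuation coefficient μ. Converting μ to a maximum X-ray energy then requires a calibration curve (e.g., from Monte Carlo simulation), which is not reproduced here; the depths, doses, and μ below are illustrative, not measured values.

    ```python
    import numpy as np

    # Simulated dosimeter readings at depths past the build-up region (illustrative)
    depths_cm = np.arange(10.0, 60.0, 5.0)
    mu_true = 0.055                                  # effective attenuation coefficient, 1/cm (assumed)
    doses = 100.0 * np.exp(-mu_true * depths_cm)     # exponential depth-dose curve

    # Linear fit in log space: log D = log D0 - mu * x
    slope, intercept = np.polyfit(depths_cm, np.log(doses), 1)
    mu_fit = -slope
    print(f"fitted attenuation coefficient: {mu_fit:.4f} 1/cm")
    ```

    In practice the fitted slope would be compared against a μ-versus-energy calibration table for polyethylene to read off the maximum photon energy.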

  14. General view from outside the Orbiter Processing Facility at the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General view from outside the Orbiter Processing Facility at the Kennedy Space Center with the bay doors open as the Orbiter Discovery is atop the transport vehicle prepared to be moved over to the Vehicle Assembly Building. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  15. 3D TEM reconstruction and segmentation process of laminar bio-nanocomposites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iturrondobeitia, M., E-mail: maider.iturrondobeitia@ehu.es; Okariz, A.; Fernandez-Martinez, R.

    2015-03-30

    The microstructure of laminar bio-nanocomposites (poly(lactic acid) (PLA)/clay) depends on the amount of clay platelet opening after integration with the polymer matrix and determines the final properties of the material. Transmission electron microscopy (TEM) is the only technique that can provide direct observation of the layer dispersion and the degree of exfoliation. However, the orientation of the clay platelets, which affects the final properties, is practically immeasurable from a single 2D TEM image. This issue can be overcome using transmission electron tomography (ET), a technique that allows complete 3D characterization of the structure, including measurement of the orientation of the clay platelets, their morphology, and their 3D distribution. ET involves a 3D reconstruction of the study volume and a subsequent segmentation of the study object. Currently, accurate segmentation is performed manually, which is inefficient and tedious. The aim of this work is to propose an objective, automated segmentation methodology for a 3D TEM tomography reconstruction. In this method, the segmentation threshold is optimized by minimizing the variation of the dimensions of the segmented objects and matching the segmented clay volume fraction, V_clay (%), to the actual one. The method is first validated using a fictitious set of objects and then applied to a nanocomposite.

  16. Applications of nonlocal means algorithm in low-dose X-ray CT image processing and reconstruction: a review

    PubMed Central

    Zhang, Hao; Zeng, Dong; Zhang, Hua; Wang, Jing; Liang, Zhengrong

    2017-01-01

    Low-dose X-ray computed tomography (LDCT) imaging is highly recommended for use in the clinic because of growing concerns over excessive radiation exposure. However, the CT images reconstructed by the conventional filtered back-projection (FBP) method from low-dose acquisitions may be severely degraded with noise and streak artifacts due to excessive X-ray quantum noise, or with view-aliasing artifacts due to insufficient angular sampling. In 2005, the nonlocal means (NLM) algorithm was introduced as a non-iterative edge-preserving filter to denoise natural images corrupted by additive Gaussian noise, and showed superior performance. It has since been adapted and applied to many other image types and various inverse problems. This paper specifically reviews the applications of the NLM algorithm in LDCT image processing and reconstruction, and explicitly demonstrates its improving effects on the reconstructed CT image quality from low-dose acquisitions. The effectiveness of these applications on LDCT and their relative performance are described in detail. PMID:28303644
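    The NLM filter reviewed above replaces each pixel by a weighted average of pixels in a search window, with weights that compare whole patches rather than single intensities. Below is a minimal, unoptimized numpy sketch of the classic algorithm; the patch size, window size, smoothing parameter h, and the toy image are all illustrative choices, not values from the review.

    ```python
    import numpy as np

    def nlm_denoise(img, patch=3, window=7, h=0.15):
        """Naive nonlocal means: patch-similarity-weighted averaging in a search window."""
        pad, half_w = patch // 2, window // 2
        padded = np.pad(img, pad, mode="reflect")
        out = np.zeros_like(img)
        rows, cols = img.shape
        for i in range(rows):
            for j in range(cols):
                p_ref = padded[i:i + patch, j:j + patch]   # patch centered on (i, j)
                weights, values = [], []
                for di in range(max(0, i - half_w), min(rows, i + half_w + 1)):
                    for dj in range(max(0, j - half_w), min(cols, j + half_w + 1)):
                        p_cmp = padded[di:di + patch, dj:dj + patch]
                        d2 = np.mean((p_ref - p_cmp) ** 2)   # patch distance
                        weights.append(np.exp(-d2 / h ** 2))
                        values.append(img[di, dj])
                weights = np.asarray(weights)
                out[i, j] = np.dot(weights, values) / weights.sum()
        return out

    # Toy piecewise-constant image with additive Gaussian noise
    rng = np.random.default_rng(1)
    clean = np.zeros((24, 24)); clean[8:16, 8:16] = 1.0
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    denoised = nlm_denoise(noisy)
    print("noisy MSE:", np.mean((noisy - clean) ** 2),
          "NLM MSE:", np.mean((denoised - clean) ** 2))
    ```

    Because cross-edge patches differ strongly, their weights are essentially zero, which is the edge-preserving behavior that motivated adapting NLM to LDCT denoising.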

  17. Evaluating Contextual Processing in Diffusion MRI: Application to Optic Radiation Reconstruction for Epilepsy Surgery

    PubMed Central

    Tax, Chantal M. W.; Duits, Remco; Vilanova, Anna; ter Haar Romeny, Bart M.; Hofman, Paul; Wagner, Louis; Leemans, Alexander; Ossenblok, Pauly

    2014-01-01

    Diffusion MRI and tractography allow for investigation of the architectural configuration of white matter in vivo, offering new avenues for applications like presurgical planning. Despite the promising outlook, there are many pitfalls that complicate its use for (clinical) application. Amongst these are inaccuracies in the geometry of the diffusion profiles on which tractography is based, and poor alignment with neighboring profiles. Recently developed contextual processing techniques, including enhancement and well-posed geometric sharpening, have shown to result in sharper and better aligned diffusion profiles. However, the research that has been conducted up to now is mainly of theoretical nature, and so far these techniques have only been evaluated by visual inspection of the diffusion profiles. In this work, the method is evaluated in a clinically relevant application: the reconstruction of the optic radiation for epilepsy surgery. For this evaluation we have developed a framework in which we incorporate a novel scoring procedure for individual pathways. We demonstrate that, using enhancement and sharpening, the extraction of an anatomically plausible reconstruction of the optic radiation from a large amount of probabilistic pathways is greatly improved in three healthy controls, where currently used methods fail to do so. Furthermore, challenging reconstructions of the optic radiation in three epilepsy surgery candidates with extensive brain lesions demonstrate that it is beneficial to integrate these methods in surgical planning. PMID:25077946

  18. Evaluating contextual processing in diffusion MRI: application to optic radiation reconstruction for epilepsy surgery.

    PubMed

    Tax, Chantal M W; Duits, Remco; Vilanova, Anna; ter Haar Romeny, Bart M; Hofman, Paul; Wagner, Louis; Leemans, Alexander; Ossenblok, Pauly

    2014-01-01

    Diffusion MRI and tractography allow for investigation of the architectural configuration of white matter in vivo, offering new avenues for applications like presurgical planning. Despite the promising outlook, there are many pitfalls that complicate its use for (clinical) application. Amongst these are inaccuracies in the geometry of the diffusion profiles on which tractography is based, and poor alignment with neighboring profiles. Recently developed contextual processing techniques, including enhancement and well-posed geometric sharpening, have shown to result in sharper and better aligned diffusion profiles. However, the research that has been conducted up to now is mainly of theoretical nature, and so far these techniques have only been evaluated by visual inspection of the diffusion profiles. In this work, the method is evaluated in a clinically relevant application: the reconstruction of the optic radiation for epilepsy surgery. For this evaluation we have developed a framework in which we incorporate a novel scoring procedure for individual pathways. We demonstrate that, using enhancement and sharpening, the extraction of an anatomically plausible reconstruction of the optic radiation from a large amount of probabilistic pathways is greatly improved in three healthy controls, where currently used methods fail to do so. Furthermore, challenging reconstructions of the optic radiation in three epilepsy surgery candidates with extensive brain lesions demonstrate that it is beneficial to integrate these methods in surgical planning.

  19. Accelerating Advanced MRI Reconstructions on GPUs

    PubMed Central

    Stone, S.S.; Haldar, J.P.; Tsao, S.C.; Hwu, W.-m.W.; Sutton, B.P.; Liang, Z.-P.

    2008-01-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA’s Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%. PMID:21796230

  20. Accelerating Advanced MRI Reconstructions on GPUs.

    PubMed

    Stone, S S; Haldar, J P; Tsao, S C; Hwu, W-M W; Sutton, B P; Liang, Z-P

    2008-10-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA's Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%.

  1. Convex Accelerated Maximum Entropy Reconstruction

    PubMed Central

    Worley, Bradley

    2016-01-01

    Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm – called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm – is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra. PMID:26894476

  2. Measurements of methane emissions from natural gas gathering facilities and processing plants: measurement methods

    DOE PAGES

    Roscioli, J. R.; Yacovitch, T. I.; Floerchinger, C.; ...

    2015-05-07

    Increased natural gas production in recent years has spurred intense interest in methane (CH4) emissions associated with its production, gathering, processing, transmission, and distribution. Gathering and processing facilities (G&P facilities) are unique in that the wide range of gas sources (shale, coal-bed, tight gas, conventional, etc.) results in a wide range of gas compositions, which in turn requires an array of technologies to prepare the gas for pipeline transmission and distribution. We present an overview and detailed description of the measurement method and analysis approach used during a 20-week field campaign studying CH4 emissions from natural gas G&P facilities between October 2013 and April 2014. Dual-tracer flux measurements and on-site observations were used to address the magnitude and origins of CH4 emissions from these facilities. The use of a second tracer as an internal standard revealed plume-specific uncertainties in the measured emission rates of 20-47%, depending upon plume classification. Furthermore, combining downwind methane, ethane (C2H6), carbon monoxide (CO), carbon dioxide (CO2), and tracer gas measurements with on-site tracer gas release allows for quantification of facility emissions and, in some cases, a more detailed picture of source locations.
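    The tracer-ratio arithmetic underlying the measurements above is simple: a tracer gas is released on site at a known rate, and the ratio of the downwind methane enhancement to the tracer enhancement (both above background) scales the known release rate into a facility emission rate. The sketch below shows the back-of-envelope version with acetylene as the example tracer; all numbers are illustrative, not measurements from the campaign.

    ```python
    # Molar masses, g/mol (acetylene chosen here as an example tracer gas)
    MW_CH4, MW_C2H2 = 16.04, 26.04

    tracer_release_kg_h = 1.5    # known on-site tracer release rate (assumed)
    delta_ch4_ppb = 120.0        # downwind CH4 enhancement above background (assumed)
    delta_tracer_ppb = 8.0       # downwind tracer enhancement above background (assumed)

    # Molar enhancement ratio scaled to a mass emission rate:
    # Q_CH4 = Q_tracer * (dCH4 / dTracer) * (MW_CH4 / MW_tracer)
    ch4_emission_kg_h = (tracer_release_kg_h
                         * (delta_ch4_ppb / delta_tracer_ppb)
                         * (MW_CH4 / MW_C2H2))
    print(f"estimated CH4 emission: {ch4_emission_kg_h:.2f} kg/h")
    ```

    Using a second tracer released at a different location on site, as in the study, provides an internal standard: the known ratio of the two tracer release rates can be compared against their measured downwind ratio to estimate plume-specific uncertainty.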

  3. EEG source reconstruction reveals frontal-parietal dynamics of spatial conflict processing.

    PubMed

    Cohen, Michael X; Ridderinkhof, K Richard

    2013-01-01

    Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linear constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (∼200 ms post-stimulus) conflict modulation in stimulus-contralateral parietal gamma (30-50 Hz), followed by a later alpha-band (8-12 Hz) conflict modulation, suggesting an early detection of spatial conflict and inhibition of spatial location processing. Inter-regional connectivity analyses assessed via cross-frequency coupling of theta (4-8 Hz), alpha, and gamma power revealed conflict-induced shifts in cortical network interactions: Congruent trials (relative to incongruent trials) had stronger coupling between frontal theta and stimulus-contrahemifield parietal alpha/gamma power, whereas incongruent trials had increased theta coupling between medial frontal and lateral frontal regions. These findings shed new light into the large-scale network dynamics of spatial conflict processing, and how those networks are shaped by oscillatory interactions.
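    A power-power cross-frequency coupling measure of the kind used above can be sketched as: band-limit the signal in theta and gamma, take the amplitude envelope of each band via the Hilbert transform, and correlate the two power time courses. The numpy-only sketch below uses a crude FFT band-pass and a simulated signal in which gamma amplitude co-varies with theta amplitude; sampling rate, band edges, and the signal itself are illustrative assumptions, not the authors' beamforming pipeline.

    ```python
    import numpy as np

    fs = 500.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(2)

    # Simulated signal: theta (6 Hz) and gamma (40 Hz) sharing a slow amplitude modulation
    slow = 1.0 + 0.5 * np.sin(2 * np.pi * 0.2 * t)
    sig = (slow * np.sin(2 * np.pi * 6 * t)
           + 0.3 * slow * np.sin(2 * np.pi * 40 * t)
           + 0.2 * rng.standard_normal(t.size))

    def band_envelope(x, lo, hi, fs):
        """Crude FFT band-pass followed by a Hilbert-transform amplitude envelope."""
        n = x.size
        X = np.fft.fft(x)
        freqs = np.fft.fftfreq(n, 1 / fs)
        X[(np.abs(freqs) < lo) | (np.abs(freqs) > hi)] = 0.0
        h = np.zeros(n)            # analytic-signal multiplier
        h[0] = 1.0
        h[1:n // 2] = 2.0
        if n % 2 == 0:
            h[n // 2] = 1.0
        return np.abs(np.fft.ifft(X * h))

    theta_pow = band_envelope(sig, 4, 8, fs)
    gamma_pow = band_envelope(sig, 30, 50, fs)
    coupling = np.corrcoef(theta_pow, gamma_pow)[0, 1]
    print(f"theta-gamma power coupling (r): {coupling:.2f}")
    ```

    In a real analysis the two power time courses would come from different reconstructed sources (e.g., frontal theta and parietal gamma), turning the same correlation into an inter-regional coupling estimate.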

  4. EEG Source Reconstruction Reveals Frontal-Parietal Dynamics of Spatial Conflict Processing

    PubMed Central

    Cohen, Michael X; Ridderinkhof, K. Richard

    2013-01-01

    Cognitive control requires the suppression of distracting information in order to focus on task-relevant information. We applied EEG source reconstruction via time-frequency linear constrained minimum variance beamforming to help elucidate the neural mechanisms involved in spatial conflict processing. Human subjects performed a Simon task, in which conflict was induced by incongruence between spatial location and response hand. We found an early (∼200 ms post-stimulus) conflict modulation in stimulus-contralateral parietal gamma (30–50 Hz), followed by a later alpha-band (8–12 Hz) conflict modulation, suggesting an early detection of spatial conflict and inhibition of spatial location processing. Inter-regional connectivity analyses assessed via cross-frequency coupling of theta (4–8 Hz), alpha, and gamma power revealed conflict-induced shifts in cortical network interactions: Congruent trials (relative to incongruent trials) had stronger coupling between frontal theta and stimulus-contrahemifield parietal alpha/gamma power, whereas incongruent trials had increased theta coupling between medial frontal and lateral frontal regions. These findings shed new light into the large-scale network dynamics of spatial conflict processing, and how those networks are shaped by oscillatory interactions. PMID:23451201

  5. Orbiter processing facility service platform failure and redesign

    NASA Technical Reports Server (NTRS)

    Harris, Jesse L.

    1988-01-01

    In a high bay of the Orbiter Processing Facility (OPF) at the Kennedy Space Center, technicians were preparing the space shuttle orbiter Discovery for rollout to the Vehicle Assembly Building (VAB). A service platform, commonly referred to as an OPF Bucket, was being retracted when it suddenly fell, striking a technician and impacting Discovery's payload bay door. A critical component in the OPF Bucket hoist system had failed, allowing the platform to fall. The incident was thoroughly investigated by both NASA and Lockheed, revealing many design deficiencies within the system. The deficiencies and the design changes made to correct them are reviewed.

  6. Hardware development process for Human Research facility applications

    NASA Astrophysics Data System (ADS)

    Bauer, Liz

    2000-01-01

    The simple goal of the Human Research Facility (HRF) is to conduct human research experiments on the International Space Station (ISS) astronauts during long-duration missions. This is accomplished by providing integration and operation of the necessary hardware and software capabilities. A typical hardware development flow consists of five stages: functional inputs and requirements definition, market research, design life cycle through hardware delivery, crew training, and mission support. The purpose of this presentation is to guide the audience through the early hardware development process: requirement definition through selecting a development path. Specific HRF equipment is used to illustrate the hardware development paths.

  7. Defining event reconstruction of digital crime scenes.

    PubMed

    Carrier, Brian D; Spafford, Eugene H

    2004-11-01

    Event reconstruction plays a critical role in solving physical crimes by explaining why a piece of physical evidence has certain characteristics. With digital crimes, the current focus has been on the recognition and identification of digital evidence using an object's characteristics, but not on the identification of the events that caused the characteristics. This paper examines digital event reconstruction and proposes a process model and procedure that can be used for a digital crime scene. The model has been designed so that it can apply to physical crime scenes, can support the unique aspects of a digital crime scene, and can be implemented in software to automate part of the process. We also examine the differences between physical event reconstruction and digital event reconstruction.

  8. Evaluation of Mercury in Liquid Waste Processing Facilities - Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, V.; Occhipinti, J.; Shah, H.

    2015-07-01

    This report provides a summary of Phase I activities conducted to support an Integrated Evaluation of Mercury in Liquid Waste System (LWS) Processing Facilities. Phase I activities included a review and assessment of the liquid waste inventory and of the chemical processing behavior of mercury, using a system-by-system review methodology. Gaps in understanding mercury behavior, as well as action items from the structured reviews, are being tracked; 64% of the gaps and actions have been resolved.

  9. Evaluation of mercury in liquid waste processing facilities - Phase I report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, V.; Occhipinti, J. E.; Shah, H.

    2015-07-01

    This report provides a summary of Phase I activities conducted to support an Integrated Evaluation of Mercury in Liquid Waste System (LWS) Processing Facilities. Phase I activities included a review and assessment of the liquid waste inventory and of the chemical processing behavior of mercury, using a system-by-system review methodology. Gaps in understanding mercury behavior, as well as action items from the structured reviews, are being tracked; 64% of the gaps and actions have been resolved.

  10. The Establishment of a New Friction Stir Welding Process Development Facility at NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Carter, Robert W.

    2009-01-01

    Full-scale weld process development is being performed at MSFC to develop the tools, fixtures, and facilities necessary for Ares I production. Full scale development in-house at MSFC fosters technical acuity within the NASA engineering community, and allows engineers to identify and correct tooling and equipment shortcomings before they become problems on the production floor. Finally, while the new weld process development facility is currently being outfitted in support of Ares I development, it has been established to support all future Constellation Program needs. In particular, both the RWT and VWT were sized with the larger Ares V hardware in mind.

  11. Project C-018H, 242-A Evaporator/PUREX Plant Process Condensate Treatment Facility, functional design criteria. Revision 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, N.

    1995-05-02

    This document provides the Functional Design Criteria (FDC) for Project C-018H, the 242-A Evaporator and Plutonium-Uranium Extraction (PUREX) Plant Condensate Treatment Facility (Also referred to as the 200 Area Effluent Treatment Facility [ETF]). The project will provide the facilities to treat and dispose of the 242-A Evaporator process condensate (PC), the Plutonium-Uranium Extraction (PUREX) Plant process condensate (PDD), and the PUREX Plant ammonia scrubber distillate (ASD).

  12. Role of Sports Facilities in the Process of Revitalization of Brownfields

    NASA Astrophysics Data System (ADS)

    Taraszkiewicz, Karolina; Nyka, Lucyna

    2017-10-01

    The paper gives evidence that building a large sports facility can generate beneficial urban space transformation and a significant improvement in dilapidated urban areas. On the basis of theoretical investigations and case studies, it can be shown that sports facilities introduced to urban brownfields are among the best-known large-scale revitalization methods. Large urban spaces surrounding sports facilities such as stadiums and other sports arenas create excellent conditions for designing additional recreational functions, such as parks and other green areas. Since sports venues are very often located on brownfields and post-industrial spaces, they are usually well connected with canals, rivers and other water routes or reservoirs. Such spaces become attractors for large groups of people. This, in effect, initiates the process of introducing housing estates to the area and, gradually, the development of a multifunctional urban structure. As the research shows, such a process of favourable urban transformation rests on several important preconditions. One of the most significant is the formation of new communication infrastructure, which links the newly formed territories with the well-structured urban core. A well-planned programme for the new sports facilities is also a very important factor. As the research shows, multifunctional large sports venues may function in the city as a new kind of public space that stimulates new genres of social relations and offers entertainment and free-time activities not necessarily related to sport. This finally leads to the creation of new jobs, a more general improvement of the widely understood image of the district, growing appreciation for the emerging location and, consequently, new investments in the neighbouring areas. The research gives new evidence to the ongoing discussion on the drawbacks and benefits of placing stadiums and sports arenas in the urban core.

  13. Gemini Observatory base facility operations: systems engineering process and lessons learned

    NASA Astrophysics Data System (ADS)

    Serio, Andrew; Cordova, Martin; Arriagada, Gustavo; Adamson, Andy; Close, Madeline; Coulson, Dolores; Nitta, Atsuko; Nunez, Arturo

    2016-08-01

    Gemini North Observatory successfully began nighttime remote operations from the Hilo Base Facility control room in November 2015. The implementation of the Gemini North Base Facility Operations (BFO) products was a great learning experience for many of our employees, including the author of this paper, the BFO Systems Engineer. In this paper we focus on the tailored Systems Engineering processes used for the project, the various software tools used in project support, and finally discuss the lessons learned from the Gemini North implementation. This experience and the lessons learned will be used both to aid our implementation of the Gemini South BFO in 2016, and in future technical projects at Gemini Observatory.

  14. A review of GPU-based medical image reconstruction.

    PubMed

    Després, Philippe; Jia, Xun

    2017-10-01

    Tomographic image reconstruction is a computationally demanding task, even more so when advanced models are used to describe a more complete and accurate picture of the image formation process. Such advanced modeling and reconstruction algorithms can lead to better images, often with less dose, but at the price of long calculation times that are hardly compatible with clinical workflows. Fortunately, reconstruction tasks can often be executed advantageously on Graphics Processing Units (GPUs), which are exploited as massively parallel computational engines. This review paper focuses on recent developments made in GPU-based medical image reconstruction, from a CT, PET, SPECT, MRI and US perspective. Strategies and approaches to get the most out of GPUs in image reconstruction are presented as well as innovative applications arising from an increased computing capacity. The future of GPU-based image reconstruction is also envisioned, based on current trends in high-performance computing. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. Electromagnetic containerless processing requirements and recommended facility concept and capabilities for space lab

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Bloom, H. L.; Napaluch, L. J.; Stockhoff, E. H.; Wouch, G.

    1974-01-01

    Containerless melting, reaction, and solidification experiments and processes which potentially can lead to new understanding of material science and production of new or improved materials in the weightless space environment are reviewed in terms of planning for Spacelab. Most of the experiments and processes discussed are amenable to the employment of electromagnetic position control and electromagnetic induction or electron beam heating and melting. The spectrum of relevant material properties that determine the requirements for a space laboratory electromagnetic containerless processing facility is reviewed. Appropriate distributions and associated coil structures are analyzed and compared on the basis of efficiency in providing the functions of position sensing, control, and induction heating. Several coil systems are found capable of providing these functions. Exchangeable modular coils in appropriate sizes are recommended to achieve the maximum power efficiencies, for a wide range of specimen sizes and resistivities, in order to conserve total facility power.

  16. Updates in Head and Neck Reconstruction.

    PubMed

    Largo, Rene D; Garvey, Patrick B

    2018-02-01

    After reading this article, the participant should be able to: 1. Have a basic understanding of virtual planning, rapid prototype modeling, three-dimensional printing, and computer-assisted design and manufacture. 2. Understand the principles of combining virtual planning and vascular mapping. 3. Understand principles of flap choice and design in preoperative planning of free osteocutaneous flaps in mandible and midface reconstruction. 4. Discuss advantages and disadvantages of computer-assisted design and manufacture in reconstruction of advanced oncologic mandible and midface defects. Virtual planning and rapid prototype modeling are increasingly used in head and neck reconstruction with the aim of achieving superior surgical outcomes in functionally and aesthetically critical areas of the head and neck compared with conventional reconstruction. The reconstructive surgeon must be able to understand this rapidly-advancing technology, along with its advantages and disadvantages. There is no limit to the degree to which patient-specific data may be integrated into the virtual planning process. For example, vascular mapping can be incorporated into virtual planning of mandible or midface reconstruction. Representative mandible and midface cases are presented to illustrate the process of virtual planning. Although virtual planning has become helpful in head and neck reconstruction, its routine use may be limited by logistic challenges, increased acquisition costs, and limited flexibility for intraoperative modifications. Nevertheless, the authors believe that the superior functional and aesthetic results realized with virtual planning outweigh the limitations.

  17. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-01-01

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  18. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-12-31

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  19. Facility cost analysis in outpatient plastic surgery: implications for the academic health center.

    PubMed

    Pacella, Salvatore J; Comstock, Matthew C; Kuzon, William M

    2008-04-01

    The authors examined the economic patterns of outpatient aesthetic and reconstructive plastic surgical procedures performed within an academic health center. For fiscal years 2003 and 2004, the University of Michigan Health System's accounting database was queried to identify all outpatient plastic surgery cases (aesthetic and reconstructive) from four surgical facilities. Total facility charges, cost, revenue, and margin were calculated for each case. Contribution margin (total revenue minus variable direct cost) was compared with total case time to determine average contribution margin per operating suite case minute for subsets of aesthetic and reconstructive procedures. A total of 3603 cases (3457 reconstructive and 146 aesthetic) were identified. Payer mix included Blue Cross (36.7 percent), health maintenance organization (28.7 percent), other commercial payers (17.4 percent), Medicare/Medicaid (13.5 percent), and self-pay (3.7 percent). The most profitable cases were reconstructive laser procedures ($66.20; n = 361), scar revision ($36.01; n = 25), and facial trauma ($32.17; n = 64). The least profitable were hand arthroplasty ($13.93; n = 35), arthroscopy ($17.25; n = 15), and breast reduction ($17.46; n = 210). Aesthetic procedures (n = 144) yielded a significantly higher contribution margin per case minute ($24.21) compared with reconstructive procedures ($22.28; n = 3093) (p = 0.01). Plastic surgical cases performed at dedicated ambulatory surgery centers ($28.60; n = 1477) yielded significantly higher contribution margin per case minute compared with those performed at hospital-based facilities ($25.58; n = 2123) (p < 0.01). Use of standardized accounting (contribution margin per case minute) can be a strategically effective method for determining the most profitable and appropriate case mix. Within academic health centers, aesthetic surgery can be a profitable enterprise; dedicated ambulatory surgery centers yield higher profitability.
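
    The study's core metric, contribution margin per operating-suite case minute, is a simple calculation: total revenue minus variable direct cost, divided by total case time. The sketch below illustrates it; the case figures are hypothetical, not values from the study.

```python
# Contribution margin per operating-suite case minute:
# (revenue - variable direct cost) / total case minutes.

def margin_per_case_minute(revenue, variable_direct_cost, case_minutes):
    """Return the contribution margin earned per minute of operating-suite time."""
    if case_minutes <= 0:
        raise ValueError("case_minutes must be positive")
    return (revenue - variable_direct_cost) / case_minutes

# A hypothetical 90-minute case billing $4,500 with $2,300 in variable cost:
print(round(margin_per_case_minute(4500, 2300, 90), 2))  # 24.44
```

    Normalizing by case time is what allows short, high-margin procedures to be compared fairly against long cases when deciding on case mix.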

  20. Unity connecting module viewed from above in the Space Station Processing Facility

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The Unity connecting module is viewed from above while it awaits processing in the Space Station Processing Facility (SSPF). On the side can be seen the connecting hatch. The Unity, scheduled to be launched on STS-88 in December 1998, will be mated to the Russian-built Zarya control module which will already be in orbit. STS-88 will be the first Space Shuttle launch for the International Space Station.

  1. Design of a lunar propellant processing facility. NASA/USRA advanced program

    NASA Technical Reports Server (NTRS)

    Batra, Rajesh; Bell, Jason; Campbell, J. Matt; Cash, Tom; Collins, John; Dailey, Brian; France, Angelique; Gareau, Will; Gleckler, Mark; Hamilton, Charles

    1993-01-01

    Mankind's exploration of space will eventually lead to the establishment of a permanent human presence on the Moon. Essential to the economic viability of such an undertaking will be prudent utilization of indigenous lunar resources. The design of a lunar propellant processing system is presented. The system elements include facilities for ore processing, ice transportation, water splitting, propellant storage, personnel and materials transportation, human habitation, power generation, and communications. The design scenario postulates that ice is present in the lunar polar regions, and that an initial lunar outpost was established. Mining, ore processing, and water transportation operations are located in the polar regions. Water processing and propellant storage facilities are positioned near the equator. A general description of design operations is outlined below. Regolith containing the ice is mined from permanently-shaded polar craters. Water is separated from the ore using a microwave processing technique, and refrozen into projectiles for launch to the equatorial site via railgun. A mass-catching device retrieves the ice. This ice is processed using fractional distillation to remove impurities, and the purified liquid water is fed to an electrolytic cell that splits the water into vaporous hydrogen and oxygen. The hydrogen and oxygen are condensed and stored separately in a tank farm. Electric power for all operations is supplied by SP-100 nuclear reactors. Transportation of materials and personnel is accomplished primarily using chemical rockets. Modular living habitats are used which provide flexibility for the placement and number of personnel. A communications system consisting of lunar surface terminals, a lunar relay satellite, and terrestrial surface stations provides capabilities for continuous Moon-Moon and Moon-Earth transmissions of voice, picture, and data.

  2. Design and Evaluation of Wood Processing Facilities Using Object-Oriented Simulation

    Treesearch

    D. Earl Kline; Philip A. Araman

    1992-01-01

    Managers of hardwood processing facilities need timely information on which to base important decisions such as when to add costly equipment or how to improve profitability subject to time-varying demands. The overall purpose of this paper is to introduce a tool that can effectively provide such timely information. A simulation/animation modeling procedure is described...

  3. 7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... compliance agreement shall specify the requirements necessary to prevent spread of plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of this chapter. The...

  4. 7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... compliance agreement shall specify the requirements necessary to prevent spread of plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of this chapter. The...

  5. Shortening and Angulation for Soft-Tissue Reconstruction of Extremity Wounds in a Combat Support Hospital

    DTIC Science & Technology

    2009-08-01

    [OCR-garbled scan; recoverable fragments only.] MILITARY MEDICINE, 174, 8:838, 2009. Shortening and Angulation for Soft-Tissue Reconstruction of Extremity Wounds in a Combat Support Hospital. ...team in theater. Thereafter, they can be rapidly evacuated to treatment facilities in their respective countries for definitive reconstruction of... The manuscript was received for review in November 2008; the revised manuscript was accepted for publication in May 2009.

  6. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
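
    The fault trees mentioned above combine basic trigger events through AND/OR gates up to a top event (an operational problem). The minimal sketch below shows the mechanics; the event names are illustrative and not taken from the paper's 24 analyzed problems.

```python
# Minimal fault-tree evaluation: basic events feed AND/OR gates,
# and the top event represents an operational problem.

def gate_or(*inputs):   # top/intermediate event occurs if ANY input event occurs
    return any(inputs)

def gate_and(*inputs):  # event occurs only if ALL contributing events occur
    return all(inputs)

# Basic (leaf) events observed at a hypothetical landfill:
heavy_rain = True
drainage_blocked = True
cover_soil_shortage = False

# Intermediate event: surface ponding requires both rain and blocked drainage.
surface_ponding = gate_and(heavy_rain, drainage_blocked)

# Top event: a leachate overflow problem if either ponding or a soil shortage occurs.
leachate_overflow = gate_or(surface_ponding, cover_soil_shortage)
print(leachate_overflow)  # True
```

    Encoding expert knowledge this way lets a single basic event propagate to every failure it can trigger, which is exactly the cross-linking the knowledge engineer must capture.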

  7. Time-of-flight PET image reconstruction using origin ensembles.

    PubMed

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-07

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  8. Time-of-flight PET image reconstruction using origin ensembles

    NASA Astrophysics Data System (ADS)

    Wülker, Christian; Sitek, Arkadiusz; Prevrhal, Sven

    2015-03-01

    The origin ensemble (OE) algorithm is a novel statistical method for minimum-mean-square-error (MMSE) reconstruction of emission tomography data. This method allows one to perform reconstruction entirely in the image domain, i.e. without the use of forward and backprojection operations. We have investigated the OE algorithm in the context of list-mode (LM) time-of-flight (TOF) PET reconstruction. In this paper, we provide a general introduction to MMSE reconstruction, and a statistically rigorous derivation of the OE algorithm. We show how to efficiently incorporate TOF information into the reconstruction process, and how to correct for random coincidences and scattered events. To examine the feasibility of LM-TOF MMSE reconstruction with the OE algorithm, we applied MMSE-OE and standard maximum-likelihood expectation-maximization (ML-EM) reconstruction to LM-TOF phantom data with a count number typically registered in clinical PET examinations. We analyzed the convergence behavior of the OE algorithm, and compared reconstruction time and image quality to that of the EM algorithm. In summary, during the reconstruction process, MMSE-OE contrast recovery (CRV) remained approximately the same, while background variability (BV) gradually decreased with an increasing number of OE iterations. The final MMSE-OE images exhibited lower BV and a slightly lower CRV than the corresponding ML-EM images. The reconstruction time of the OE algorithm was approximately 1.3 times longer. At the same time, the OE algorithm can inherently provide a comprehensive statistical characterization of the acquired data. This characterization can be utilized for further data processing, e.g. in kinetic analysis and image registration, making the OE algorithm a promising approach in a variety of applications.

  9. The Relationship Between Patients' Personality Traits and Breast Reconstruction Process.

    PubMed

    Faragó-Magrini, Sandra; Aubá, Cristina; Camargo, Cristina; Laspra, Carmen; Hontanilla, Bernardo

    2018-06-01

    Breast reconstruction after mastectomy is a part of breast cancer treatment. There is a lack of data regarding the impact of reconstruction over psychological traits and quality of life. The aim of this study is to evaluate personality changes in patients who underwent reconstructive surgery. Thirty-seven women underwent breast reconstruction. These women took the Crown-Crisp Experiential Index before and after the different procedures. The questionnaire analyzes: (a) the satisfaction level with personal relationships before and after surgery, and the level of satisfaction with surgical results and (b) personality index. Comparisons of preoperative and postoperative personality traits were made by using the Crown-Crisp test and analyzed by Chi-square test. Correlations between preoperative concerns and CCEI traits and correlations between physical aspects and Crown-Crisp, both preoperatively and postoperatively, were performed using the Spearman test. We found statistically significant differences in the following traits: anxiety anticipating possible technique failures (p = 0.01); cancer recurrence (p = 0.04); dissatisfaction with results (p = 0.02); phobic anxiety for possible technique failure (p = 0.03); obsessionality with possible technique failure (p = 0.01); preoccupations around cancer recurrence (p = 0.01) and dissatisfaction with results (p = 0.03); somatic of technique failure (p = 0.05); and finally, depression and hysteria traits in response to surgical procedures except anesthesia. This prospective study suggests that personality traits define perceptions of body image, which has an influence over quality of life and satisfaction with results. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .

  10. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report proposes an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, suitable processing of them yields the information needed to create cryptographic keys. The processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the original biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained during reconstruction can be used not only for diagnostics but also to protect transmitted data in telemedicine complexes.
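
    One way to realize the key-pairing idea above is for both ends to derive a key by hashing the reconstructed model parameters. This is a sketch under stated assumptions: the quantization step, key length, and parameter values are all illustrative, and the paper's actual key-derivation scheme may differ.

```python
# Sketch: derive matching symmetric keys at sensor and receiver by hashing
# quantized model parameters reconstructed from the shared biosignal.
import hashlib
import struct

def key_from_model(params, quantum=1e-3):
    """Derive a 256-bit key from reconstructed model parameters.

    Quantizing the parameters makes small numerical differences between
    the sensor's and receiver's reconstructions map to the same key.
    """
    quantized = [round(p / quantum) for p in params]
    packed = struct.pack(f"{len(quantized)}q", *quantized)  # int64 per parameter
    return hashlib.sha256(packed).digest()

# Sensor and receiver reconstruct near-identical parameters independently:
sensor_params   = [0.84712, -1.20391, 2.55304]
receiver_params = [0.84709, -1.20393, 2.55306]
print(key_from_model(sensor_params) == key_from_model(receiver_params))  # True
```

    The quantization step trades robustness for key entropy: a coarser step tolerates larger reconstruction error but shrinks the effective key space.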

  11. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media.

    PubMed

    Zhou, L; Qu, Z G; Ding, T; Miao, J Y

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early time for adsorption equilibrium because of the controlling factor of external resistance. External and internal resistances are dominant at small and large particle sizes, respectively. Particle size, under which the total resistance is minimum, ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.

  12. Lattice Boltzmann simulation of the gas-solid adsorption process in reconstructed random porous media

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Qu, Z. G.; Ding, T.; Miao, J. Y.

    2016-04-01

    The gas-solid adsorption process in reconstructed random porous media is numerically studied with the lattice Boltzmann (LB) method at the pore scale, with consideration of interparticle, interfacial, and intraparticle mass transfer performances. Adsorbent structures are reconstructed in two dimensions by employing the quartet structure generation set approach. To implement boundary conditions accurately, all the porous interfacial nodes are recognized and classified into 14 types using a proposed universal program called the boundary recognition and classification program. The multiple-relaxation-time LB model and single-relaxation-time LB model are adopted to simulate flow and mass transport, respectively. The interparticle, interfacial, and intraparticle mass transfer capacities are evaluated with the permeability factor and interparticle transfer coefficient, Langmuir adsorption kinetics, and the solid diffusion model, respectively. Adsorption processes are performed in two groups of adsorbent media with different porosities and particle sizes. External and internal mass transfer resistances govern the adsorption system. A large porosity leads to an early adsorption equilibrium time because external resistance is the controlling factor. External and internal resistances are dominant at small and large particle sizes, respectively. The particle size at which the total resistance is minimized ranges from 3 to 7 μm with the preset parameters. Pore-scale simulation clearly explains the effect of both external and internal mass transfer resistances. The present paper provides both theoretical and practical guidance for the design and optimization of adsorption systems.

  13. See-through Detection and 3D Reconstruction Using Terahertz Leaky-Wave Radar Based on Sparse Signal Processing

    NASA Astrophysics Data System (ADS)

    Murata, Koji; Murano, Kosuke; Watanabe, Issei; Kasamatsu, Akifumi; Tanaka, Toshiyuki; Monnai, Yasuaki

    2018-02-01

    We experimentally demonstrate see-through detection and 3D reconstruction using terahertz leaky-wave radar based on sparse signal processing. The application of terahertz waves to radar has received increasing attention in recent years for its potential for high-resolution, see-through detection. Among others, implementation using a leaky-wave antenna is promising for compact system integration with beam-steering capability based on frequency sweep. However, the use of a leaky-wave antenna poses a challenge for signal processing. Since a leaky-wave antenna combines the signals captured by each part of the aperture into a single output, conventional array signal processing, which assumes access to each antenna element, is not applicable. In this paper, we apply an iterative recovery algorithm, CoSaMP, to signals acquired with terahertz leaky-wave radar for clutter mitigation and aperture synthesis. We first demonstrate see-through detection of target location even when the radar is covered with an opaque screen and the radar signal is therefore disturbed by clutter. Furthermore, leveraging the robustness of the algorithm against noise, we also demonstrate 3D reconstruction of distributed targets by synthesizing signals collected from different orientations. The proposed approach will contribute to the smart implementation of terahertz leaky-wave radar.
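CoSaMP itself is a standard greedy sparse-recovery algorithm. A compact NumPy sketch of it (generic compressed sensing, not the authors' radar-specific processing chain; problem sizes are arbitrary):

```python
import numpy as np

def cosamp(Phi, y, k, iters=20, tol=1e-9):
    """CoSaMP sketch: recover a k-sparse x from y = Phi @ x by
    repeatedly (1) picking the 2k columns most correlated with the
    residual, (2) solving least squares on the merged support, and
    (3) pruning back to the k largest coefficients."""
    n = Phi.shape[1]
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(iters):
        proxy = Phi.T @ residual
        omega = np.argsort(np.abs(proxy))[-2 * k:]
        support = np.union1d(omega, np.flatnonzero(x)).astype(int)
        b = np.zeros(n)
        b[support] = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]
        x = np.zeros(n)
        top = np.argsort(np.abs(b))[-k:]
        x[top] = b[top]
        residual = y - Phi @ x
        if np.linalg.norm(residual) < tol:
            break
    return x

rng = np.random.default_rng(1)
n_atoms, n_meas, sparsity = 64, 48, 3
Phi = rng.standard_normal((n_meas, n_atoms)) / np.sqrt(n_meas)
x_true = np.zeros(n_atoms)
x_true[[5, 20, 41]] = [1.0, -2.0, 1.5]
x_hat = cosamp(Phi, Phi @ x_true, sparsity)
```

With a well-conditioned random measurement matrix and exact sparsity, the merged-support least-squares step recovers the true coefficients in a few iterations.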

  14. Jini service to reconstruct tomographic data

    NASA Astrophysics Data System (ADS)

    Knoll, Peter; Mirzaei, S.; Koriska, K.; Koehn, H.

    2002-06-01

    A number of imaging systems rely on the reconstruction of a 3-dimensional model from its projections through the process of computed tomography (CT). In medical imaging, for example, magnetic resonance imaging (MRI), positron emission tomography (PET), and single photon emission computed tomography (SPECT) acquire two-dimensional projections of a three-dimensional object. In order to calculate the 3-dimensional representation of the object, i.e. its voxel distribution, several reconstruction algorithms have been developed. Currently, mainly two reconstruction methods are in use: filtered back projection (FBP) and iterative methods. Although the quality of iteratively reconstructed SPECT slices is better than that of FBP slices, such iterative algorithms are rarely used for clinical routine studies because of their low availability and increased reconstruction time. We used Jini and a self-developed iterative reconstruction algorithm to design and implement a Jini reconstruction service. With this service, the physician selects the patient study from a database and a Jini client automatically discovers the registered Jini reconstruction services in the department's intranet. After downloading the proxy object of this Jini service, the SPECT acquisition data are reconstructed. The resulting transaxial slices are visualized using a Jini slice viewer, which can be used for various imaging modalities.
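The FBP-versus-iterative trade-off described above is easy to illustrate: iterative schemes repeatedly enforce consistency with each measured projection. A minimal Kaczmarz/ART-style sketch (a generic iterative method on a toy system, not the authors' self-developed algorithm):

```python
import numpy as np

def kaczmarz(A, y, sweeps=200):
    """ART/Kaczmarz sketch: cycle through the projection equations
    a_i . x = y_i and project the current estimate onto each
    hyperplane in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a_i, y_i in zip(A, y):
            x = x + (y_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Tiny well-conditioned "system matrix" standing in for the
# projection geometry (illustrative only).
rng = np.random.default_rng(0)
A = np.eye(6) + 0.1 * rng.standard_normal((6, 6))
x_true = rng.standard_normal(6)
x_hat = kaczmarz(A, A @ x_true)
```

Each sweep touches every measured equation once, which is why such methods cost far more per image than a single filtered back projection pass.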

  15. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog-to-digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  16. An ecological perspective of Listeria monocytogenes biofilms in food processing facilities.

    PubMed

    Valderrama, Wladir B; Cutter, Catherine N

    2013-01-01

    Listeria monocytogenes can enter the food chain at virtually any point. However, food processing environments seem to be of particular importance. From an ecological point of view, food processing facilities are microbial habitats that are constantly disturbed by cleaning and sanitizing procedures. Although L. monocytogenes is considered ubiquitous in nature, it is important to recognize that not all L. monocytogenes strains appear to be equally distributed; the distribution of the organism seems to be related to certain habitats. Currently, no direct evidence exists that L. monocytogenes-associated biofilms have played a role in food contamination or foodborne outbreaks, likely because biofilm isolation and identification are not part of an outbreak investigation, or the definition of biofilm is unclear. Because L. monocytogenes is known to colonize surfaces, we suggest that contamination patterns may be studied in the context of how biofilm formation is influenced by the environment within food processing facilities. In this review, direct and indirect epidemiological and phenotypic evidence of lineage-related biofilm formation capacity to specific ecological niches will be discussed. A critical view on the development of the biofilm concept, focused on the practical implications, strengths, and weaknesses of the current definitions also is discussed. The idea that biofilm formation may be an alternative surrogate for microbial fitness is proposed. Furthermore, current research on the influence of environmental factors on biofilm formation is discussed.

  17. Accelerated Compressed Sensing Based CT Image Reconstruction.

    PubMed

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.

  18. Accelerated Compressed Sensing Based CT Image Reconstruction

    PubMed Central

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200

  19. A New Concept: Use of Negotiations in the Hazardous Waste Facility Permitting Process in New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.J.; Rose, W.M.; Domenici, P.V.

    This paper describes a unique negotiation process leading to authorization of the U.S. Department of Energy (DOE) to manage and dispose of remote-handled (RH) transuranic (TRU) mixed wastes at the Waste Isolation Pilot Plant (WIPP). The negotiation process involved multiple entities and individuals brought together under the authority of the New Mexico Environment Department (NMED) to discuss and resolve technical and facility operational issues flowing from an NMED-issued hazardous waste facility Draft Permit. The novel negotiation process resulted in numerous substantive changes to the Draft Permit, which were ultimately memorialized in a 'Draft Permit as Changed'. This paper discusses various aspects of the negotiation process, including events leading to the negotiations, the regulatory basis for the negotiations, the negotiation participants, and the benefits of the process. (authors)

  20. 7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...

  1. 7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...

  2. 7 CFR 319.40-8 - Processing at facilities operating under compliance agreements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... plant pests from the facility, requirements to ensure the processing method effectively destroys plant pests, and the requirements for the application of chemical materials in accordance with part 305 of... and Budget under control number 0579-0049) [60 FR 27674, May 25, 1995, as amended at 69 FR 52418, Aug...

  3. Propeller flap reconstruction of abdominal defects: review of the literature and case report.

    PubMed

    Scaglioni, Mario F; Giuseppe, Alberto Di; Chang, Edward I

    2015-01-01

    The abdominal wall is perfused anteriorly by the superior and deep epigastric vessels with a smaller contribution from the superficial system. The lateral abdominal wall is perfused predominantly from perforators arising from the intercostal vessels. Reconstruction of soft tissue defects involving the abdomen presents a difficult challenge for reconstructive surgeons. Pedicle perforator propeller flaps can be used to reconstruct defects of the abdomen, and here we present a thorough review of the literature as well as a case illustrating the perforasome propeller flap concept. A patient underwent resection for dermatofibrosarcoma protuberans resulting in a large defect of the epigastric soft tissue. A propeller flap was designed based on a perforator arising from the superior deep epigastric vessels and was rotated 90° into the defect allowing primary closure of the donor site. The patient healed uneventfully and was without recurrent disease 37 months following reconstruction. Perforator propeller flaps can be used successfully in reconstruction of abdominal defects and should be incorporated into the armamentarium of reconstructive microsurgeons already facile with perforator dissections. © 2014 Wiley Periodicals, Inc.

  4. Reconstruction for time-domain in vivo EPR 3D multigradient oximetric imaging--a parallel processing perspective.

    PubMed

    Dharmaraj, Christopher D; Thadikonda, Kishan; Fletcher, Anthony R; Doan, Phuc N; Devasahayam, Nallathamby; Matsumoto, Shingo; Johnson, Calvin A; Cook, John A; Mitchell, James B; Subramanian, Sankaran; Krishna, Murali C

    2009-01-01

    Three-dimensional Oximetric Electron Paramagnetic Resonance Imaging using the Single Point Imaging modality generates unpaired spin density and oxygen images that can readily distinguish between normal and tumor tissues in small animals. It is also possible with fast imaging to track the changes in tissue oxygenation in response to the oxygen content in the breathing air. However, this involves dealing with gigabytes of data for each 3D oximetric imaging experiment involving digital band pass filtering and background noise subtraction, followed by 3D Fourier reconstruction. This process is rather slow in a conventional uniprocessor system. This paper presents a parallelization framework using OpenMP runtime support and parallel MATLAB to execute such computationally intensive programs. The Intel compiler is used to develop a parallel C++ code based on OpenMP. The code is executed on four Dual-Core AMD Opteron shared memory processors, to reduce the computational burden of the filtration task significantly. The results show that the parallel code for filtration has achieved a speed up factor of 46.66 as against the equivalent serial MATLAB code. In addition, a parallel MATLAB code has been developed to perform 3D Fourier reconstruction. Speedup factors of 4.57 and 4.25 have been achieved during the reconstruction process and oximetry computation, for a data set with 23 x 23 x 23 gradient steps. The execution time has been computed for both the serial and parallel implementations using different dimensions of the data and presented for comparison. The reported system has been designed to be easily accessible even from low-cost personal computers through local internet (NIHnet). The experimental results demonstrate that the parallel computing provides a source of high computational power to obtain biophysical parameters from 3D EPR oximetric imaging, almost in real-time.
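The filtering stage parallelized with OpenMP in the paper is embarrassingly parallel: each record (or projection) can be band-pass filtered independently. The same pattern, sketched here in Python with a worker pool (the filter band edges and data sizes are arbitrary, not the paper's):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def bandpass(signal, lo=2, hi=40):
    """Digital band-pass filter for one record: keep FFT bins in
    [lo, hi) and zero the rest."""
    spectrum = np.fft.rfft(signal)
    mask = np.zeros_like(spectrum)
    mask[lo:hi] = 1.0
    return np.fft.irfft(spectrum * mask, n=len(signal))

def filter_all(records, workers=4):
    """Each record is independent, so the filtration task maps
    directly onto a worker pool; pool.map preserves input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.array(list(pool.map(bandpass, records)))

rng = np.random.default_rng(0)
records = rng.standard_normal((16, 128))
filtered = filter_all(records)
```

Because the per-record work is identical and independent, speedup scales with the number of workers up to memory-bandwidth limits, which is the effect the paper measures with OpenMP.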

  5. Attractor reconstruction for non-linear systems: a methodological note

    USGS Publications Warehouse

    Nichols, J.M.; Nichols, J.D.

    2001-01-01

    Attractor reconstruction is an important step in the process of making predictions for non-linear time-series and in the computation of certain invariant quantities used to characterize the dynamics of such series. The utility of computed predictions and invariant quantities is dependent on the accuracy of attractor reconstruction, which in turn is determined by the methods used in the reconstruction process. This paper suggests methods by which the delay and embedding dimension may be selected for a typical delay coordinate reconstruction. A comparison is drawn between the use of the autocorrelation function and mutual information in quantifying the delay. In addition, a false nearest neighbor (FNN) approach is used in minimizing the number of delay vectors needed. Results highlight the need for an accurate reconstruction in the computation of the Lyapunov spectrum and in prediction algorithms.
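The delay-coordinate reconstruction and mutual-information delay selection discussed above can be sketched compactly (the bin count and maximum lag are arbitrary choices, not the authors'):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def mi_delay(x, max_lag=50, bins=16):
    """Choose tau as the first local minimum of the time-lagged
    mutual information (falling back to max_lag if none is found)."""
    def mutual_info(a, b):
        pxy, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
    vals = [mutual_info(x[:-lag], x[lag:]) for lag in range(1, max_lag + 1)]
    for i in range(1, len(vals) - 1):
        if vals[i] < vals[i - 1] and vals[i] < vals[i + 1]:
            return i + 1
    return max_lag

t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)
tau = mi_delay(series)
attractor = delay_embed(series, dim=2, tau=tau)
```

A false-nearest-neighbor test would then be run on embeddings of increasing `dim` to pick the minimal embedding dimension, as the note recommends.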

  6. Event Reconstruction for Many-core Architectures using Java

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Norman A.; /SLAC

    Although Moore's Law remains technically valid, the performance enhancements in computing which traditionally resulted from increased CPU speeds ended years ago. Chip manufacturers have chosen to increase the number of core CPUs per chip instead of increasing clock speed. Unfortunately, these extra CPUs do not automatically result in improvements in simulation or reconstruction times. To take advantage of this extra computing power requires changing how software is written. Event reconstruction is globally serial, in the sense that raw data has to be unpacked first, channels have to be clustered to produce hits before those hits are identified as belonging to a track or shower, tracks have to be found and fit before they are vertexed, etc. However, many of the individual procedures along the reconstruction chain are intrinsically independent and are perfect candidates for optimization using multi-core architecture. Threading is perhaps the simplest approach to parallelizing a program and Java includes a powerful threading facility built into the language. We have developed a fast and flexible reconstruction package (org.lcsim) written in Java that has been used for numerous physics and detector optimization studies. In this paper we present the results of our studies on optimizing the performance of this toolkit using multiple threads on many-core architectures.

  7. Onboard experiment data support facility. Task 2 report: Definition of onboard processing requirements

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The onboard experiment data support facility (OEDSF) will provide data processing support to various experiment payloads on board the space shuttle. The OEDSF study will define the conceptual design and generate specifications for an OEDSF which will meet the following objectives: (1) provide a cost-effective approach to end-to-end processing requirements, (2) service multiple disciplines, (3) satisfy user needs, (4) reduce the amount and improve the quality of data collected, stored, and processed, and (5) embody growth capacity.

  8. Plastic Surgery Challenges in War Wounded I: Flap-Based Extremity Reconstruction

    PubMed Central

    Sabino, Jennifer M.; Slater, Julia; Valerio, Ian L.

    2016-01-01

    Scope and Significance: Reconstruction of traumatic injuries requiring tissue transfer begins with aggressive resuscitation and stabilization. Systematic advances in acute casualty care at the point of injury have improved survival and allowed for increasingly complex treatment before definitive reconstruction at tertiary medical facilities outside the combat zone. As a result, the complexity of the limb salvage algorithm has increased over 14 years of combat activities in Iraq and Afghanistan. Problem: Severe poly-extremity trauma in combat casualties has led to a large number of extremity salvage cases. Advanced reconstructive techniques coupled with regenerative medicine applications have played a critical role in the restoration, recovery, and rehabilitation of functional limb salvage. Translational Relevance: The past 14 years of war trauma have increased our understanding of tissue transfer for extremity reconstruction in the treatment of combat casualties. Injury patterns, flap choice, and reconstruction timing are critical variables to consider for optimal outcomes. Clinical Relevance: Subacute reconstruction with specifically chosen flap tissue and donor site location based on individual injuries result in successful tissue transfer, even in critically injured patients. These considerations can be combined with regenerative therapies to optimize massive wound coverage and limb salvage form and function in previously active patients. Summary: Traditional soft tissue reconstruction is integral in the treatment of war extremity trauma. Pedicle and free flaps are a critically important part of the reconstructive ladder for salvaging extreme extremity injuries that are seen as a result of the current practice of war. PMID:27679751

  9. Photoacoustic image reconstruction via deep learning

    NASA Astrophysics Data System (ADS)

    Antholzer, Stephan; Haltmeier, Markus; Nuster, Robert; Schwab, Johannes

    2018-02-01

    Applying standard algorithms to sparse data problems in photoacoustic tomography (PAT) yields low-quality images containing severe under-sampling artifacts. To some extent, these artifacts can be reduced by iterative image reconstruction algorithms, which allow prior knowledge such as smoothness, total variation (TV), or sparsity constraints to be included. These algorithms tend to be time-consuming, as the forward and adjoint problems have to be solved repeatedly. Iterative algorithms have additional drawbacks: for example, the reconstruction quality strongly depends on a priori model assumptions about the objects to be recovered, which are often not strictly satisfied in practical applications. To overcome these issues, in this paper, we develop direct and efficient reconstruction algorithms based on deep learning. As opposed to iterative algorithms, we apply a convolutional neural network whose parameters are trained before the reconstruction process on a set of training data. For actual image reconstruction, a single evaluation of the trained network yields the desired result. Our numerical results (using two different network architectures) demonstrate that the proposed deep learning approach reconstructs images with a quality comparable to state-of-the-art iterative reconstruction methods.

  10. Recent National Transonic Facility Test Process Improvements (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Bobbitt, C. W., Jr.; Adcock, J. B.

    2001-01-01

    This paper describes the results of two recent process improvements; drag feed-forward Mach number control and simultaneous force/moment and pressure testing, at the National Transonic Facility. These improvements have reduced the duration and cost of testing. The drag feed-forward Mach number control reduces the Mach number settling time by using measured model drag in the Mach number control algorithm. Simultaneous force/moment and pressure testing allows simultaneous collection of force/moment and pressure data without sacrificing data quality thereby reducing the overall testing time. Both improvements can be implemented at any wind tunnel. Additionally the NTF is working to develop and implement continuous pitch as a testing option as an additional method to reduce costs and maintain data quality.

  11. Recent National Transonic Facility Test Process Improvements (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Bobbitt, C. W., Jr.; Adcock, J. B.

    2001-01-01

    This paper describes the results of two recent process improvements; drag feed-forward Mach number control and simultaneous force/moment and pressure testing, at the National Transonic Facility. These improvements have reduced the duration and cost of testing. The drag feed-forward Mach number control reduces the Mach number settling time by using measured model drag in the Mach number control algorithm. Simultaneous force/moment and pressure testing allows simultaneous collection of force/moment and pressure data without sacrificing data quality thereby reducing the overall testing time. Both improvements can be implemented at any wind tunnel. Additionally the NTF is working to develop and implement continuous pitch as a testing option as an additional method to reduce costs and maintain data quality.

  12. Team processes in airway facilities operations control centers.

    DOT National Transportation Integrated Search

    2000-07-01

    In October 2000, the Airway Facilities organization plans to transition the National Airspace System (NAS) monitoring responsibilities to three regional Operations Control Centers (OCCs). Teams in these facilities will be different from those that cu...

  13. Spatio-temporal distribution of stored-product insects around food processing and storage facilities

    USDA-ARS?s Scientific Manuscript database

    Grain storage and processing facilities consist of a landscape of indoor and outdoor habitats that can potentially support stored-product insect pests, and understanding patterns of species diversity and spatial distribution in the landscape surrounding structures can provide insight into how the ou...

  14. Project management plan, Waste Receiving and Processing Facility, Module 1, Project W-026

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starkey, J.G.

    1993-05-01

    The Hanford Waste Receiving and Processing Facility Module 1 Project (WRAP 1) has been established to support the retrieval and final disposal of approximately 400K grams of plutonium and quantities of hazardous components currently stored in drums at the Hanford Site.

  15. Orbiter processing facility: Access platforms Kennedy Space Center, Florida, from challenge to achievement

    NASA Technical Reports Server (NTRS)

    Haratunian, M.

    1985-01-01

    A system of access platforms and equipment within the space shuttle orbiter processing facility at Kennedy Space Center is described. The design challenges of the platforms, including clearance envelopes, load criteria, and movement, are discussed. Various applications of moveable platforms are considered.

  16. Breast Reconstruction with Implants

    MedlinePlus

    ... implants is a complex procedure performed by a plastic surgeon. The breast reconstruction process can start at ... doctor may recommend that you meet with a plastic surgeon. Consult a plastic surgeon who's board certified ...

  17. Current strategies with 1-stage prosthetic breast reconstruction

    PubMed Central

    2015-01-01

    Background 1-stage prosthetic breast reconstruction is gaining traction as a preferred method of breast reconstruction in select patients who undergo mastectomy for cancer or prevention. Methods Critical elements to the procedure including patient selection, technique, surgical judgment, and postoperative care were reviewed. Results Outcomes series reveal that in properly selected patients, direct-to-implant (DTI) reconstruction has similar low rates of complications and high rates of patient satisfaction compared to traditional 2-stage reconstruction. Conclusions 1-stage prosthetic breast reconstruction may be the procedure of choice in select patients undergoing mastectomy. Advantages include the potential for the entire reconstructive process to be complete in one surgery, the quick return to normal activities, and lack of donor site morbidity. PMID:26005643

  18. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  19. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  20. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  1. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  2. 42 CFR 82.4 - How Will DOL Use the Results of the NIOSH Dose Reconstructions?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... reconstruction results together with information on cancer diagnosis and other personal information provided to... probability that the cancer of the covered employee was caused by radiation exposure at a covered facility of...

  3. Digital reconstruction of Young's fringes using Fresnel transformation

    NASA Astrophysics Data System (ADS)

    Kulenovic, Rudi; Song, Yaozu; Renninger, P.; Groll, Manfred

    1997-11-01

    This paper deals with the digital numerical reconstruction of Young's fringes from laser speckle photography by means of the Fresnel transformation. The physical model of the optical reconstruction of a specklegram is a near-field Fresnel diffraction phenomenon, which can be mathematically described by the Fresnel transformation. Therefore, the interference phenomena can be calculated directly on a microcomputer. If, additionally, a CCD camera is used for specklegram recording, the measurement procedure and evaluation process can be carried out entirely digitally. Compared with conventional laser speckle photography, no holographic plates, no wet development process, and no optical specklegram reconstruction are needed. These advantages open up a wide range of scientific and engineering applications. The basic principle of the numerical reconstruction is described, the effects of experimental parameters on Young's fringes are analyzed, and representative results are presented.
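The numerical reconstruction step can be sketched as an FFT evaluation of the Fresnel (paraxial) transfer function; the aperture, wavelength, and geometry below are illustrative values, not those of the paper:

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z using the
    paraxial (Fresnel) transfer function evaluated with FFTs."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# A double slit stands in for a displaced speckle pair; the propagated
# intensity shows Young's-fringe-like modulation.
n, dx, lam, z = 256, 10e-6, 633e-9, 0.05
aperture = np.zeros((n, n), dtype=complex)
aperture[:, 100:104] = 1.0
aperture[:, 152:156] = 1.0
intensity = np.abs(fresnel_propagate(aperture, lam, dx, z)) ** 2
```

Because the transfer function has unit modulus, propagation conserves total energy, which gives a quick sanity check on the implementation.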

  4. Reconstructing Historical VOC Concentrations in Drinking Water for Epidemiological Studies at a U.S. Military Base: Summary of Results

    PubMed Central

    Maslia, Morris L.; Aral, Mustafa M.; Ruckart, Perri Z.; Bove, Frank J.

    2017-01-01

    A U.S. government health agency conducted epidemiological studies to evaluate whether exposures to drinking water contaminated with volatile organic compounds (VOC) at U.S. Marine Corps Base Camp Lejeune, North Carolina, were associated with increased health risks to children and adults. These health studies required knowledge of contaminant concentrations in drinking water—at monthly intervals—delivered to family housing, barracks, and other facilities within the study area. Because concentration data were limited or unavailable during much of the period of contamination (1950s–1985), the historical reconstruction process was used to quantify estimates of monthly mean contaminant-specific concentrations. This paper integrates many efforts, reports, and papers into a synthesis of the overall approach to, and results from, a drinking-water historical reconstruction study. Results show that at the Tarawa Terrace water treatment plant (WTP) reconstructed (simulated) tetrachloroethylene (PCE) concentrations reached a maximum monthly average value of 183 micrograms per liter (μg/L) compared to a one-time maximum measured value of 215 μg/L and exceeded the U.S. Environmental Protection Agency’s current maximum contaminant level (MCL) of 5 μg/L during the period November 1957–February 1987. At the Hadnot Point WTP, reconstructed trichloroethylene (TCE) concentrations reached a maximum monthly average value of 783 μg/L compared to a one-time maximum measured value of 1400 μg/L during the period August 1953–December 1984. The Hadnot Point WTP also provided contaminated drinking water to the Holcomb Boulevard housing area continuously prior to June 1972, when the Holcomb Boulevard WTP came on line (maximum reconstructed TCE concentration of 32 μg/L) and intermittently during the period June 1972–February 1985 (maximum reconstructed TCE concentration of 66 μg/L). Applying the historical reconstruction process to quantify contaminant-specific monthly

  5. Paleobathymetric Reconstruction of Ross Sea: seismic data processing and regional reflectors mapping

    NASA Astrophysics Data System (ADS)

    Olivo, Elisabetta; De Santis, Laura; Wardell, Nigel; Geletti, Riccardo; Busetti, Martina; Sauli, Chiara; Bergamasco, Andrea; Colleoni, Florence; Vanzella, Walter; Sorlien, Christopher; Wilson, Doug; De Conto, Robert; Powell, Ross; Bart, Phil; Luyendyk, Bruce

    2017-04-01

    PURPOSE: New maps of some major unconformities of the Ross Sea have been reconstructed using seismic data grids, combined with acoustic velocities from previous works and from new and reprocessed seismic profiles. This work is carried out with the support of PNRA and in the frame of the bilateral Italy-USA project GLAISS (Global Sea Level Rise & Antarctic Ice Sheet Stability predictions), funded by the Ministry of Foreign Affairs. Paleobathymetric maps were produced for 30, 14 and 4 million years ago, three 'key moments' for the glacial history of the Antarctic Ice Sheet, coinciding with global climatic changes. The paleobathymetric maps will then be used for numerical simulations focused on the width and thickness of the Ross Sea Ice Sheet. PRELIMINARY RESULTS: The first step was to create TWT maps of three main unconformities (RSU6, RSU4 and RSU2) of the Ross Sea, revisiting and updating the ANTOSTRAT maps. Through the interpretation of sedimentary bodies and erosional features, used to infer active or old processes along the slope, we identified the main seismic unconformities, using an IHS Kingdom academic license. The different groups contributed the analysis of the eastern Ross Sea continental slope and rise (OGS), of the Central Basin (KOPRI), and of the western and central Ross Sea (Univ. of Santa Barbara and OGS), where new drill sites and seismic profiles were collected after the publication of the ANTOSTRAT maps. We then merged our interpretation with previous interpretations. We examined previous processing of several seismic lines and all the old acoustic velocity analyses, and in addition reprocessed some lines in order to achieve higher data coverage. Then, combining the TWT maps of the unconformities with the old and new velocity data, we created new depth maps of the study area. The new depth maps will then be used for reconstructing the paleobathymetry of the Ross Sea by applying the backstripping technique.
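    The depth-conversion step described above (combining TWT maps with velocity data) reduces, at each grid node, to summing interval velocity times interval two-way time divided by two. The sketch below uses hypothetical horizon picks and interval velocities, not the Ross Sea data.

```python
import numpy as np

# Hypothetical two-way travel times (s) to three stacked horizons at one grid node
twt = np.array([0.8, 1.5, 2.1])                  # e.g. picks for three unconformities
v_interval = np.array([1500.0, 1800.0, 2200.0])  # assumed interval velocities (m/s)

# interval two-way times between successive horizons
dt = np.diff(np.concatenate(([0.0], twt)))
# depth of each horizon: cumulative v * dt / 2 (factor 2 converts two-way to one-way)
depth = np.cumsum(v_interval * dt / 2.0)         # metres
```

    Applied over whole TWT grids, this yields the depth maps that feed the backstripping step.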

  6. Cost-effectiveness analysis of the most common orthopaedic surgery procedures: knee arthroscopy and knee anterior cruciate ligament reconstruction.

    PubMed

    Lubowitz, James H; Appleby, David

    2011-10-01

    The purpose of this study was to determine the cost-effectiveness of knee arthroscopy and anterior cruciate ligament (ACL) reconstruction. Retrospective analysis of prospectively collected data from a single-surgeon, institutional review board-approved outcomes registry included 2 cohorts: surgically treated knee arthroscopy and ACL reconstruction patients. Our outcome measure is cost-effectiveness (cost of a quality-adjusted life-year [QALY]). The QALY is calculated by multiplying the difference in health-related quality of life, before and after treatment, by life expectancy. Health-related quality of life is measured by use of the Quality of Well-Being scale, which has been validated for cost-effectiveness analysis. Costs are facility charges adjusted by the facility cost-to-charge ratio, plus the surgeon fee. Sensitivity analyses are performed to determine the effect of variations in costs or outcomes. There were 93 knee arthroscopy and 35 ACL reconstruction patients included at a mean follow-up of 2.1 years. Cost per QALY was $5,783 for arthroscopy and $10,326 for ACL reconstruction (2009 US dollars). Sensitivity analysis shows that our results are robust (relatively insensitive) to variations in costs or outcomes. Knee arthroscopy and knee ACL reconstruction are very cost-effective. Copyright © 2011 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
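    The QALY arithmetic described in the abstract (change in Quality of Well-Being score multiplied by life expectancy, then divided into cost) can be sketched as follows. The input numbers below are purely illustrative and are not the study's patient data.

```python
def cost_per_qaly(qwb_before, qwb_after, life_expectancy_years, total_cost):
    """Cost-effectiveness ratio: QALYs gained equal the change in
    Quality of Well-Being score times remaining life expectancy."""
    qalys_gained = (qwb_after - qwb_before) * life_expectancy_years
    return total_cost / qalys_gained

# hypothetical example: 0.05 QWB improvement over 40 remaining years, $4,000 cost
ratio = cost_per_qaly(qwb_before=0.65, qwb_after=0.70,
                      life_expectancy_years=40.0, total_cost=4000.0)
```

    In this hypothetical case the gain is 2 QALYs, so the ratio is $2,000 per QALY.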

  7. Petroleum and hazardous material releases from industrial facilities associated with Hurricane Katrina.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Sengul, Hatice

    2010-04-01

    Hurricane Katrina struck an area dense with industry, causing numerous releases of petroleum and hazardous materials. This study integrates information from a number of sources to describe the frequency, causes, and effects of these releases in order to inform analysis of risk from future hurricanes. Over 200 onshore releases of hazardous chemicals, petroleum, or natural gas were reported. Storm surge was responsible for the majority of petroleum releases, and failure of storage tanks was the most common mechanism of release. Of the smaller number of hazardous chemical releases reported, many were associated with flaring from plant startup, shutdown, or process upset. In areas impacted by storm surge, 10% of the facilities within the Risk Management Plan (RMP) and Toxic Release Inventory (TRI) databases and 28% of SIC 1311 facilities experienced accidental releases. In areas subject only to hurricane-strength winds, a lower fraction (1% of RMP and TRI and 10% of SIC 1311 facilities) experienced a release, while 1% of all facility types reported a release in areas that experienced tropical-storm-strength winds. Of the industrial facilities surveyed, more experienced indirect disruptions (55%), such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations or reconstruction, than experienced releases. To reduce the risk of hazardous material releases and speed the return to normal operations under these difficult conditions, greater attention should be devoted to risk-based facility design and improved prevention and response planning.

  8. Human Engineering Operations and Habitability Assessment: A Process for Advanced Life Support Ground Facility Testbeds

    NASA Technical Reports Server (NTRS)

    Connolly, Janis H.; Arch, M.; Elfezouaty, Eileen Schultz; Novak, Jennifer Blume; Bond, Robert L. (Technical Monitor)

    1999-01-01

    Design and Human Engineering (HE) processes strive to ensure that the human-machine interface is designed for optimal performance throughout the system life cycle. Each component can be tested and assessed independently to assure optimal performance, but it is not until full integration that the system and the inherent interactions between the system components can be assessed as a whole. HE processes (which define and apply requirements for human interaction with missions and systems) are included in space flight activities, but also need to be included in ground activities and, specifically, ground facility testbeds such as Bio-Plex. A unique aspect of the Bio-Plex Facility is the integral issue of Habitability, which encompasses the qualities of the environment that allow humans to work and live. HE is a process by which Habitability and system performance can be assessed.

  9. Israeli mothers' meaning reconstruction in the aftermath of homicide.

    PubMed

    Mahat-Shamir, Michal; Leichtentritt, Ronit D

    2016-01-01

    This study is the first to our knowledge to provide an in-depth account of the meanings reconstructed by bereaved Israeli mothers of homicide victims. Homicide survivors tend to receive little or no support from society; this is especially true in Israel, where homicide victims are a neglected population whose voice is socially muted. Constructivist theories have informed understanding of grief, emphasizing the role of meaning reconstruction in adaptation to bereavement, as well as the role of social support in the process of meaning reconstruction. We derived 3 prototypes of meaning from interviews of 12 bereaved mothers: the existential paradox; a bifurcated worldview; and oppression, mortification, and humiliation. Most informants used all 3 prototypes in the process of reconstructing meaning, describing changes in the perception of themselves, the world, and society. However, change was also accompanied by continuity, because participants did not abandon their former worldview while adopting a new one. The findings suggest that meaning reconstruction in the aftermath of homicide is a unique, multifaceted, and contradictory process. Implications for practice are outlined. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Facilities | Bioenergy | NREL

    Science.gov Websites

    At NREL's state-of-the-art bioenergy research facilities, researchers design options.

  11. 3D reconstruction techniques made easy: know-how and pictures.

    PubMed

    Luccichenti, Giacomo; Cademartiri, Filippo; Pezzella, Francesca Romana; Runza, Giuseppe; Belgrano, Manuel; Midiri, Massimo; Sabatini, Umberto; Bastianello, Stefano; Krestin, Gabriel P

    2005-10-01

    Three-dimensional reconstructions are a visual tool for illustrating the basis of three-dimensional post-processing, such as interpolation, ray casting, segmentation, percentage classification, gradient calculation, shading and illumination. Knowledge of the optimal scanning and reconstruction parameters facilitates the use of three-dimensional reconstruction techniques in clinical practice. The aim of this article is to explain the principles of multidimensional image processing in a pictorial way, along with the advantages and limitations of the different options for 3D visualisation.
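    Of the post-processing steps listed above, ray casting is the easiest to sketch. The following is a minimal, assumed implementation of front-to-back emission-absorption compositing along one volume axis; it is an illustration of the principle, not the code of any clinical workstation.

```python
import numpy as np

def composite_front_to_back(volume, opacity, axis=0):
    """Minimal ray casting: rays run parallel to `axis`, one sample per
    voxel, accumulated with front-to-back alpha compositing."""
    vol = np.moveaxis(volume, axis, 0)
    op = np.moveaxis(opacity, axis, 0)
    color = np.zeros(vol.shape[1:])
    transmittance = np.ones(vol.shape[1:])
    for value, alpha in zip(vol, op):     # march front to back
        color += transmittance * alpha * value
        transmittance *= (1.0 - alpha)    # light absorbed along the ray
    return color

# sanity case: a fully opaque first slice hides everything behind it
vol = np.random.rand(8, 4, 4)
opa = np.zeros_like(vol)
opa[0] = 1.0
image = composite_front_to_back(vol, opa)
```

    With per-voxel opacities derived from a transfer function (and gradients for shading), this loop is the core of a direct volume renderer.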

  12. [Application of Fourier transform profilometry in 3D-surface reconstruction].

    PubMed

    Shi, Bi'er; Lu, Kuan; Wang, Yingting; Li, Zhen'an; Bai, Jing

    2011-08-01

    With the improvement of system frames and reconstruction methods in fluorescence molecular tomography (FMT), FMT technology has been widely used as an important experimental tool in biomedical research. It is necessary to obtain the 3D surface profile of the experimental object as a boundary constraint for FMT reconstruction algorithms. We propose a new 3D-surface reconstruction method based on Fourier transform profilometry (FTP) under blue-purple light illumination. The slice images were reconstructed using appropriate image processing methods, frequency spectrum analysis and filtering. The experimental results showed that the method properly reconstructs the 3D surface of objects with mm-level accuracy. Compared to other methods, this one is simple and fast. Besides its good reconstruction quality, the proposed method can help monitor the behavior of the object during the experiment, to ensure the correspondence of the imaging process. Furthermore, the method uses blue-purple light as its source to avoid interference with fluorescence imaging.
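    The FTP principle used here (isolate the carrier lobe in the Fourier spectrum, shift it to DC, and take the phase of the inverse transform) can be sketched in one dimension. The fringe frequency, filter width and test phase below are assumed values for illustration only.

```python
import numpy as np

# Synthetic 1D fringe pattern: I(x) = a + b*cos(2*pi*f0*x/N + phi(x))
N, f0 = 512, 32
x = np.arange(N)
phi = 2.0 * np.sin(2 * np.pi * x / N)        # smooth "surface height" phase
I = 0.5 + 0.25 * np.cos(2 * np.pi * f0 * x / N + phi)

# FTP: keep only the +f0 carrier lobe, shift it to DC, inverse FFT
F = np.fft.fft(I)
window = np.zeros(N, complex)
lo, hi = f0 - 16, f0 + 16                    # band-pass around the carrier
window[lo:hi + 1] = F[lo:hi + 1]
c = np.fft.ifft(np.roll(window, -f0))        # complex fringe signal ~ (b/2) e^{i phi}
phi_rec = np.angle(c)                        # wrapped phase (here |phi| < pi, no unwrap needed)
```

    For larger height variations the recovered phase must additionally be unwrapped (e.g. with `np.unwrap`) before conversion to height.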

  13. Exploration Flight Test 1 Afterbody Aerothermal Environment Reconstruction

    NASA Technical Reports Server (NTRS)

    Hyatt, Andrew J.; Oliver, Brandon; Amar, Adam; Lessard, Victor

    2016-01-01

    The Exploration Flight Test 1 vehicle included roughly 100 near-surface thermocouples on the afterbody of the vehicle. The temperature traces at each of these instruments have been used to perform inverse environment reconstruction to determine the aerothermal environment experienced during re-entry of the vehicle. This paper provides an overview of the reconstructed environments and identifies critical aspects of the environment, including transition and reaction control system jet influence. A blind test of the process and reconstruction tool was also performed to build confidence in the reconstructed environments. Finally, an uncertainty quantification analysis was performed to identify the impact of each of the uncertainties on the reconstructed environments.

  14. Grey signal processing and data reconstruction in the non-diffracting beam triangulation measurement system

    NASA Astrophysics Data System (ADS)

    Meng, Hao; Wang, Zhongyu; Fu, Jihua

    2008-12-01

    The non-diffracting beam triangulation measurement system offers a longer measurement range, higher theoretical measurement accuracy and higher resolution than the traditional laser triangulation measurement system. Unfortunately, the measurement accuracy of the system is greatly degraded in practical applications by speckle noise, CCD photoelectric noise and background light noise. Hence, effective signal processing methods must be applied to improve the measurement accuracy. In this paper a novel, effective method for removing the noise in the non-diffracting beam triangulation measurement system is proposed. In this method, grey system theory is used to process and reconstruct the measurement signal. By implementing grey dynamic filtering based on the dynamic GM(1,1) model, the noise can be effectively removed from the primary measurement data, and the measurement accuracy of the system is improved as a result.
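    The grey dynamic filtering is built on the GM(1,1) model. Below is a minimal sketch of a GM(1,1) least-squares fit (accumulate, fit the whitened equation, reconstruct, difference back); the input sequence is hypothetical, and this is only the basic model, not the paper's full dynamic filtering scheme.

```python
import numpy as np

def gm11_fit(x0):
    """Fit the grey model GM(1,1) to a positive sequence x0 and
    return the reconstructed (smoothed) sequence."""
    x1 = np.cumsum(x0)                        # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # background (mean) values
    B = np.column_stack((-z1, np.ones(len(z1))))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # x0(k) = -a*z1(k) + b
    k = np.arange(len(x0))
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    return np.concatenate(([x0[0]], np.diff(x1_hat)))  # inverse AGO

# hypothetical near-exponential measurement sequence
signal = np.array([10.0, 10.5, 11.0, 11.6, 12.2, 12.8])
smooth = gm11_fit(signal)
```

    In the dynamic (rolling-window) variant, the model is refitted as each new sample arrives, so the filter tracks slow signal trends while rejecting noise.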

  15. Modeling the Non-Equilibrium Process of the Chemical Adsorption of Ammonia on GaN(0001) Reconstructed Surfaces Based on Steepest-Entropy-Ascent Quantum Thermodynamics.

    PubMed

    Kusaba, Akira; Li, Guanchen; von Spakovsky, Michael R; Kangawa, Yoshihiro; Kakimoto, Koichi

    2017-08-15

    Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and Nad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict the behavior of non-equilibrium processes, even those far from equilibrium where the state evolution is a combination of reversible and irreversible dynamics. SEAQT is an ideal choice to handle this problem on a first-principles basis since the chemical adsorption process starts from a highly non-equilibrium state. A result of the analysis shows that the probability of adsorption on 3Ga-H is significantly higher than that on Nad-H + Ga-H. Additionally, the growth temperature dependence of these adsorption probabilities and the temperature increase due to the heat of reaction is determined. The non-equilibrium thermodynamic modeling applied can lead to better control of the MOVPE process through the selection of preferable reconstructed surfaces. The modeling also demonstrates the efficacy of DFT-SEAQT coupling for determining detailed non-equilibrium process characteristics with a much smaller computational burden than would be entailed with mechanics-based, microscopic-mesoscopic approaches.

  16. Modeling the Non-Equilibrium Process of the Chemical Adsorption of Ammonia on GaN(0001) Reconstructed Surfaces Based on Steepest-Entropy-Ascent Quantum Thermodynamics

    PubMed Central

    Kusaba, Akira; von Spakovsky, Michael R.; Kangawa, Yoshihiro; Kakimoto, Koichi

    2017-01-01

    Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and Nad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict the behavior of non-equilibrium processes, even those far from equilibrium where the state evolution is a combination of reversible and irreversible dynamics. SEAQT is an ideal choice to handle this problem on a first-principles basis since the chemical adsorption process starts from a highly non-equilibrium state. A result of the analysis shows that the probability of adsorption on 3Ga-H is significantly higher than that on Nad-H + Ga-H. Additionally, the growth temperature dependence of these adsorption probabilities and the temperature increase due to the heat of reaction is determined. The non-equilibrium thermodynamic modeling applied can lead to better control of the MOVPE process through the selection of preferable reconstructed surfaces. The modeling also demonstrates the efficacy of DFT-SEAQT coupling for determining detailed non-equilibrium process characteristics with a much smaller computational burden than would be entailed with mechanics-based, microscopic-mesoscopic approaches. PMID:28809816

  17. Payload/GSE/data system interface: Users guide for the VPF (Vertical Processing Facility)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The payload/GSE/data system interface users guide for the Vertical Processing Facility is presented. The purpose of the document is threefold. First, the simulated Payload and Ground Support Equipment (GSE) Data System Interface, also known as the payload T-0 (T-Zero) System, is described. This simulated system is located with the Cargo Integration Test Equipment (CITE) in the Vertical Processing Facility (VPF) in the KSC Industrial Area. The actual Payload T-0 System consists of the Orbiter, Mobile Launch Platforms (MLPs), and Launch Complex (LC) 39A and B; this is referred to as the Pad Payload T-0 System (refer to KSC-DL-116 for the Pad Payload T-0 System description). Second, information is provided to the payload customer on the differences between this simulated system and the actual system. Third, a reference guide to the VPF Payload T-0 System is provided for both KSC and payload customer personnel.

  18. Defense Waste Processing Facility (DWPF) Viscosity Model: Revisions for Processing High TiO2 Containing Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C. M.; Edwards, T. B.

    Radioactive high-level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints, since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification; in SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository. The DWPF SPC system is known as the Product Composition Control System (PCCS). The DWPF will soon be receiving wastes from the Salt Waste Processing Facility (SWPF) containing increased concentrations of TiO2, Na2O, and Cs2O. The SWPF is being built to pretreat the high-curie fraction of the salt waste to be removed from the HLW tanks in the F- and H-Area Tank Farms at the SRS. In order to process TiO2 concentrations >2.0 wt% in the DWPF, new viscosity data were developed over the range of 1.90 to 6.09 wt% TiO2 and evaluated against the 2005 viscosity model. An alternate viscosity model is also derived for potential future use, should the DWPF ever need to process other titanate-containing ion exchange materials. The ultimate limit on the amount of TiO2 that can be accommodated from SWPF will be determined by the three PCCS models, the waste composition of a given sludge batch, the waste loading of the sludge batch, and

  19. Contamination concerns in the modular containerless processing facility

    NASA Technical Reports Server (NTRS)

    Seshan, P. K.; Trinh, E. H.

    1989-01-01

    This paper describes the problems of the control and management of contamination in the Modular Containerless Processing Facility (MCPF), which is currently being developed at JPL for the Space Station, and in the MCPF's precursor, the Drop Physics Module (DPM), which will be carried aboard one or more Space Shuttle missions. Attention is given to the identification of contamination sources, their mode of transport to the sample positioned within the chamber, and the protection of the sample, as well as to the mathematical simulation of the contaminant transport. It is emphasized that, in order to choose and implement the most appropriate contamination control strategy for each investigator, a number of simplified mathematical simulations will have to be developed, and ground-based contamination experiments will have to be carried out with identical materials.

  20. 40 CFR 60.706 - Reconstruction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Volatile Organic Compound Emissions From Synthetic Organic Chemical Manufacturing Industry (SOCMI) Reactor Processes § 60.706 Reconstruction. (a) For...

  1. 40 CFR 60.706 - Reconstruction.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Volatile Organic Compound Emissions From Synthetic Organic Chemical Manufacturing Industry (SOCMI) Reactor Processes § 60.706 Reconstruction. (a) For...

  2. Event Reconstruction in the PandaRoot framework

    NASA Astrophysics Data System (ADS)

    Spataro, Stefano

    2012-12-01

    The PANDA experiment will study collisions of anti-proton beams, with momenta ranging from 2 to 15 GeV/c, with fixed proton and nuclear targets in the charm energy range, and will be built at the FAIR facility. In preparation for the experiment, the PandaRoot software framework is under development for detector simulation, reconstruction and data analysis, running on an Alien2-based grid. The basic features are handled by the FairRoot framework, based on ROOT and Virtual Monte Carlo, while the PANDA detector specifics and reconstruction code are implemented inside PandaRoot. The preparation of Technical Design Reports for the tracking detectors has pushed the finalization of the tracking reconstruction code, which is complete for the Target Spectrometer, and of the analysis tools. Particle identification algorithms are currently implemented using a Bayesian approach and compared to multivariate analysis methods. Moreover, the PANDA data acquisition foresees triggerless operation, in which events are not defined by a hardware first-level trigger decision; instead, all signals are stored with time stamps, requiring deconvolution by the software. This has led to a redesign of the software from an event basis to a time-ordered structure. In this contribution, the reconstruction capabilities of the PANDA spectrometer are reported, focusing on the performance of the tracking system and the results of the analysis of physics benchmark channels, as well as the new (and challenging) concept of time-based simulation and its implementation.

  3. Design analysis of levitation facility for space processing applications. [Skylab program, space shuttles

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Kornrumpf, W. P.; Napaluch, L. J.; Harden, J. D., Jr.; Walden, J. P.; Stockhoff, E. H.; Wouch, G.; Walker, L. H.

    1974-01-01

    Containerless processing facilities for the space laboratory and space shuttle are defined. Materials processing examples representative of the most severe requirements for the facility, in terms of electrical power, radio frequency equipment, and the use of an auxiliary electron beam heater, were used to discuss matters having the greatest effect upon the space shuttle pallet payload interfaces and envelopes. Improved weight, volume, and efficiency estimates for the RF generating equipment were derived. The results are particularly significant because of the reduced requirements for heat rejection from electrical equipment, one of the principal envelope problems for shuttle pallet payloads. It is shown that although experiments on containerless melting of high-temperature refractory materials make it desirable to consider the highest peak powers that can be made available on the pallet, total energy requirements are kept relatively low by the very fast processing times typical of containerless experiments, which allows consideration of heat rejection capabilities lower than the peak power demand if energy storage in system heat capacitances is taken into account. Batteries are considered in order to avoid a requirement for fuel cells capable of furnishing this brief peak power demand.

  4. Dynamic dual-tracer PET reconstruction.

    PubMed

    Gao, Fei; Liu, Huafeng; Jian, Yiqiang; Shi, Pengcheng

    2009-01-01

    Although it has important medical implications, simultaneous dual-tracer positron emission tomography reconstruction remains a challenging problem, primarily because the photon measurements from the two tracers overlap. In this paper, we propose a simultaneous dynamic dual-tracer reconstruction of tissue activity maps guided by tracer kinetics. The dual-tracer reconstruction problem is formulated in a state-space representation, where parallel compartment models serve as the continuous-time system equation describing the kinetic processes of the two tracers, and the imaging data are expressed as discrete samples of the system states in the measurement equation. The image reconstruction problem thus becomes a state estimation problem in a continuous-discrete hybrid paradigm, and H-infinity filtering is adopted as the estimation strategy. As H-infinity filtering makes no assumptions about the system and measurement statistics, robust reconstruction results can be obtained for the dual-tracer PET imaging system, where the statistical properties of the measurement data and the system uncertainty are not available a priori, even when there are disturbances in the kinetic parameters. Experiments on digital phantoms, Monte Carlo simulations and physical phantoms have demonstrated the superior performance of the proposed method.
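    The parallel compartment models and the overlapped measurement can be illustrated with a minimal forward simulation: two one-tissue compartment models driven by a shared plasma input, whose activities simply sum in the detector. All rate constants and the input function below are assumed values, and the H-infinity estimation itself is not shown.

```python
import numpy as np

def one_tissue_ct(t, k1, k2, cp):
    """Euler integration of dC/dt = K1*Cp(t) - k2*C(t) for one tracer."""
    c = np.zeros_like(t)
    dt = t[1] - t[0]
    for i in range(1, len(t)):
        c[i] = c[i - 1] + dt * (k1 * cp[i - 1] - k2 * c[i - 1])
    return c

t = np.linspace(0.0, 60.0, 601)                  # minutes
cp = np.exp(-t / 10.0)                           # shared plasma input (assumed)
c_a = one_tissue_ct(t, k1=0.1, k2=0.05, cp=cp)   # tracer A kinetics (hypothetical)
c_b = one_tissue_ct(t, k1=0.3, k2=0.02, cp=cp)   # tracer B kinetics (hypothetical)
measured = c_a + c_b   # the scanner sees only the overlapped dual-tracer activity
```

    The reconstruction task is the inverse of this forward model: recover c_a and c_b (and the underlying kinetics) from `measured` alone, which is what motivates the state-space estimation approach.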

  5. Adaptive algorithms of position and energy reconstruction in Anger-camera type detectors: experimental data processing in ANTS

    NASA Astrophysics Data System (ADS)

    Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Gongadze, A.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Solovov, V.; Van Esch, P.; Zeitelhack, K.

    2013-05-01

    The software package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations), developed for simulation of Anger-type gaseous detectors for thermal neutron imaging was extended to include a module for experimental data processing. Data recorded with a sensor array containing up to 100 photomultiplier tubes (PMT) or silicon photomultipliers (SiPM) in a custom configuration can be loaded and the positions and energies of the events can be reconstructed using the Center-of-Gravity, Maximum Likelihood or Least Squares algorithm. A particular strength of the new module is the ability to reconstruct the light response functions and relative gains of the photomultipliers from flood field illumination data using adaptive algorithms. The performance of the module is demonstrated with simulated data generated in ANTS and experimental data recorded with a 19 PMT neutron detector. The package executables are publicly available at http://coimbra.lip.pt/~andrei/
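    Of the three reconstruction algorithms mentioned, Center-of-Gravity is simple enough to sketch directly: the event position is the signal-weighted mean of the sensor positions. The PMT geometry and amplitudes below are hypothetical.

```python
import numpy as np

def center_of_gravity(pmt_xy, amplitudes):
    """Anger-camera Center-of-Gravity estimate: signal-weighted mean
    of the photomultiplier (x, y) positions."""
    a = np.asarray(amplitudes, float)
    xy = np.asarray(pmt_xy, float)
    return (xy * a[:, None]).sum(axis=0) / a.sum()

# hypothetical 3x3 PMT grid, event directly over the centre tube
xy = np.array([(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)], float)
amps = np.array([1, 2, 1, 2, 8, 2, 1, 2, 1], float)
pos = center_of_gravity(xy, amps)
```

    Maximum Likelihood and Least Squares improve on this by fitting the measured amplitudes to the per-tube light response functions, which is why ANTS's ability to reconstruct those response functions from flood-field data matters.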

  6. Local Surface Reconstruction from MER images using Stereo Workstation

    NASA Astrophysics Data System (ADS)

    Shin, Dongjoe; Muller, Jan-Peter

    2010-05-01

    The authors present a semi-automatic workflow that reconstructs the 3D shape of the martian surface from local stereo images delivered by PanCam or NavCam on systems such as the NASA Mars Exploration Rover (MER) Mission and, in the future, the ESA-NASA ExoMars rover PanCam. The process is initiated with manually selected tiepoints on a stereo workstation, followed by tiepoint refinement, stereo matching using region growing, and Levenberg-Marquardt Algorithm (LMA)-based bundle adjustment. The stereo workstation, which is being developed by UCL in collaboration with colleagues at the Jet Propulsion Laboratory (JPL) within the EU FP7 ProVisG project, includes a set of practical GUI-based tools that enable an operator to define a visually correct tiepoint via a stereo display. To achieve platform and graphics hardware independence, the stereo application has been implemented using JPL's JADIS graphics library, which is written in Java, and the remaining processing blocks used in the reconstruction workflow have also been developed as a Java package to increase code re-usability, portability and compatibility. Although the initial tiepoints from the stereo workstation are reasonably acceptable as true correspondences, it is often necessary to employ an optional validity check and/or quality-enhancing process. To meet this requirement, the workflow includes a tiepoint refinement process based on the Adaptive Least Squares Correlation (ALSC) matching algorithm, so that the initial tiepoints can be refined to sub-pixel precision or rejected if they fail to pass the ALSC matching threshold. Apart from accuracy, the other criterion for assessing the quality of reconstruction is its density (or completeness), which is not addressed by the refinement process. Thus, we re-implemented a stereo region growing process, which is a core matching algorithm within the UCL

  7. [Hygiene provisions for the processing of food in nurseries and child care facilities. Approaching problems in practical experience].

    PubMed

    Bosche, H; Schmeisser, N

    2008-11-01

    In Germany, more than 2 million children under the age of six attend child care institutions. Among their duties, these institutions have to provide meals to the children. Several food-borne viruses pose a particular threat to infants. In accordance with the new European Law on Food Hygiene, nurseries and child care facilities are business premises, as they process and dispense food. The law requires guarding all stages of food acquisition, storage, preparation and dispensing against health hazards. Furthermore, facilities are legally required to provide risk control and to ensure that food issued by their kitchen does not pose a health hazard upon consumption. Overall, child care facilities are given a far more comprehensive responsibility under the new European law. This article introduces a hygiene manual for child care facilities in accordance with the EU Law on Hygiene, which was field-tested in more than 70 child care facilities during the course of an extensive organisational process. The manual supplies easy-to-handle instructions and form sheets for documentation and hence assists in realising the legal provisions.

  8. Shading correction assisted iterative cone-beam CT reconstruction

    NASA Astrophysics Data System (ADS)

    Yang, Chunlin; Wu, Pengwei; Gong, Shutao; Wang, Jing; Lyu, Qihui; Tang, Xiangyang; Niu, Tianye

    2017-11-01

    Recent advances in total variation (TV) technology enable accurate CT image reconstruction from highly under-sampled and noisy projection data. The standard iterative reconstruction algorithms, which work well in conventional CT imaging, fail to perform as expected in cone beam CT (CBCT) applications, wherein the non-ideal physics issues, including scatter and beam hardening, are more severe. These physics issues result in large areas of shading artifacts and degrade the piecewise-constant property assumed of reconstructed images. To overcome this obstacle, we incorporate a shading correction scheme into low-dose CBCT reconstruction and propose a clinically acceptable and stable three-dimensional iterative reconstruction method that is referred to as the shading correction assisted iterative reconstruction. In the proposed method, we modify the TV regularization term by adding a shading compensation image to the reconstructed image to compensate for the shading artifacts while leaving the data fidelity term intact. This compensation image is generated empirically, using image segmentation and low-pass filtering, and updated in the iterative process whenever necessary. When the compensation image is determined, the objective function is minimized using the fast iterative shrinkage-thresholding algorithm accelerated on a graphics processing unit. The proposed method is evaluated using CBCT projection data of the Catphan® 600 phantom and two pelvis patients. Compared with iterative reconstruction without shading correction, the proposed method reduces the overall CT number error from around 200 HU to around 25 HU and increases the spatial uniformity by 20 percent, given the same number of sparsely sampled projections. A clinically acceptable and stable iterative reconstruction algorithm for CBCT is proposed in this paper. Differing from the existing algorithms, this algorithm incorporates a shading correction scheme into the low
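
    The minimization step above uses the fast iterative shrinkage-thresholding algorithm (FISTA). As an illustration of that machinery only, here is FISTA applied to a small l1-regularized least-squares problem rather than the paper's TV-plus-shading objective; the problem sizes and `lam` value are arbitrary choices for this sketch.

```python
import numpy as np

def fista_lasso(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with FISTA (Beck & Teboulle):
    # gradient step on the smooth term, soft-threshold prox, momentum update.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)                # gradient of the data-fidelity term
        z = y - g / L
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)  # soft threshold
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 100))           # under-sampled "projection" operator
x_true = np.zeros(100); x_true[[5, 30, 70]] = [2.0, -1.5, 1.0]
b = A @ x_true                               # noiseless measurements
x_hat = fista_lasso(A, b, lam=0.1)
```

    In the paper's setting the l1 penalty is replaced by TV of the compensated image, but the gradient/prox/momentum structure is the same.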

  9. Scramjet test flow reconstruction for a large-scale expansion tube, Part 1: quasi-one-dimensional modelling

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Jacobs, P. A.; Morgan, R. G.; Chan, W. Y. K.; Gollan, R. J.

    2018-07-01

    Large-scale free-piston driven expansion tubes have uniquely high total pressure capabilities which make them an important resource for development of access-to-space scramjet engine technology. However, many aspects of their operation are complex, and their test flows are fundamentally unsteady and difficult to measure. While computational fluid dynamics methods provide an important tool for quantifying these flows, these calculations become very expensive with increasing facility size and therefore have to be carefully constructed to ensure sufficient accuracy is achieved within feasible computational times. This study examines modelling strategies for a Mach 10 scramjet test condition developed for The University of Queensland's X3 facility. The present paper outlines the challenges associated with test flow reconstruction, describes the experimental set-up for the X3 experiments, and then details the development of an experimentally tuned quasi-one-dimensional CFD model of the full facility. The 1-D model, which accurately captures longitudinal wave processes, is used to calculate the transient flow history in the shock tube. This becomes the inflow to a higher-fidelity 2-D axisymmetric simulation of the downstream facility, detailed in the Part 2 companion paper, leading to a validated, fully defined nozzle exit test flow.

  10. Scramjet test flow reconstruction for a large-scale expansion tube, Part 1: quasi-one-dimensional modelling

    NASA Astrophysics Data System (ADS)

    Gildfind, D. E.; Jacobs, P. A.; Morgan, R. G.; Chan, W. Y. K.; Gollan, R. J.

    2017-11-01

    Large-scale free-piston driven expansion tubes have uniquely high total pressure capabilities which make them an important resource for development of access-to-space scramjet engine technology. However, many aspects of their operation are complex, and their test flows are fundamentally unsteady and difficult to measure. While computational fluid dynamics methods provide an important tool for quantifying these flows, these calculations become very expensive with increasing facility size and therefore have to be carefully constructed to ensure sufficient accuracy is achieved within feasible computational times. This study examines modelling strategies for a Mach 10 scramjet test condition developed for The University of Queensland's X3 facility. The present paper outlines the challenges associated with test flow reconstruction, describes the experimental set-up for the X3 experiments, and then details the development of an experimentally tuned quasi-one-dimensional CFD model of the full facility. The 1-D model, which accurately captures longitudinal wave processes, is used to calculate the transient flow history in the shock tube. This becomes the inflow to a higher-fidelity 2-D axisymmetric simulation of the downstream facility, detailed in the Part 2 companion paper, leading to a validated, fully defined nozzle exit test flow.

  11. Tensor-based Dictionary Learning for Dynamic Tomographic Reconstruction

    PubMed Central

    Tan, Shengqi; Zhang, Yanbo; Wang, Ge; Mou, Xuanqin; Cao, Guohua; Wu, Zhifang; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from few-view projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and a dynamic mouse cardiac imaging study demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. PMID:25779991

  12. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper presents milestones of an optimal mathematical model for a business process related to the cost estimate documentation compiled during construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and deterioration of business strategy. Business process management is presented as business process modeling aimed at improving the studied business process, namely the main criteria of optimization and recommendations for the improvement of the above-mentioned business model.

  13. Synthetic fiber production facilities: Background information for proposed standards

    NASA Astrophysics Data System (ADS)

    Goodwin, D. R.

    1982-10-01

    Standards of performance to control emissions of volatile organic compounds (VOC) from new, modified, and reconstructed synthetic fiber production facilities are being proposed under section 111 of the Clean Air Act. This document contains information on the background and authority, regulatory alternatives considered, and environmental and economic impacts of the regulatory alternatives.

  14. Planning and Designing Facilities. Facility Design and Development--Part 1

    ERIC Educational Resources Information Center

    Hypes, Michael G.

    2006-01-01

    Before one begins the planning process for a new facility, it is important to determine if there is a need for a new facility. The demand for a new facility can be drawn from increases in the number of users, the type of users, and the type of events to be conducted in the facility. A feasibility study should be conducted to analyze the legal…

  15. Reconstruction of audio waveforms from spike trains of artificial cochlea models

    PubMed Central

    Zai, Anja T.; Bhargava, Saurabh; Mesgarani, Nima; Liu, Shih-Chii

    2015-01-01

    Spiking cochlea models describe the analog processing and spike generation process within the biological cochlea. Reconstructing the audio input from the artificial cochlea spikes is therefore useful for understanding the fidelity of the information preserved in the spikes. The reconstruction process is challenging particularly for spikes from the mixed-signal (analog/digital) integrated circuit (IC) cochleas because of multiple non-linearities in the model and the additional variance caused by random transistor mismatch. This work proposes an offline method for reconstructing the audio input from spike responses of both a particular spike-based hardware model called the AEREAR2 cochlea and an equivalent software cochlea model. This method was previously used to reconstruct the auditory stimulus based on the peri-stimulus histogram of spike responses recorded in the ferret auditory cortex. The reconstructed audio from the hardware cochlea is evaluated against an analogous software model using objective measures of speech quality and intelligibility, and further tested in a word recognition task. The reconstructed audio under low signal-to-noise ratio (SNR) conditions (SNR < –5 dB) gives better classification performance than the original input at the same SNR in this word recognition task. PMID:26528113
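
    The stimulus-reconstruction idea above can be sketched with a linear decoder. The example below is a generic stand-in, not the paper's method: a toy bank of rectified "cochlea channels" responds to a stimulus, and a ridge-regression readout maps the population response back to the waveform. All channel parameters are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stimulus and channel bank: each channel's rate is a noisy
# half-wave-rectified linear function of the stimulus.
T, C = 500, 8
stim = np.sin(np.linspace(0, 20, T)) + 0.3 * rng.standard_normal(T)
weights = rng.standard_normal(C)
rates = np.maximum(np.outer(stim, weights), 0) + 0.1 * rng.standard_normal((T, C))

# Linear stimulus reconstruction: ridge-regression decoder from the
# population response back to the waveform (cf. PSTH-based decoding).
lam = 1e-2
W = np.linalg.solve(rates.T @ rates + lam * np.eye(C), rates.T @ stim)
recon = rates @ W

corr = np.corrcoef(stim, recon)[0, 1]
```

    Because positively and negatively weighted channels rectify opposite halves of the signal, a linear readout of the population can recover the full waveform.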

  16. Image reconstruction by domain-transform manifold learning.

    PubMed

    Zhu, Bo; Liu, Jeremiah Z; Cauley, Stephen F; Rosen, Bruce R; Rosen, Matthew S

    2018-03-21

    Image reconstruction is essential for imaging applications across the physical and life sciences, including optical and radar systems, magnetic resonance imaging, X-ray computed tomography, positron emission tomography, ultrasound imaging and radio astronomy. During image acquisition, the sensor encodes an intermediate representation of an object in the sensor domain, which is subsequently reconstructed into an image by an inversion of the encoding function. Image reconstruction is challenging because analytic knowledge of the exact inverse transform may not exist a priori, especially in the presence of sensor non-idealities and noise. Thus, the standard reconstruction approach involves approximating the inverse function with multiple ad hoc stages in a signal processing chain, the composition of which depends on the details of each acquisition strategy, and often requires expert parameter tuning to optimize reconstruction performance. Here we present a unified framework for image reconstruction, automated transform by manifold approximation (AUTOMAP), which recasts image reconstruction as a data-driven supervised learning task that allows a mapping between the sensor and the image domain to emerge from an appropriate corpus of training data. We implement AUTOMAP with a deep neural network and exhibit its flexibility in learning reconstruction transforms for various magnetic resonance imaging acquisition strategies, using the same network architecture and hyperparameters. We further demonstrate that manifold learning during training results in sparse representations of domain transforms along low-dimensional data manifolds, and observe superior immunity to noise and a reduction in reconstruction artefacts compared with conventional handcrafted reconstruction methods. In addition to improving the reconstruction performance of existing acquisition methodologies, we anticipate that AUTOMAP and other learned reconstruction approaches will accelerate the development of
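
    The core idea, learning the sensor-to-image mapping from training pairs instead of inverting the encoding analytically, can be shown in a minimal linear analogue. This sketch assumes a toy random encoding matrix rather than any real MRI acquisition strategy, and a least-squares fit in place of AUTOMAP's deep network.

```python
import numpy as np

rng = np.random.default_rng(3)

# Unknown encoding: a random invertible "sensor" transform, a toy stand-in
# for Fourier encoding. The learner sees only (sensor data, image) pairs.
n = 16
E = rng.standard_normal((n, n))            # encoding matrix (unknown to learner)
images = rng.standard_normal((200, n))     # training "images"
sensor = images @ E.T                      # sensor-domain measurements

# Data-driven reconstruction: least-squares fit of an operator R mapping
# sensor data back to images, learned purely from the training corpus.
R, *_ = np.linalg.lstsq(sensor, images, rcond=None)

# Apply the learned operator to an unseen test image's sensor data.
test = rng.standard_normal(n)
recon = (test @ E.T) @ R
err = np.linalg.norm(recon - test) / np.linalg.norm(test)
```

    In the noiseless linear case the learned operator converges to the exact inverse encoding; AUTOMAP's contribution is doing this for nonlinear, noisy encodings via a neural network.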

  17. Diffraction Correlation to Reconstruct Highly Strained Particles

    NASA Astrophysics Data System (ADS)

    Brown, Douglas; Harder, Ross; Clark, Jesse; Kim, J. W.; Kiefer, Boris; Fullerton, Eric; Shpyrko, Oleg; Fohtung, Edwin

    2015-03-01

    Through the use of coherent X-ray diffraction, a three-dimensional diffraction pattern of a highly strained nano-crystal can be recorded in reciprocal space by a detector. Only the intensities are recorded, resulting in a loss of the complex phase. The recorded diffraction pattern therefore requires computational processing to reconstruct the density and complex distribution of the diffracted nano-crystal. For highly strained crystals, standard methods using hybrid input-output (HIO) and error reduction (ER) algorithms are no longer sufficient to reconstruct the diffraction pattern. Our solution is to correlate the symmetry in reciprocal space to generate an a priori shape constraint to guide the computational reconstruction of the diffraction pattern. This approach has improved the ability to accurately reconstruct highly strained nano-crystals.
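
    The ER/HIO family mentioned above alternates between the measured Fourier modulus and a real-space constraint. Below is a toy 1-D error-reduction loop, a sketch only: real coherent-diffraction reconstructions are 2D/3D, and, as the abstract notes, plain ER can stagnate for highly strained crystals.

```python
import numpy as np

def error_reduction(mag, support, n_iter=500, seed=0):
    # Fienup error reduction: alternately impose the measured Fourier
    # modulus and a real-space support/positivity constraint.
    rng = np.random.default_rng(seed)
    x = rng.random(mag.shape) * support
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X = mag * np.exp(1j * np.angle(X))     # impose measured modulus
        x = np.fft.ifft(X).real
        x = np.where(support & (x > 0), x, 0)  # impose support & positivity
    return x

# Ground truth: a nonnegative object confined to a known support.
n = 64
support = np.zeros(n, dtype=bool); support[:16] = True
truth = np.zeros(n); truth[:16] = np.random.default_rng(4).random(16)
mag = np.abs(np.fft.fft(truth))                # "measured" intensities (modulus)
recon = error_reduction(mag, support)
```

    The symmetry-correlation constraint proposed in the abstract plays the role of the support array here: a tighter a priori shape constraint makes the alternating projections far better behaved.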

  18. 18 CFR 157.21 - Pre-filing procedures and review process for LNG terminal facilities and other natural gas...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and review process for LNG terminal facilities and other natural gas facilities prior to filing of... COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER NATURAL GAS ACT APPLICATIONS FOR CERTIFICATES OF PUBLIC CONVENIENCE AND NECESSITY AND FOR ORDERS PERMITTING AND APPROVING ABANDONMENT UNDER SECTION 7 OF THE NATURAL...

  19. A workflow to process 3D+time microscopy images of developing organisms and reconstruct their cell lineage

    PubMed Central

    Faure, Emmanuel; Savy, Thierry; Rizzi, Barbara; Melani, Camilo; Stašová, Olga; Fabrèges, Dimitri; Špir, Róbert; Hammons, Mark; Čúnderlík, Róbert; Recher, Gaëlle; Lombardot, Benoît; Duloquin, Louise; Colin, Ingrid; Kollár, Jozef; Desnoulez, Sophie; Affaticati, Pierre; Maury, Benoît; Boyreau, Adeline; Nief, Jean-Yves; Calvat, Pascal; Vernier, Philippe; Frain, Monique; Lutfalla, Georges; Kergosien, Yannick; Suret, Pierre; Remešíková, Mariana; Doursat, René; Sarti, Alessandro; Mikula, Karol; Peyriéras, Nadine; Bourgine, Paul

    2016-01-01

    The quantitative and systematic analysis of embryonic cell dynamics from in vivo 3D+time image data sets is a major challenge at the forefront of developmental biology. Despite recent breakthroughs in the microscopy imaging of living systems, producing an accurate cell lineage tree for any developing organism remains a difficult task. We present here the BioEmergences workflow integrating all reconstruction steps from image acquisition and processing to the interactive visualization of reconstructed data. Original mathematical methods and algorithms underlie image filtering, nucleus centre detection, nucleus and membrane segmentation, and cell tracking. They are demonstrated on zebrafish, ascidian and sea urchin embryos with stained nuclei and membranes. Subsequent validation and annotations are carried out using Mov-IT, a custom-made graphical interface. Compared with eight other software tools, our workflow achieved the best lineage score. Delivered in standalone or web service mode, BioEmergences and Mov-IT offer a unique set of tools for in silico experimental embryology. PMID:26912388

  20. Review of digital holography reconstruction methods

    NASA Astrophysics Data System (ADS)

    Dovhaliuk, Rostyslav Yu.

    2018-01-01

    The development of digital holography has opened new ways for the non-destructive study of both transparent and opaque objects. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed. The details of a software implementation of digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations about possible use cases for each of them are given.
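
    One of the standard wave-propagation methods reviewed in such work is the angular spectrum method. A minimal NumPy sketch follows; the grid spacing, wavelength, and propagation distance are arbitrary example values, and evanescent components are simply suppressed.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    # Propagate a sampled complex field a distance z by filtering its
    # spatial-frequency spectrum with the free-space transfer function.
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)               # spatial frequencies (cycles/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0))
    H = np.exp(1j * kz * z) * (arg > 0)        # evanescent waves suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Sanity check: propagating forward then backward recovers the field.
n = 128
x = (np.arange(n) - n // 2) * 5e-6             # 5 um pixels
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (50e-6) ** 2).astype(complex)
out = angular_spectrum(field, 633e-9, 5e-6, 1e-3)
back = angular_spectrum(out, 633e-9, 5e-6, -1e-3)
err = np.abs(back - field).max()
```

    In hologram reconstruction the recorded hologram (times a reference wave) takes the place of `field`, and z is chosen to bring the object plane into focus.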

  1. Reconstructing apology: David Cameron's Bloody Sunday apology in the press.

    PubMed

    McNeill, Andrew; Lyons, Evanthia; Pehrson, Samuel

    2014-12-01

    While there is an acknowledgement in apology research that political apologies are highly mediated, the process of mediation itself has lacked scrutiny. This article suggests that the idea of reconstruction helps to understand how apologies are mediated and evaluated. David Cameron's apology for Bloody Sunday is examined to see how he constructs four aspects of apology: social actors, consequences, categorization, and reasons. The reconstruction of those aspects by British, Unionist, and Nationalist press along with reconstructions made by soldiers in an online forum are considered. Data analysis was informed by thematic analysis and discourse analysis which helped to explore key aspects of reconstruction and how elements of Cameron's apology are altered in subsequent mediated forms of the apology. These mediated reconstructions of the apology allowed their authors to evaluate the apology in different ways. Thus, in this article, it is suggested that the evaluation of the apology by different groups is preceded by a reconstruction of it in accordance with rhetorical goals. This illuminates the process of mediation and helps to understand divergent responses to political apologies. © 2013 The British Psychological Society.

  2. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE PAGES

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    2018-02-21

    Reconstruction is an inverse process where a parameter space is searched to locate a set of parameters with the highest probability of describing experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters will contain some associated uncertainty. This uncertainty in the optimal parameters leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole shot reconstruction results in a time interval that will be used to validate the propagated uncertainty from a single time slice.
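
    V3FIT's propagation is specific to its equilibrium model, but the generic signal-to-parameter step can be sketched with linearized least squares: given a model Jacobian J and signal covariance S, the parameter covariance is C_p = (J^T S^-1 J)^-1, which can then be checked against Monte Carlo variation much as the paper validates against random variation. Everything below (J, sigma, the parameter values) is an invented toy problem.

```python
import numpy as np

rng = np.random.default_rng(5)
J = rng.standard_normal((20, 3))           # 20 signals, 3 model parameters
sigma = 0.1                                # signal noise level
S_inv = np.eye(20) / sigma**2
C_p = np.linalg.inv(J.T @ S_inv @ J)       # propagated parameter covariance

# Validate against Monte Carlo "random variation" of the reconstruction.
p_true = np.array([1.0, -0.5, 2.0])
samples = []
for _ in range(2000):
    s = J @ p_true + sigma * rng.standard_normal(20)   # noisy signals
    p_hat, *_ = np.linalg.lstsq(J, s, rcond=None)      # reconstructed params
    samples.append(p_hat)
C_mc = np.cov(np.array(samples).T)
```

    Agreement between C_p and C_mc is the linear analogue of validating propagated uncertainty against the scatter of repeated reconstructions.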

  3. Uncertainty Analysis in 3D Equilibrium Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cianciosa, Mark R.; Hanson, James D.; Maurer, David A.

    Reconstruction is an inverse process where a parameter space is searched to locate a set of parameters with the highest probability of describing experimental observations. Due to systematic errors and uncertainty in experimental measurements, this optimal set of parameters will contain some associated uncertainty. This uncertainty in the optimal parameters leads to uncertainty in models derived using those parameters. V3FIT is a three-dimensional (3D) equilibrium reconstruction code that propagates uncertainty from the input signals, to the reconstructed parameters, and to the final model. In this paper, we describe the methods used to propagate uncertainty in V3FIT. Using the results of whole shot 3D equilibrium reconstruction of the Compact Toroidal Hybrid, this propagated uncertainty is validated against the random variation in the resulting parameters. Two different model parameterizations demonstrate how the uncertainty propagation can indicate the quality of a reconstruction. As a proxy for random sampling, the whole shot reconstruction results in a time interval that will be used to validate the propagated uncertainty from a single time slice.

  4. Waste receiving and processing facility module 1 data management system software project management plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, R.E.

    1994-11-02

    This document provides the software development plan for the Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store, and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  5. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions.

    PubMed

    Jha, Abhinav K; Barrett, Harrison H; Frey, Eric C; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A

    2015-09-21

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and
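
    The measurement/null decomposition described above can be sketched for a discrete linear operator. The paper's photon-processing operator acts on a continuous domain; this finite-dimensional analogue only illustrates what "measurement component" and "null component" mean.

```python
import numpy as np

# Decompose an object into measurement-space and null-space components
# of a linear imaging operator H via the SVD.
rng = np.random.default_rng(6)
H = rng.standard_normal((10, 30))          # 10 measurements, 30-dim object
U, s, Vt = np.linalg.svd(H, full_matrices=True)

f = rng.standard_normal(30)                # object
V_meas = Vt[:10].T                         # right singular vectors with s > 0
f_meas = V_meas @ (V_meas.T @ f)           # measurement component
f_null = f - f_meas                        # null component: invisible to H

resid = np.linalg.norm(H @ f_null)         # H annihilates the null component
```

    A photon-counting system with this H can never distinguish f from f_meas; the abstract's point is that photon-processing operators need not have such an infinite-dimensional null space.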

  6. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Barrett, Harrison H.; Frey, Eric C.; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A.

    2015-09-01

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and

  7. 42 CFR 82.10 - Overview of the dose reconstruction process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... doses using techniques discussed in § 82.16. Once the resulting data set is complete, NIOSH will.... Additionally, NIOSH may compile data, and information from NIOSH records that may contribute to the dose... which dose and exposure monitoring data is incomplete or insufficient for dose reconstruction. (h) NIOSH...

  8. 42 CFR 82.10 - Overview of the dose reconstruction process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... doses using techniques discussed in § 82.16. Once the resulting data set is complete, NIOSH will.... Additionally, NIOSH may compile data, and information from NIOSH records that may contribute to the dose... which dose and exposure monitoring data is incomplete or insufficient for dose reconstruction. (h) NIOSH...

  9. 42 CFR 82.10 - Overview of the dose reconstruction process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... doses using techniques discussed in § 82.16. Once the resulting data set is complete, NIOSH will.... Additionally, NIOSH may compile data, and information from NIOSH records that may contribute to the dose... which dose and exposure monitoring data is incomplete or insufficient for dose reconstruction. (h) NIOSH...

  10. 42 CFR 82.10 - Overview of the dose reconstruction process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... doses using techniques discussed in § 82.16. Once the resulting data set is complete, NIOSH will.... Additionally, NIOSH may compile data, and information from NIOSH records that may contribute to the dose... which dose and exposure monitoring data is incomplete or insufficient for dose reconstruction. (h) NIOSH...

  11. 42 CFR 82.10 - Overview of the dose reconstruction process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... doses using techniques discussed in § 82.16. Once the resulting data set is complete, NIOSH will.... Additionally, NIOSH may compile data, and information from NIOSH records that may contribute to the dose... which dose and exposure monitoring data is incomplete or insufficient for dose reconstruction. (h) NIOSH...

  12. Development of a prototype chest digital tomosynthesis (CDT) R/F system with fast image reconstruction using graphics processing unit (GPU) programming

    NASA Astrophysics Data System (ADS)

    Choi, Sunghoon; Lee, Seungwan; Lee, Haenghwa; Lee, Donghoon; Choi, Seungyeon; Shin, Jungwook; Seo, Chang-Woo; Kim, Hee-Joung

    2017-03-01

    Digital tomosynthesis offers the advantage of low radiation doses compared to conventional computed tomography (CT) by utilizing small numbers of projections (~80) acquired over a limited angular range. It produces 3D volumetric data, although there are artifacts due to incomplete sampling. Based upon these characteristics, we developed a prototype digital tomosynthesis R/F system for applications in chest imaging. Our prototype chest digital tomosynthesis (CDT) R/F system contains an X-ray tube with high power R/F pulse generator, flat-panel detector, R/F table, electromechanical radiographic subsystems including a precise motor controller, and a reconstruction server. For image reconstruction, users select between analytic and iterative reconstruction methods. Our reconstructed images of Catphan700 and LUNGMAN phantoms clearly and rapidly described the internal structures of phantoms using graphics processing unit (GPU) programming. Contrast-to-noise ratio (CNR) values of the CTP682 module of Catphan700 were higher in images using a simultaneous algebraic reconstruction technique (SART) than in those using filtered back-projection (FBP) for all materials by factors of 2.60, 3.78, 5.50, 2.30, 3.70, and 2.52 for air, lung foam, low density polyethylene (LDPE), Delrin® (acetal homopolymer resin), bone 50% (hydroxyapatite), and Teflon, respectively. Total elapsed times for producing 3D volume were 2.92 s and 86.29 s on average for FBP and SART (20 iterations), respectively. The times required for reconstruction were clinically feasible. Moreover, the total radiation dose from our system (5.68 mGy) was lower than that of a conventional chest CT scan. Consequently, our prototype tomosynthesis R/F system represents an important advance in digital tomosynthesis applications.
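
    The CNR comparison above can be computed with one common definition, CNR = |mean_ROI - mean_BG| / std_BG; the paper may use a variant. The sketch below applies it to a synthetic contrast insert, with all numbers invented.

```python
import numpy as np

def cnr(image, roi_mask, bg_mask):
    # Contrast-to-noise ratio between a region of interest and a
    # background region: |mean_ROI - mean_BG| / std_BG.
    return abs(image[roi_mask].mean() - image[bg_mask].mean()) / image[bg_mask].std()

rng = np.random.default_rng(7)
img = rng.normal(0.0, 1.0, (64, 64))       # background noise, std = 1
img[20:30, 20:30] += 5.0                   # synthetic contrast insert

roi = np.zeros(img.shape, dtype=bool); roi[20:30, 20:30] = True
bg = np.zeros(img.shape, dtype=bool); bg[40:60, 40:60] = True
value = cnr(img, roi, bg)                  # ~ 5 for this synthetic case
```

    Applied to matched SART and FBP reconstructions of the same module, the ratio of the two CNR values gives the per-material improvement factors quoted in the abstract.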

  13. Every factor helps: Rapid Ptychographic Reconstruction

    NASA Astrophysics Data System (ADS)

    Nashed, Youssef

    2015-03-01

    Recent advances in microscopy, specifically higher spatial resolution and data acquisition rates, require faster and more robust phase retrieval reconstruction methods. Ptychography is a phase retrieval technique for reconstructing the complex transmission function of a specimen from a sequence of diffraction patterns in visible light, X-ray, and electron microscopes. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes. Waiting to postprocess datasets offline results in missed opportunities. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs). A final specimen reconstruction is then achieved by different techniques to merge sub-dataset results into a single complex phase and amplitude image. Results are shown on a simulated specimen and real datasets from X-ray experiments conducted at a synchrotron light source.

  14. Limited view angle iterative CT reconstruction

    NASA Astrophysics Data System (ADS)

    Kisner, Sherman J.; Haneda, Eri; Bouman, Charles A.; Skatter, Sondre; Kourinny, Mikhail; Bedford, Simon

    2012-03-01

    Computed Tomography (CT) is widely used in transportation security to screen baggage for potential threats. For example, many airports use X-ray CT to scan the checked baggage of airline passengers. The resulting reconstructions are then used for both automated and human detection of threats. Recently, there has been growing interest in the use of model-based reconstruction techniques in CT security systems. Model-based reconstruction offers a number of potential advantages over more traditional direct reconstruction methods such as filtered backprojection (FBP). Perhaps the greatest is the potential to reduce reconstruction artifacts when non-traditional scan geometries are used. For example, FBP tends to produce very severe streaking artifacts when applied to limited-view data, which can adversely affect subsequent processing such as segmentation and detection. In this paper, we investigate the use of model-based reconstruction in conjunction with limited-view scanning architectures, and we illustrate the value of these methods using transportation security examples. The advantage of limited-view architectures is that they have the potential to reduce the cost and complexity of a scanning system; their disadvantage is that limited-view data can result in structured artifacts in reconstructed images. Our reconstruction method depends on the formulation of both a forward projection model for the system and a prior model that accounts for the contents and densities of typical baggage. In order to evaluate our new method, we use realistic models of baggage with randomly inserted simple simulated objects. Using this approach, we show that model-based reconstruction can substantially reduce artifacts and improve important metrics of image quality such as the accuracy of the estimated CT numbers.
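    The model-based approach described above combines a forward projection model with a prior model in a single cost function. A toy sketch of that idea, with the forward model reduced to a random matrix and the baggage prior to a simple quadratic penalty (both are illustrative assumptions, far simpler than the paper's models):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 32, 20                      # unknowns vs. limited-view measurements
A = rng.normal(size=(m, n))        # stand-in for the forward projection model
x_true = np.zeros(n)
x_true[10:16] = 1.0                # a simple "object"
y = A @ x_true + 0.01 * rng.normal(size=m)

lam = 0.1                                       # prior strength (assumed)
step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)  # 1 / Lipschitz constant
x = np.zeros(n)
for _ in range(500):
    # Gradient of the MAP cost 0.5*||A x - y||^2 + 0.5*lam*||x||^2
    grad = A.T @ (A @ x - y) + lam * x
    x = x - step * grad

residual = np.linalg.norm(A @ x - y)
```

    Gradient descent on this quadratic cost stands in for the far more elaborate priors and optimizers used in practice.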

  15. Interior reconstruction method based on rotation-translation scanning model.

    PubMed

    Wang, Xianchao; Tang, Ziyue; Yan, Bin; Li, Lei; Bao, Shanglian

    2014-01-01

    In various applications of computed tomography (CT), it is common that the reconstructed object extends beyond the field of view (FOV), or that we intend to use a FOV which covers only the region of interest (ROI) for the sake of reducing radiation dose. These imaging situations often lead to interior reconstruction problems, which are difficult cases in CT reconstruction because the projection data are truncated at every view angle. In this paper, an interior reconstruction method is developed based on a rotation-translation (RT) scanning model. The method is implemented by first scanning the reconstruction region, and then, after translating the rotation centre, scanning a small region outside the support of the object. The differentiated backprojection (DBP) images of the reconstruction region and of the small region outside the object can be obtained from the two scans without a data rebinning process. Finally, the projection onto convex sets (POCS) algorithm is applied to reconstruct the interior region. Numerical simulations are conducted to validate the proposed reconstruction method.
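    The POCS step alternates projections onto convex constraint sets until a point in their intersection is found. A minimal sketch with two generic sets, data consistency and known object bounds (the random operator, bounds, and iteration count are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 16, 10
A = rng.normal(size=(m, n))        # stand-in measurement operator
x_true = rng.uniform(0.0, 1.0, n)
y = A @ x_true                     # consistent (noise-free) data

A_pinv = np.linalg.pinv(A)
x = np.zeros(n)
for _ in range(200):
    # Projection onto the affine set {x : A x = y} (data consistency)
    x = x + A_pinv @ (y - A @ x)
    # Projection onto the convex set of images with values in [0, 1]
    x = np.clip(x, 0.0, 1.0)

residual = np.linalg.norm(A @ x - y)
```

    Because both sets are convex and their intersection is non-empty (it contains x_true), the alternating projections converge toward a point satisfying both constraints.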

  16. Mastectomy Skin Necrosis After Breast Reconstruction: A Comparative Analysis Between Autologous Reconstruction and Implant-Based Reconstruction.

    PubMed

    Sue, Gloria R; Lee, Gordon K

    2018-05-01

    Mastectomy skin necrosis is a significant problem after breast reconstruction. We sought to perform a comparative analysis on this complication between patients undergoing autologous breast reconstruction and patients undergoing 2-stage expander implant breast reconstruction. A retrospective review was performed on consecutive patients undergoing autologous breast reconstruction or 2-stage expander implant breast reconstruction by the senior author from 2006 through 2015. Patient demographic factors including age, body mass index, history of diabetes, history of smoking, and history of radiation to the breast were collected. Our primary outcome measure was mastectomy skin necrosis. Fisher exact test was used for statistical analysis between the 2 patient cohorts. The treatment patterns of mastectomy skin necrosis were then analyzed. We identified 204 patients who underwent autologous breast reconstruction and 293 patients who underwent 2-stage expander implant breast reconstruction. Patients undergoing autologous breast reconstruction were older, heavier, more likely to have diabetes, and more likely to have had prior radiation to the breast compared with patients undergoing implant-based reconstruction. The incidence of mastectomy skin necrosis was 30.4% of patients in the autologous group compared with only 10.6% of patients in the tissue expander group (P < 0.001). The treatment of this complication differed between these 2 patient groups. In general, those with autologous reconstructions were treated with more conservative means. Although 37.1% of patients were treated successfully with local wound care in the autologous group, only 3.2% were treated with local wound care in the tissue expander group (P < 0.001). Less than half (29.0%) of patients in the autologous group were treated with an operative intervention for this complication compared with 41.9% in the implant-based group (P = 0.25). 
Mastectomy skin necrosis is significantly more likely to occur after
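    The group comparisons above rely on the Fisher exact test. A self-contained sketch using counts back-calculated from the reported percentages (30.4% of 204 ≈ 62 and 10.6% of 293 ≈ 31, so the numbers are approximate), with the two-sided p-value computed from the hypergeometric distribution:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more probable than the observed one.
    """
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def hyper(k):  # P(top-left cell = k) with all margins fixed
        return comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)

    p_obs = hyper(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(hyper(k) for k in range(lo, hi + 1)
               if hyper(k) <= p_obs * (1 + 1e-9))

# Approximate necrosis counts: 62/204 autologous vs. 31/293 implant-based
p = fisher_exact_two_sided(62, 204 - 62, 31, 293 - 31)
```

    With these counts the p-value is far below 0.001, matching the reported significance of the necrosis-rate difference.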

  17. Diagnostic delay amongst tuberculosis patients in Jogjakarta Province, Indonesia is related to the quality of services in DOTS facilities.

    PubMed

    Ahmad, Riris Andono; Mahendradhata, Yodi; Utarini, Adi; de Vlas, Sake J

    2011-04-01

    To understand determinants of care-seeking patterns and diagnostic delay amongst tuberculosis (TB) patients diagnosed at directly observed treatment, short-course (DOTS) facilities in Jogjakarta, Indonesia. Cross-sectional survey amongst newly diagnosed TB patients in 89 DOTS facilities, whose history of care-seeking was reconstructed through retrospective interviews; data on socio-demographic determinants, onset of TB symptoms, types of health facility visited, and the duration of each care-seeking action were recorded. Two hundred and fifty-three TB patients were included in the study; their median patient delay was 1 week and their median total diagnostic delay was 5.4 weeks. The median number of visits was 4. Most of the patients' socio-demographic determinants were not associated with care-seeking patterns, and no socio-demographic determinants were associated with the duration of diagnostic delay. More than 60% of TB patients started their care-seeking processes outside DOTS facilities, but the number of visits to DOTS facilities was greater over the course of care-seeking. Surprisingly, a patient's immediate visit to a DOTS facility did not correspond to a shorter diagnostic delay. Diagnostic delay in Jogjakarta province was not associated with patients' socio-demographic factors, but rather with the health system providing DOTS services. This suggests that strengthening the health system and improving diagnostic quality within DOTS services is now a more rational strategy than expanding the TB programme to engage more providers. © 2010 Blackwell Publishing Ltd.

  18. Microgravity and Materials Processing Facility study (MMPF): Requirements and Analyses of Commercial Operations (RACO) preliminary data release

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This requirements and analyses of commercial operations (RACO) study data release reflects the current status of research activities of the Microgravity and Materials Processing Facility under Modification No. 21 to NASA/MSFC Contract NAS8-36122. Section 1 includes 65 commercial space processing projects suitable for deployment aboard the Space Station. Section 2 contains reports of the R:BASE (TM) electronic data base being used in the study, synopses of the experiments, and a summary of data on the experimental facilities. Section 3 is a discussion of video and data compression techniques used as well as a mission timeline analysis.

  19. Enterobacteriaceae and related organisms recovered from biofilms in a commercial shell egg processing facility.

    USDA-ARS?s Scientific Manuscript database

    During six visits, biofilms from egg contact and non-contact surfaces in a commercial shell egg processing facility were sampled. Thirty-five different sample sites were selected: Pre-wash and wash tanks (lids, screens, tank interiors, nozzle guards), post-wash spindles, blower filters, belts (far...

  20. Image improvement and three-dimensional reconstruction using holographic image processing

    NASA Technical Reports Server (NTRS)

    Stroke, G. W.; Halioua, M.; Thon, F.; Willasch, D. H.

    1977-01-01

    Holographic computing principles make possible image improvement and synthesis in many cases of current scientific and engineering interest. Examples are given for the improvement of resolution in electron microscopy and 3-D reconstruction in electron microscopy and X-ray crystallography, following an analysis of optical versus digital computing in such applications.

  1. Reconstruction of biofilm images: combining local and global structural parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resat, Haluk; Renslow, Ryan S.; Beyenal, Haluk

    2014-10-20

    Digitized images can be used for quantitative comparison of biofilms grown under different conditions. Using biofilm image reconstruction, it was previously found that biofilms with a completely different look can have nearly identical structural parameters and that the most commonly utilized global structural parameters were not sufficient to uniquely define these biofilms. Here, additional local and global parameters are introduced to show that these parameters considerably increase the reliability of the image reconstruction process. Assessment using human evaluators indicated that the correct identification rate of the reconstructed images increased from 50% to 72% with the introduction of the new parameters into the reconstruction procedure. An expanded set of parameters especially improved the identification of biofilm structures with internal orientational features and of structures in which colony sizes and spatial locations varied. Hence, the newly introduced structural parameter sets helped to better classify the biofilms by incorporating finer local structural details into the reconstruction process.

  2. Image reconstruction by domain-transform manifold learning

    NASA Astrophysics Data System (ADS)

    Zhu, Bo; Liu, Jeremiah Z.; Cauley, Stephen F.; Rosen, Bruce R.; Rosen, Matthew S.

    2018-03-01

    Image reconstruction is essential for imaging applications across the physical and life sciences, including optical and radar systems, magnetic resonance imaging, X-ray computed tomography, positron emission tomography, ultrasound imaging and radio astronomy. During image acquisition, the sensor encodes an intermediate representation of an object in the sensor domain, which is subsequently reconstructed into an image by an inversion of the encoding function. Image reconstruction is challenging because analytic knowledge of the exact inverse transform may not exist a priori, especially in the presence of sensor non-idealities and noise. Thus, the standard reconstruction approach involves approximating the inverse function with multiple ad hoc stages in a signal processing chain, the composition of which depends on the details of each acquisition strategy, and often requires expert parameter tuning to optimize reconstruction performance. Here we present a unified framework for image reconstruction—automated transform by manifold approximation (AUTOMAP)—which recasts image reconstruction as a data-driven supervised learning task that allows a mapping between the sensor and the image domain to emerge from an appropriate corpus of training data. We implement AUTOMAP with a deep neural network and exhibit its flexibility in learning reconstruction transforms for various magnetic resonance imaging acquisition strategies, using the same network architecture and hyperparameters. We further demonstrate that manifold learning during training results in sparse representations of domain transforms along low-dimensional data manifolds, and observe superior immunity to noise and a reduction in reconstruction artefacts compared with conventional handcrafted reconstruction methods. In addition to improving the reconstruction performance of existing acquisition methodologies, we anticipate that AUTOMAP and other learned reconstruction approaches will accelerate the development
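    The core idea of AUTOMAP, learning the sensor-to-image transform from example pairs rather than deriving it analytically, can be illustrated at toy scale by replacing the deep network with a learned linear map (the encoding matrix, shapes, and training set here are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16                            # toy "image" size
E = rng.normal(size=(n, n))       # sensor-domain encoding (unknown to learner)

# Training corpus: images paired with their sensor-domain measurements
X_train = rng.normal(size=(500, n))
Y_train = X_train @ E.T

# Learn the reconstruction transform purely from data (least squares);
# AUTOMAP learns a nonlinear version of this map with a deep network.
W, *_ = np.linalg.lstsq(Y_train, X_train, rcond=None)

# Reconstruct an unseen image from its measurements alone
x_new = rng.normal(size=n)
x_rec = (x_new @ E.T) @ W
rel_err = np.linalg.norm(x_rec - x_new) / np.linalg.norm(x_new)
```

    The learner never sees E directly; the mapping from measurements back to images emerges entirely from the training pairs, which is the property AUTOMAP exploits with deep networks for nonlinear, noisy encodings.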

  3. A new method for reconstruction of solar irradiance

    NASA Astrophysics Data System (ADS)

    Privalsky, Victor

    2018-07-01

    The purpose of this research is to show how time series should be reconstructed, using as an example the data on total solar irradiance (TSI) of the Earth and on sunspot numbers (SSN) since 1749. The traditional approach through regression equation(s) is designed for time-invariant vectors of random variables and is not applicable to time series, which are random functions of time. The autoregressive reconstruction (ARR) method suggested here requires fitting a multivariate stochastic difference equation to the target/proxy time series. The reconstruction is done through the scalar equation for the target time series with the white noise term excluded. The time series approach is shown to provide a better reconstruction of TSI than the correlation/regression method. A reconstruction criterion is introduced which allows one to define in advance the achievable level of success in the reconstruction. The conclusion is that time series, including the total solar irradiance, cannot be reconstructed properly unless the data are treated as sample records of random processes and analyzed in both the time and frequency domains.
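    The ARR procedure, fit a stochastic difference equation to the target/proxy pair and then run the target's scalar equation with the noise term excluded, can be sketched on synthetic data (the AR(1) structure and coefficients below are illustrative assumptions, not the paper's fitted model):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 2000
x = np.zeros(T)   # target series (stand-in for TSI)
y = np.zeros(T)   # proxy series (stand-in for SSN)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.3 * y[t - 1] + 0.1 * rng.normal()

# Fit the scalar stochastic difference equation for the target
Z = np.column_stack([x[:-1], y[:-1]])
coef, *_ = np.linalg.lstsq(Z, x[1:], rcond=None)

# Reconstruct by running the fitted equation with the noise term excluded
x_rec = np.zeros(T)
for t in range(1, T):
    x_rec[t] = coef[0] * x_rec[t - 1] + coef[1] * y[t - 1]

corr = np.corrcoef(x[100:], x_rec[100:])[0, 1]   # skip the transient
```

    Because the fitted equation captures the lagged dependence on both the target and the proxy, the noise-free run tracks the target far better than a static regression of x on y would.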

  4. An improved schlieren method for measurement and automatic reconstruction of the far-field focal spot

    PubMed Central

    Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye

    2017-01-01

    The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency when the final focal spot is merged manually, thereby reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct the high dynamic-range image of far-field focal spots and improve the reconstruction accuracy and efficiency. First, a detection method based on weak light beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template matching algorithm, a circle the same size as the schlieren ball was dug from the main lobe cutting image and used to change the relative region of the main lobe cutting image within a 100×100 pixel region. The position that had the largest correlation coefficient between the side lobe cutting image and the main lobe cutting image when a circle was dug was identified as the best matching point. Finally, the least squares method was used to fit the center of the side lobe schlieren small ball, and the error was less than 1 pixel. The experimental results show that this method enables the accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than traditional reconstruction methods based on manual splicing, this method is less sensitive to the efficiency of focal-spot reconstruction and thus offers better experimental precision. PMID:28207758
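    The self-correlation template-matching step amounts to an exhaustive search for the offset maximizing the normalized cross-correlation coefficient between the two cut-out images. A minimal sketch on synthetic data (patch size and search window are illustrative assumptions, not the paper's 100×100 setup):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(5)
scene = rng.normal(size=(60, 60))         # stand-in for the main-lobe image
template = scene[17:27, 32:42].copy()     # 10x10 patch to locate

best, best_pos = -2.0, None
for i in range(scene.shape[0] - 10 + 1):  # exhaustive search over offsets
    for j in range(scene.shape[1] - 10 + 1):
        c = ncc(scene[i:i + 10, j:j + 10], template)
        if c > best:
            best, best_pos = c, (i, j)
```

    The offset with the largest correlation coefficient is taken as the best matching point, exactly the selection rule described above.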

  5. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process with a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF, with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  6. Containerless Processing in Reduced Gravity Using the TEMPUS Facility

    NASA Technical Reports Server (NTRS)

    Roger, Jan R.; Robinson, Michael B.

    1996-01-01

    Containerless processing provides a high purity environment for the study of high-temperature, very reactive materials. It is an important method which provides access to the metastable state of an undercooled melt. In the absence of container walls, the nucleation rate is greatly reduced and undercooling up to (Tm-Tn)/Tm approx. 0.2 can be obtained, where Tm and Tn are the melting and nucleation temperatures, respectively. Electromagnetic levitation represents a method particularly well-suited for the study of metallic melts. The TEMPUS facility is a research instrument designed to perform electromagnetic levitation studies in reduced gravity. It provides temperatures up to 2600 C, levitation of several grams of material and access to the undercooled state for an extended period of time (up to hours).

  7. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C.; Edwards, T.

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten foot tall by two foot diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  8. Cryo-EM Structure Determination Using Segmented Helical Image Reconstruction.

    PubMed

    Fromm, S A; Sachse, C

    2016-01-01

    Treating helices as single-particle-like segments followed by helical image reconstruction has become the method of choice for high-resolution structure determination of well-ordered helical viruses as well as flexible filaments. In this review, we will illustrate how the combination of latest hardware developments with optimized image processing routines have led to a series of near-atomic resolution structures of helical assemblies. Originally, the treatment of helices as a sequence of segments followed by Fourier-Bessel reconstruction revealed the potential to determine near-atomic resolution structures from helical specimens. In the meantime, real-space image processing of helices in a stack of single particles was developed and enabled the structure determination of specimens that resisted classical Fourier helical reconstruction and also facilitated high-resolution structure determination. Despite the progress in real-space analysis, the combination of Fourier and real-space processing is still commonly used to better estimate the symmetry parameters as the imposition of the correct helical symmetry is essential for high-resolution structure determination. Recent hardware advancement by the introduction of direct electron detectors has significantly enhanced the image quality and together with improved image processing procedures has made segmented helical reconstruction a very productive cryo-EM structure determination method. © 2016 Elsevier Inc. All rights reserved.

  9. Fetal brain volumetry through MRI volumetric reconstruction and segmentation

    PubMed Central

    Estroff, Judy A.; Barnewolt, Carol E.; Connolly, Susan A.; Warfield, Simon K.

    2013-01-01

    Purpose Fetal MRI volumetry is a useful technique but it is limited by a dependency upon motion-free scans, tedious manual segmentation, and spatial inaccuracy due to thick-slice scans. An image processing pipeline that addresses these limitations was developed and tested. Materials and methods The principal sequences acquired in fetal MRI clinical practice are multiple orthogonal single-shot fast spin echo scans. State-of-the-art image processing techniques were used for inter-slice motion correction and super-resolution reconstruction of high-resolution volumetric images from these scans. The reconstructed volume images were processed with intensity non-uniformity correction and the fetal brain extracted by using supervised automated segmentation. Results Reconstruction, segmentation and volumetry of the fetal brains for a cohort of twenty-five clinically acquired fetal MRI scans was done. Performance metrics for volume reconstruction, segmentation and volumetry were determined by comparing to manual tracings in five randomly chosen cases. Finally, analysis of the fetal brain and parenchymal volumes was performed based on the gestational age of the fetuses. Conclusion The image processing pipeline developed in this study enables volume rendering and accurate fetal brain volumetry by addressing the limitations of current volumetry techniques, which include dependency on motion-free scans, manual segmentation, and inaccurate thick-slice interpolation. PMID:20625848

  10. Reconstructing European forest management from 1600 to 2010

    NASA Astrophysics Data System (ADS)

    McGrath, M. J.; Luyssaert, S.; Meyfroidt, P.; Kaplan, J. O.; Buergi, M.; Chen, Y.; Erb, K.; Gimmi, U.; McInerney, D.; Naudts, K.; Otto, J.; Pasztor, F.; Ryder, J.; Schelhaas, M.-J.; Valade, A.

    2015-04-01

    European forest use for fuel, timber and food dates back to pre-Roman times. Century-scale ecological processes and their legacy effects require accounting for forest management when studying today's forest carbon sink. Forest management reconstructions that are used to drive land surface models are one way to quantify the impact of both historical and today's large scale application of forest management on today's forest-related carbon sink and surface climate. In this study we reconstruct European forest management from 1600 to 2010 making use of diverse approaches, data sources and assumptions. Between 1600 and 1828, a demand-supply approach was used in which wood supply was reconstructed based on estimates of historical annual wood increment and land cover reconstructions. For the same period demand estimates accounted for the fuelwood needed in households, wood used in food processing, charcoal used in metal smelting and salt production, timber for construction and population estimates. Comparing estimated demand and supply resulted in a spatially explicit reconstruction of the share of forests under coppice, high stand management and forest left unmanaged. For the reconstruction between 1829 and 2010 a supply-driven back-casting method was used. The method used age reconstructions from the years 1950 to 2010 as its starting point. Our reconstruction reproduces the most important changes in forest management between 1600 and 2010: (1) an increase of 593 000 km2 in conifers at the expense of deciduous forest (decreasing by 538 000 km2), (2) a 612 000 km2 decrease in unmanaged forest, (3) a 152 000 km2 decrease in coppice management, (4) a 818 000 km2 increase in high stand management, and (5) the rise and fall of litter raking which at its peak in 1853 removed 50 Tg dry litter per year.

  11. Markov prior-based block-matching algorithm for superdimension reconstruction of porous media

    NASA Astrophysics Data System (ADS)

    Li, Yang; He, Xiaohai; Teng, Qizhi; Feng, Junxi; Wu, Xiaohong

    2018-04-01

    A superdimension reconstruction algorithm is used for the reconstruction of three-dimensional (3D) structures of a porous medium based on a single two-dimensional image. The algorithm borrows the concepts of "blocks," "learning," and "dictionary" from learning-based superresolution reconstruction and applies them to the 3D reconstruction of a porous medium. In the neighborhood-matching process of the conventional superdimension reconstruction algorithm, the Euclidean distance is used as a criterion, although it may not really reflect the structural correlation between adjacent blocks in an actual situation. Hence, in this study, regularization terms are adopted as prior knowledge in the reconstruction process, and a Markov prior-based block-matching algorithm for superdimension reconstruction is developed for more accurate reconstruction. The algorithm simultaneously takes into consideration the probabilistic relationship between the already reconstructed blocks in three different perpendicular directions (x, y, and z) and the block to be reconstructed, and the maximum value of the probability product of the candidate blocks (as found in the dictionary for the three directions) is adopted as the basis for the final block selection. Using this approach, the problem of an imprecise spatial structure caused by a point simulation can be overcome. The problem of artifacts in the reconstructed structure is also addressed through the addition of hard data and by neighborhood matching. To verify the improved reconstruction accuracy of the proposed method, the statistical and morphological features of the results from the proposed method and the traditional superdimension reconstruction method are compared with those of the target system. The proposed superdimension reconstruction algorithm is confirmed to enable a more accurate reconstruction of the target system while also eliminating artifacts.
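    The final block-selection rule, choosing the candidate that maximizes the product of its match probabilities in the three perpendicular directions, reduces to a one-line argmax. A sketch with hypothetical candidate probabilities (all values below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n_candidates = 50   # candidate blocks retrieved from the dictionary

# Hypothetical probabilities of each candidate given the already
# reconstructed neighbouring blocks in the x, y and z directions
p_x = rng.uniform(size=n_candidates)
p_y = rng.uniform(size=n_candidates)
p_z = rng.uniform(size=n_candidates)

# Joint score is the product over the three perpendicular directions;
# the candidate with the maximum product is selected for placement
score = p_x * p_y * p_z
best = int(np.argmax(score))
```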

  12. Defense Waste Processing Facility Simulant Chemical Processing Cell Studies for Sludge Batch 9

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Tara E.; Newell, J. David; Woodham, Wesley H.

    The Savannah River National Laboratory (SRNL) received a technical task request from Defense Waste Processing Facility (DWPF) and Saltstone Engineering to perform simulant tests to support the qualification of Sludge Batch 9 (SB9) and to develop the flowsheet for SB9 in the DWPF. These efforts pertained to the DWPF Chemical Process Cell (CPC). CPC experiments were performed using SB9 simulant (SB9A) to qualify SB9 for sludge-only and coupled processing using the nitric-formic flowsheet in the DWPF. Two simulant batches were prepared, one representing SB8 Tank 40H and another representing SB9 Tank 51H. The simulant used for SB9 qualification testing was prepared by blending the SB8 Tank 40H and SB9 Tank 51H simulants. The blended simulant is referred to as SB9A. Eleven CPC experiments were run with an acid stoichiometry ranging between 105% and 145% of the Koopman minimum acid equation (KMA), which is equivalent to 109.7% and 151.5% of the Hsu minimum acid factor. Three runs were performed in the 1L laboratory scale setup, whereas the remainder were in the 4L laboratory scale setup. Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles were performed on nine of the eleven. The other two were SRAT cycles only. One coupled flowsheet and one extended run were performed for SRAT and SME processing. Samples of the condensate, sludge, and off-gas were taken to monitor the chemistry of the CPC experiments.

  13. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Qing; Wang, Jiang; Yu, Haitao

    Mathematical models provide a mathematical description of neuron activity, which can better understand and quantify neural computations and corresponding biophysical mechanisms evoked by stimulus. In this paper, based on the output spike train evoked by the acupuncture mechanical stimulus, we present two different levels of models to describe the input-output system to achieve the reconstruction of neuronal input. The reconstruction process is divided into two steps: First, considering the neuronal spiking event as a Gamma stochastic process. The scale parameter and the shape parameter of Gamma process are, respectively, defined as two spiking characteristics, which are estimated by a state-space method. Then, leaky integrate-and-fire (LIF) model is used to mimic the response system and the estimated spiking characteristics are transformed into two temporal input parameters of LIF model, through two conversion formulas. We test this reconstruction method by three different groups of simulation data. All three groups of estimates reconstruct input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus conditions, estimated input parameters have an obvious difference. The higher the frequency of the acupuncture stimulus is, the higher the accuracy of reconstruction is.

  14. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    NASA Astrophysics Data System (ADS)

    Qin, Qing; Wang, Jiang; Yu, Haitao; Deng, Bin; Chan, Wai-lok

    2016-06-01

    Mathematical models provide a mathematical description of neuron activity, which can better understand and quantify neural computations and corresponding biophysical mechanisms evoked by stimulus. In this paper, based on the output spike train evoked by the acupuncture mechanical stimulus, we present two different levels of models to describe the input-output system to achieve the reconstruction of neuronal input. The reconstruction process is divided into two steps: First, considering the neuronal spiking event as a Gamma stochastic process. The scale parameter and the shape parameter of Gamma process are, respectively, defined as two spiking characteristics, which are estimated by a state-space method. Then, leaky integrate-and-fire (LIF) model is used to mimic the response system and the estimated spiking characteristics are transformed into two temporal input parameters of LIF model, through two conversion formulas. We test this reconstruction method by three different groups of simulation data. All three groups of estimates reconstruct input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus conditions, estimated input parameters have an obvious difference. The higher the frequency of the acupuncture stimulus is, the higher the accuracy of reconstruction is.
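    The second step of the method maps estimated spiking characteristics onto a leaky integrate-and-fire (LIF) model. A minimal Euler-integration sketch of an LIF neuron (parameter values and the constant-current drive are illustrative assumptions, not the paper's conversion formulas):

```python
def lif_spike_times(i_input, duration, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Euler-integrated leaky integrate-and-fire neuron with constant drive.

    dv/dt = (-v + i_input) / tau; emit a spike and reset v when v >= v_th.
    Parameter values here are illustrative defaults.
    """
    v, t, spikes = 0.0, 0.0, []
    while t < duration:
        v += dt * (-v + i_input) / tau
        if v >= v_th:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

# A stronger input current drives the model to fire more often
low = len(lif_spike_times(1.5, duration=200.0))
high = len(lif_spike_times(3.0, duration=200.0))
```

    In the paper's pipeline, the input parameters of such a model are what the state-space estimates of the Gamma-process characteristics are converted into.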

  15. Reconstruction and Simulation of Neocortical Microcircuitry.

    PubMed

    Markram, Henry; Muller, Eilif; Ramaswamy, Srikanth; Reimann, Michael W; Abdellah, Marwan; Sanchez, Carlos Aguado; Ailamaki, Anastasia; Alonso-Nanclares, Lidia; Antille, Nicolas; Arsever, Selim; Kahou, Guy Antoine Atenekeng; Berger, Thomas K; Bilgili, Ahmet; Buncic, Nenad; Chalimourda, Athanassia; Chindemi, Giuseppe; Courcol, Jean-Denis; Delalondre, Fabien; Delattre, Vincent; Druckmann, Shaul; Dumusc, Raphael; Dynes, James; Eilemann, Stefan; Gal, Eyal; Gevaert, Michael Emiel; Ghobril, Jean-Pierre; Gidon, Albert; Graham, Joe W; Gupta, Anirudh; Haenel, Valentin; Hay, Etay; Heinis, Thomas; Hernando, Juan B; Hines, Michael; Kanari, Lida; Keller, Daniel; Kenyon, John; Khazen, Georges; Kim, Yihwa; King, James G; Kisvarday, Zoltan; Kumbhar, Pramod; Lasserre, Sébastien; Le Bé, Jean-Vincent; Magalhães, Bruno R C; Merchán-Pérez, Angel; Meystre, Julie; Morrice, Benjamin Roy; Muller, Jeffrey; Muñoz-Céspedes, Alberto; Muralidhar, Shruti; Muthurasa, Keerthan; Nachbaur, Daniel; Newton, Taylor H; Nolte, Max; Ovcharenko, Aleksandr; Palacios, Juan; Pastor, Luis; Perin, Rodrigo; Ranjan, Rajnish; Riachi, Imad; Rodríguez, José-Rodrigo; Riquelme, Juan Luis; Rössert, Christian; Sfyrakis, Konstantinos; Shi, Ying; Shillcock, Julian C; Silberberg, Gilad; Silva, Ricardo; Tauheed, Farhan; Telefont, Martin; Toledo-Rodriguez, Maria; Tränkler, Thomas; Van Geit, Werner; Díaz, Jafet Villafranca; Walker, Richard; Wang, Yun; Zaninetta, Stefano M; DeFelipe, Javier; Hill, Sean L; Segev, Idan; Schürmann, Felix

    2015-10-08

    We present a first-draft digital reconstruction of the microcircuitry of somatosensory cortex of juvenile rat. The reconstruction uses cellular and synaptic organizing principles to algorithmically reconstruct detailed anatomy and physiology from sparse experimental data. An objective anatomical method defines a neocortical volume of 0.29 ± 0.01 mm(3) containing ~31,000 neurons, and patch-clamp studies identify 55 layer-specific morphological and 207 morpho-electrical neuron subtypes. When digitally reconstructed neurons are positioned in the volume and synapse formation is restricted to biological bouton densities and numbers of synapses per connection, their overlapping arbors form ~8 million connections with ~37 million synapses. Simulations reproduce an array of in vitro and in vivo experiments without parameter tuning. Additionally, we find a spectrum of network states with a sharp transition from synchronous to asynchronous activity, modulated by physiological mechanisms. The spectrum of network states, dynamically reconfigured around this transition, supports diverse information processing strategies. VIDEO ABSTRACT. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. GPU-accelerated Kernel Regression Reconstruction for Freehand 3D Ultrasound Imaging.

    PubMed

    Wen, Tiexiang; Li, Ling; Zhu, Qingsong; Qin, Wenjian; Gu, Jia; Yang, Feng; Xie, Yaoqin

    2017-07-01

    The volume reconstruction method plays an important role in improving reconstructed volumetric image quality for freehand three-dimensional (3D) ultrasound imaging. By utilizing the capability of the programmable graphics processing unit (GPU), we can achieve a real-time incremental volume reconstruction at a speed of 25-50 frames per second (fps). After incremental reconstruction and visualization, hole-filling is performed on the GPU to fill remaining empty voxels. However, traditional pixel nearest neighbor-based hole-filling fails to reconstruct volumes with high image quality. On the contrary, kernel regression provides an accurate volume reconstruction method for 3D ultrasound imaging, but at the cost of heavy computational complexity. In this paper, a GPU-based fast kernel regression method is proposed for high-quality volume reconstruction after the incremental reconstruction of freehand ultrasound. The experimental results show that improved image quality for speckle reduction and detail preservation can be obtained with the parameter setting of kernel window size of [Formula: see text] and kernel bandwidth of 1.0. The computational performance of the proposed GPU-based method can be over 200 times faster than that on a central processing unit (CPU), and the volume with a size of 50 million voxels in our experiment can be reconstructed within 10 seconds.
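    The kernel regression idea is easy to state apart from the GPU details: each empty voxel is filled with a Gaussian-weighted average of the filled voxels in a small window. The 2D sketch below is illustrative only (the `window` and `bandwidth` arguments stand in for the paper's kernel window size and bandwidth of 1.0); it is not the authors' implementation:

    ```python
    import numpy as np

    def fill_holes_kernel_regression(volume, filled_mask, bandwidth=1.0, window=3):
        """Zeroth-order (Nadaraya-Watson) kernel-regression hole-filling: each
        empty pixel becomes a Gaussian-weighted average of filled neighbours
        inside a (2*window+1)^2 window. 2D here for brevity; 3D is analogous."""
        out = volume.copy()
        h, w = volume.shape
        for y, x in zip(*np.nonzero(~filled_mask)):   # coordinates of holes
            y0, y1 = max(0, y - window), min(h, y + window + 1)
            x0, x1 = max(0, x - window), min(w, x + window + 1)
            patch = volume[y0:y1, x0:x1]
            known = filled_mask[y0:y1, x0:x1]
            if not known.any():
                continue                              # no data in the window
            yy, xx = np.mgrid[y0:y1, x0:x1]
            wgt = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * bandwidth ** 2))
            wgt = wgt * known                         # ignore other empty pixels
            out[y, x] = (wgt * patch).sum() / wgt.sum()
        return out

    # Constant image with one hole: the regression restores the constant value.
    img = np.full((5, 5), 7.0)
    mask = np.ones((5, 5), bool)
    img[2, 2], mask[2, 2] = 0.0, False
    restored = fill_holes_kernel_regression(img, mask)
    ```

    On real data the bandwidth trades speckle suppression against detail preservation, which is the tuning reported in the abstract.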

  17. Low dose reconstruction algorithm for differential phase contrast imaging.

    PubMed

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Stampanoni, Marco

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method to reconstruct the distribution of the refraction index rather than the attenuation coefficient in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT which benefits from the new compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of the differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing reconstruction problem of DPCI reconstruction can be transformed into a previously solved problem in transmission CT imaging. Our algorithm has the potential to reconstruct the refraction index distribution of the sample from highly undersampled projection data. Thus it can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.
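    The central trick, composing an ordinary projection operator with a discrete derivative so that DPCI data fit a standard algebraic (CT-style) linear model, can be illustrated in one dimension. The matrices here are toy stand-ins, not the paper's system matrix:

    ```python
    import numpy as np

    # DPCI measures the derivative of the phase projection. Compose a projection
    # matrix A with a discrete differentiation matrix D so the measured data
    # satisfy d = D @ A @ x, a standard linear (CT-style) system in x.
    n = 5
    A = np.eye(n)                                            # trivial 'projection' for illustration
    D = np.diag(np.ones(n)) - np.diag(np.ones(n - 1), -1)    # first-difference operator
    x = np.array([0.0, 1.0, 3.0, 6.0, 10.0])                 # toy phase profile
    d = D @ (A @ x)                                          # simulated differential data

    # With this toy A, recovering x from d is just integration:
    # a cumulative sum inverts D (the first sample anchors the constant).
    x_rec = np.cumsum(d)
    ```

    In the real algorithm the same linear structure lets standard algebraic and compressed-sensing solvers be applied to the differential data directly.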

  18. Elemental properties of coal slag and measured airborne exposures at two coal slag processing facilities

    PubMed Central

    Mugford, Christopher; Boylstein, Randy; Gibbs, Jenna L

    2017-01-01

    In 1974, the National Institute for Occupational Safety and Health recommended a ban on the use of silica sand abrasives containing >1% silica due to the risk of silicosis. This gave rise to substitutes including coal slag. An Occupational Safety and Health Administration investigation in 2010 uncovered a case cluster of suspected pneumoconiosis in four former workers at a coal slag processing facility in Illinois, possibly attributable to occupational exposure to coal slag dust. This article presents the results from a National Institute for Occupational Safety and Health industrial hygiene survey at the same coal slag processing facility and a second facility. The industrial hygiene survey consisted of the collection of: a) bulk samples of unprocessed coal slag, finished granule product, and settled dust for metals and silica; b) full-shift area air samples for dust, metals, and crystalline silica; and c) full-shift personal air samples for dust, metals, and crystalline silica. Bulk samples consisted mainly of iron, manganese, titanium, and vanadium. Some samples had detectable levels of arsenic, beryllium, cadmium, and cobalt. Unprocessed coal slags from Illinois and Kentucky contained 0.43–0.48% (4,300–4,800 mg/kg) silica. Full-shift area air samples identified elevated total dust levels in the screen (2–38 mg/m3) and bag house (21 mg/m3) areas. Full-shift area air samples identified beryllium, chromium, cobalt, copper, iron, nickel, manganese, and vanadium. Overall, personal air samples for total and respirable dust (0.1–6.6 mg/m3 total; and 0.1–0.4 mg/m3 respirable) were lower than area air samples. All full-shift personal air samples for metals and silica were below published occupational exposure limits. All bulk samples of finished product granules contained less than 1% silica, supporting the claim coal slag may present less risk for silicosis than silica sand. We note that the results presented here are solely from two coal slag processing

  19. Elemental properties of coal slag and measured airborne exposures at two coal slag processing facilities.

    PubMed

    Mugford, Christopher; Boylstein, Randy; Gibbs, Jenna L

    2017-05-01

    In 1974, the National Institute for Occupational Safety and Health recommended a ban on the use of silica sand abrasives containing >1% silica due to the risk of silicosis. This gave rise to substitutes including coal slag. An Occupational Safety and Health Administration investigation in 2010 uncovered a case cluster of suspected pneumoconiosis in four former workers at a coal slag processing facility in Illinois, possibly attributable to occupational exposure to coal slag dust. This article presents the results from a National Institute for Occupational Safety and Health industrial hygiene survey at the same coal slag processing facility and a second facility. The industrial hygiene survey consisted of the collection of: (a) bulk samples of unprocessed coal slag, finished granule product, and settled dust for metals and silica; (b) full-shift area air samples for dust, metals, and crystalline silica; and (c) full-shift personal air samples for dust, metals, and crystalline silica. Bulk samples consisted mainly of iron, manganese, titanium, and vanadium. Some samples had detectable levels of arsenic, beryllium, cadmium, and cobalt. Unprocessed coal slags from Illinois and Kentucky contained 0.43-0.48% (4,300-4,800 mg/kg) silica. Full-shift area air samples identified elevated total dust levels in the screen (2-38 mg/m3) and bag house (21 mg/m3) areas. Full-shift area air samples identified beryllium, chromium, cobalt, copper, iron, nickel, manganese, and vanadium. Overall, personal air samples for total and respirable dust (0.1-6.6 mg/m3 total; and 0.1-0.4 mg/m3 respirable) were lower than area air samples. All full-shift personal air samples for metals and silica were below published occupational exposure limits. All bulk samples of finished product granules contained less than 1% silica, supporting the claim coal slag may present less risk for silicosis than silica sand. We note that the results presented here are solely from two coal slag processing
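    As a quick consistency check, the two units used for silica content in the abstract agree: a mass fraction in percent converts to mg/kg by a factor of 10,000. A purely illustrative snippet:

    ```python
    # 1% of a kilogram is 10 g = 10,000 mg, so percent -> mg/kg multiplies by 1e4.
    def percent_to_mg_per_kg(p):
        return p / 100.0 * 1_000_000  # 1 kg = 1e6 mg

    low = percent_to_mg_per_kg(0.43)   # matches the reported 4,300 mg/kg
    high = percent_to_mg_per_kg(0.48)  # matches the reported 4,800 mg/kg
    ```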

  20. Defense Waste Processing Facility (DWPF) liquidus model: revisions for processing higher TiO2 containing glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C. M.; Edwards, T. B.; Trivelpiece, C. L.

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository. This report documents the development of revised TiO2, Na2O, Li2O and Fe2O3 coefficients in the DWPF liquidus model and revised coefficients (a, b, c, and d).

  1. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.

    PubMed

    Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T

    2017-01-01

    Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
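    The quoted timings are mutually consistent, which is worth verifying: 12.5 h divided by the reported 158× speedup is just under 5 min per iteration, and 158× on 384 cores corresponds to roughly 41% parallel efficiency. A small arithmetic check:

    ```python
    # Numbers reported in the abstract: 158x speedup on 32 nodes (384 cores)
    # reduces a 12.5 h single-core reconstruction to under 5 min per iteration.
    single_core_min = 12.5 * 60            # 750 minutes on one core
    speedup = 158
    per_iteration_min = single_core_min / speedup   # ~4.75 min, i.e. "<5 min"
    parallel_efficiency = speedup / 384             # fraction of ideal scaling
    ```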

  2. Integration of prior CT into CBCT reconstruction for improved image quality via reconstruction of difference: first patient studies

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Gang, Grace J.; Lee, Junghoon; Wong, John; Stayman, J. Webster

    2017-03-01

    Purpose: There are many clinical situations where diagnostic CT is used for an initial diagnosis or treatment planning, followed by one or more CBCT scans that are part of an image-guided intervention. Because the high-quality diagnostic CT scan is a rich source of patient-specific anatomical knowledge, this provides an opportunity to incorporate the prior CT image into subsequent CBCT reconstruction for improved image quality. We propose a penalized-likelihood method called reconstruction of difference (RoD), to directly reconstruct differences between the CBCT scan and the CT prior. In this work, we demonstrate the efficacy of RoD with clinical patient datasets. Methods: We introduce a data processing workflow using the RoD framework to reconstruct anatomical changes between the prior CT and current CBCT. This workflow includes processing steps to account for non-anatomical differences between the two scans including 1) scatter correction for CBCT datasets due to increased scatter fractions in CBCT data; 2) histogram matching for attenuation variations between CT and CBCT; and 3) registration for different patient positioning. CBCT projection data and CT planning volumes for two radiotherapy patients - one abdominal study and one head-and-neck study - were investigated. Results: In comparisons between the proposed RoD framework and more traditional FDK and penalized-likelihood reconstructions, we find a significant improvement in image quality when prior CT information is incorporated into the reconstruction. RoD is able to provide additional low-contrast details while correctly incorporating actual physical changes in patient anatomy. Conclusions: The proposed framework provides an opportunity to either improve image quality or relax data fidelity constraints for CBCT imaging when prior CT studies of the same patient are available. Possible clinical targets include CBCT image-guided radiotherapy and CBCT image-guided surgeries.
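    Of the three preprocessing steps, histogram matching is the most self-contained to illustrate: CBCT intensities are mapped onto the prior CT's distribution by matching empirical CDFs. The sketch below is a generic quantile-mapping implementation, assumed rather than taken from the authors' code:

    ```python
    import numpy as np

    def histogram_match(cbct, prior_ct):
        """Map CBCT intensities onto the prior CT's distribution by matching
        empirical CDFs (step 2 of the preprocessing workflow; the details
        here are illustrative, not the authors' exact implementation)."""
        flat = cbct.ravel()
        src_sorted = np.sort(flat)
        ref_sorted = np.sort(prior_ct.ravel())
        # quantile of each CBCT voxel within its own intensity distribution
        q = np.searchsorted(src_sorted, flat, side='left') / (flat.size - 1)
        # look up the same quantile in the prior CT's distribution
        matched = np.interp(q, np.linspace(0, 1, ref_sorted.size), ref_sorted)
        return matched.reshape(cbct.shape)

    cbct = np.array([[0.0, 1.0], [2.0, 3.0]])    # offset/scaled intensities
    ct = np.array([[10.0, 20.0], [30.0, 40.0]])  # prior scan's intensity range
    out = histogram_match(cbct, ct)
    ```

    After this mapping, remaining differences between the two volumes are more plausibly anatomical, which is what the reconstruction-of-difference step assumes.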

  3. A survey of GPU-based acceleration techniques in MRI reconstructions

    PubMed Central

    Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou

    2018-01-01

    Image reconstruction in clinical magnetic resonance imaging (MRI) applications has become increasingly complicated. However, diagnosis and treatment require very fast computational procedures. Modern graphics processing unit (GPU) platforms have made high-performance parallel computation available, and attractive to common consumers, for solving massively parallel reconstruction problems at a commodity price. GPUs have also become more and more important for reconstruction computations, especially as deep learning starts to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and provide a summary reference for researchers in the MRI community. PMID:29675361

  4. A survey of GPU-based acceleration techniques in MRI reconstructions.

    PubMed

    Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou; Liang, Dong

    2018-03-01

    Image reconstruction in clinical magnetic resonance imaging (MRI) applications has become increasingly complicated. However, diagnosis and treatment require very fast computational procedures. Modern graphics processing unit (GPU) platforms have made high-performance parallel computation available, and attractive to common consumers, for solving massively parallel reconstruction problems at a commodity price. GPUs have also become more and more important for reconstruction computations, especially as deep learning starts to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and provide a summary reference for researchers in the MRI community.

  5. Automatic methods of the processing of data from track detectors on the basis of the PAVICOM facility

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-02-01

    New automatic methods essentially simplify and increase the rate of the processing of data from track detectors. This provides a possibility of processing large data arrays and considerably improves their statistical significance. This fact predetermines the development of new experiments which plan to use large-volume targets, large-area emulsion, and solid-state track detectors [1]. In this regard, the problem of training qualified physicists who are capable of operating modern automatic equipment is very important. Annually, about ten Moscow students master the new methods, working at the Lebedev Physical Institute at the PAVICOM facility [2-4]. Most students specializing in high-energy physics are only given an idea of archaic manual methods of the processing of data from track detectors. In 2005, on the basis of the PAVICOM facility and the physics training course of Moscow State University, a new training exercise was prepared. This work is devoted to the determination of the energy of neutrons passing through a nuclear emulsion. It provides the possibility of acquiring basic practical skills in the processing of data from track detectors using automatic equipment and can be included in the educational process of students of any physics faculty. Those who have mastered the methods of automatic data processing on the simple and pictorial example of track detectors will be able to apply their knowledge in various fields of science and technology. The formulation of training exercises for undergraduate and graduate students is a new additional aspect of the application of the PAVICOM facility described earlier in [4].

  6. Listeria monocytogenes in Food-Processing Facilities, Food Contamination, and Human Listeriosis: The Brazilian Scenario.

    PubMed

    Camargo, Anderson Carlos; Woodward, Joshua John; Call, Douglas Ruben; Nero, Luís Augusto

    2017-11-01

    Listeria monocytogenes is a foodborne pathogen that contaminates food-processing environments and persists within biofilms on equipment, utensils, floors, and drains, ultimately reaching final products by cross-contamination. This pathogen grows even under high salt conditions or refrigeration temperatures, remaining viable in various food products until the end of their shelf life. While the estimated incidence of listeriosis is lower than other enteric illnesses, infections caused by L. monocytogenes are more likely to lead to hospitalizations and fatalities. Despite the description of L. monocytogenes occurrence in Brazilian food-processing facilities and foods, there is a lack of consistent data regarding listeriosis cases and outbreaks directly associated with food consumption. Listeriosis requires rapid treatment with antibiotics and most drugs suitable for Gram-positive bacteria are effective against L. monocytogenes. Only a minority of clinical antibiotic-resistant L. monocytogenes strains have been described so far; whereas many strains recovered from food-processing facilities and foods exhibited resistance to antimicrobials not suitable against listeriosis. L. monocytogenes control in food industries is a challenge, demanding proper cleaning and application of sanitization procedures to eliminate this foodborne pathogen from the food-processing environment and ensure food safety. This review focuses on presenting the L. monocytogenes distribution in food-processing environment, food contamination, and control in the food industry, as well as the consequences of listeriosis to human health, providing a comparison of the current Brazilian situation with the international scenario.

  7. Restoration of singularities in reconstructed phase of crystal image in electron holography.

    PubMed

    Li, Wei; Tanji, Takayoshi

    2014-12-01

    Off-axis electron holography can be used to measure the inner potential of a specimen from its reconstructed phase image and is thus a powerful technique for materials scientists. However, abrupt reversals of contrast from white to black may sometimes occur in a digitally reconstructed phase image, which results in inaccurate information. Such phase distortion is mainly due to the digital reconstruction process and weak electron wave amplitude in some areas of the specimen. Therefore, digital image processing can be applied to the reconstruction and restoration of phase images. In this paper, fringe reconnection processing is applied to phase image restoration of a crystal structure image. The disconnection and wrong connection of interference fringes in the hologram that directly cause a 2π phase jump imperfection are correctly reconnected. Experimental results show that the phase distortion is significantly reduced after the processing. The quality of the reconstructed phase image was improved by the removal of imperfections in the final phase. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Implementation of GPU accelerated SPECT reconstruction with Monte Carlo-based scatter correction.

    PubMed

    Bexelius, Tobias; Sohlberg, Antti

    2018-06-01

    Statistical SPECT reconstruction can be very time-consuming especially when compensations for collimator and detector response, attenuation, and scatter are included in the reconstruction. This work proposes an accelerated SPECT reconstruction algorithm based on graphics processing unit (GPU) processing. An ordered subset expectation maximization (OSEM) algorithm with CT-based attenuation modelling, depth-dependent Gaussian convolution-based collimator-detector response modelling, and Monte Carlo-based scatter compensation was implemented using OpenCL. The OpenCL implementation was compared against the existing multi-threaded OSEM implementation running on a central processing unit (CPU) in terms of scatter-to-primary ratios, standardized uptake values (SUVs), and processing speed using mathematical phantoms and clinical multi-bed bone SPECT/CT studies. The difference in scatter-to-primary ratios, visual appearance, and SUVs between GPU and CPU implementations was minor. On the other hand, at its best, the GPU implementation was noticed to be 24 times faster than the multi-threaded CPU version on a normal 128 × 128 matrix size 3 bed bone SPECT/CT data set when compensations for collimator and detector response, attenuation, and scatter were included. GPU SPECT reconstructions show great promise as an everyday clinical reconstruction tool.
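    Stripped of the collimator, attenuation, and scatter models, the OSEM core is a multiplicative update cycled over projection subsets. The toy matrix system below stands in for the SPECT forward model; it is a sketch, not the paper's OpenCL implementation:

    ```python
    import numpy as np

    def osem(A, y, n_subsets=2, n_iter=200):
        """Ordered-subset EM for y ≈ A @ x. A is a toy stand-in for the SPECT
        system matrix (collimator/attenuation/scatter terms omitted)."""
        m, n = A.shape
        x = np.ones(n)                                    # positive initial estimate
        subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
        for _ in range(n_iter):
            for rows in subsets:
                As = A[rows]
                proj = As @ x                             # forward-project current estimate
                ratio = np.where(proj > 0, y[rows] / proj, 0.0)
                sens = np.maximum(As.T @ np.ones(len(rows)), 1e-12)
                x *= (As.T @ ratio) / sens                # multiplicative EM update
        return x

    # Small consistent system: OSEM drives the forward projection toward the data.
    rng = np.random.default_rng(0)
    A = rng.uniform(0.1, 1.0, (8, 4))
    x_true = np.array([1.0, 2.0, 3.0, 4.0])
    y = A @ x_true
    x_hat = osem(A, y)
    ```

    The multiplicative form preserves non-negativity of the activity estimate, one reason EM-style updates are standard in emission tomography.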

  9. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Reconstruction of vector physical fields by optical tomography

    NASA Astrophysics Data System (ADS)

    Kulchin, Yurii N.; Vitrik, O. B.; Kamenev, O. T.; Kirichenko, O. V.; Petrov, Yu S.

    1995-10-01

    Reconstruction of vector physical fields by optical tomography, with the aid of a system of fibre-optic measuring lines, is considered. The reported experimental results are used to reconstruct the distribution of the square of the gradient of transverse displacements of a flat membrane.

  10. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 microm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
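    The symmetric Pearson type VII profile used for the rocking-curve fit has a convenient property: its half-width parameter is exactly the half-width at half-maximum for any shape exponent m, with m = 1 giving a Lorentzian and m → ∞ approaching a Gaussian. A minimal sketch (parameter values are illustrative, not fitted rocking-curve values):

    ```python
    def pearson_vii(x, amp, x0, w, m):
        """Symmetric Pearson type VII peak profile. The factor (2^(1/m) - 1)
        normalises the width so that the profile equals amp/2 at x0 +/- w."""
        return amp / (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** m

    # Peak value at the centre, and exactly half-maximum one half-width away.
    peak = pearson_vii(0.0, 1.0, 0.0, 0.5, 2.0)
    half = pearson_vii(0.5, 1.0, 0.0, 0.5, 2.0)
    ```

    This direct interpretability of w is part of what makes the Pearson VII a convenient analytical model for measured rocking curves.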

  11. Supersonic Flight Dynamics Test: Trajectory, Atmosphere, and Aerodynamics Reconstruction

    NASA Technical Reports Server (NTRS)

    Kutty, Prasad; Karlgaard, Christopher D.; Blood, Eric M.; O'Farrell, Clara; Ginn, Jason M.; Shoenenberger, Mark; Dutta, Soumyo

    2015-01-01

    The Supersonic Flight Dynamics Test is a full-scale flight test of a Supersonic Inflatable Aerodynamic Decelerator, which is part of the Low Density Supersonic Decelerator technology development project. The purpose of the project is to develop and mature aerodynamic decelerator technologies for landing large mass payloads on the surface of Mars. The technologies include a Supersonic Inflatable Aerodynamic Decelerator and Supersonic Parachutes. The first Supersonic Flight Dynamics Test occurred on June 28th, 2014 at the Pacific Missile Range Facility. This test was used to validate the test architecture for future missions. The flight was a success and, in addition, was able to acquire data on the aerodynamic performance of the supersonic inflatable decelerator. This paper describes the instrumentation, analysis techniques, and acquired flight test data utilized to reconstruct the vehicle trajectory, atmosphere, and aerodynamics. The results of the reconstruction show significantly higher lofting of the trajectory, which can partially be explained by off-nominal booster motor performance. The reconstructed vehicle force and moment coefficients fall well within pre-flight predictions. A parameter identification analysis indicates that the vehicle displayed greater aerodynamic static stability than seen in pre-flight computational predictions and ballistic range tests.

  12. A modified discrete algebraic reconstruction technique for multiple grey image reconstruction for limited angle range tomography.

    PubMed

    Liang, Zhiting; Guan, Yong; Liu, Gang; Chen, Xiangyu; Li, Fahu; Guo, Pengfei; Tian, Yangchao

    2016-03-01

    The 'missing wedge', which is due to a restricted rotation range, is a major challenge for quantitative analysis of an object using tomography. With prior knowledge of the grey levels, the discrete algebraic reconstruction technique (DART) is able to reconstruct objects accurately with projections in a limited angle range. However, the quality of the reconstructions declines as the number of grey levels increases. In this paper, a modified DART (MDART) was proposed, in which each independent region of homogeneous material was chosen as a research object, instead of the grey values. The grey values of each discrete region were estimated according to the solution of the linear projection equations. The iterative process of boundary pixels updating and correcting the grey values of each region was executed alternately. Simulation experiments of binary phantoms as well as multiple grey phantoms show that MDART is capable of achieving high-quality reconstructions with projections in a limited angle range. The interesting advancement of MDART is that neither prior knowledge of the grey values nor the number of grey levels is necessary.
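    MDART's key step, estimating one grey value per region from the linear projection equations, reduces to a small least-squares problem once region memberships are fixed. The 2×2-pixel toy system below is an assumed illustration, not the authors' code:

    ```python
    import numpy as np

    # With region labels fixed, the per-region grey values g solve
    # (A @ M) @ g ≈ p, where A is the projection matrix and M[i, r] = 1
    # if pixel i belongs to region r.
    A = np.array([[1, 1, 0, 0],     # two horizontal ray sums of a 2x2 image
                  [0, 0, 1, 1],
                  [1, 0, 1, 0],     # two vertical ray sums
                  [0, 1, 0, 1]], float)
    M = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], float)  # pixel -> region
    p = A @ np.array([2.0, 2.0, 5.0, 5.0])                 # projections of true image
    g, *_ = np.linalg.lstsq(A @ M, p, rcond=None)          # per-region grey values
    ```

    Alternating this grey-value solve with boundary-pixel updates is what removes the need for prior knowledge of the grey levels.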

  13. Non-iterative volumetric particle reconstruction near moving bodies

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra

    2017-11-01

    When multi-camera 3D PIV experiments are performed around a moving body, the body often obscures visibility of regions of interest in the flow field in a subset of cameras. We evaluate the performance of non-iterative particle reconstruction algorithms used for synthetic aperture PIV (SAPIV) in these partially-occluded regions. We show that when partial occlusions are present, the quality and availability of 3D tracer particle information depends on the number of cameras and reconstruction procedure used. Based on these findings, we introduce an improved non-iterative reconstruction routine for SAPIV around bodies. The reconstruction procedure combines binary masks, already required for reconstruction of the body's 3D visual hull, and a minimum line-of-sight algorithm. This approach accounts for partial occlusions without performing separate processing for each possible subset of cameras. We combine this reconstruction procedure with three-dimensional imaging on both sides of the free surface to reveal multi-fin wake interactions generated by a jumping archer fish. Sufficient particle reconstruction in near-body regions is crucial to resolving the wake structures of upstream fins (i.e., dorsal and anal fins) before and during interactions with the caudal tail.

  14. Online Event Reconstruction in the CBM Experiment at FAIR

    NASA Astrophysics Data System (ADS)

    Akishina, Valentina; Kisel, Ivan

    2018-02-01

    Targeting rare observables, the CBM experiment will operate at high interaction rates of up to 10 MHz, which is unprecedented in heavy-ion experiments so far. It requires a novel free-streaming readout system and a new concept of data processing. The huge data rates of the CBM experiment will be reduced online to the recordable rate before saving the data to mass storage. Full collision reconstruction and selection will be performed online in a dedicated processor farm. In order to make an efficient event selection online, a clean sample of particles has to be provided by the reconstruction package called First Level Event Selection (FLES). The FLES reconstruction and selection package consists of several modules: track finding, track fitting, event building, short-lived particle finding, and event selection. Since detector measurements also contain time information, the event building is done at all stages of the reconstruction process. The input data are distributed within the FLES farm in the form of time-slices. A time-slice is reconstructed in parallel between processor cores. After all tracks of the whole time-slice are found and fitted, they are collected into clusters of tracks originating from common primary vertices, which are then fitted, thus identifying the interaction points. Secondary tracks are associated with primary vertices according to their estimated production time. After that, short-lived particles are found and the full event building process is finished. The last stage of the FLES package is a selection of events according to the requested trigger signatures. The event reconstruction procedure and the results of its application to simulated collisions in the CBM detector setup are presented and discussed in detail.

  15. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'.
Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk
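
    One common way an automated rotation-center determination can work for parallel-beam data is to exploit the mirror symmetry of projections taken 180 degrees apart. The sketch below is illustrative; 'tomo_display' may use a different scheme.

```python
import numpy as np

def find_rotation_center(proj0, proj180):
    """Estimate the rotation-axis column from two parallel-beam
    projections taken 180 degrees apart. In parallel geometry, proj180
    is proj0 mirrored about the axis, so cross-correlating proj0
    against the flipped proj180 yields twice the axis offset from the
    detector center."""
    a = proj0 - proj0.mean()
    b = proj180[::-1] - proj180.mean()
    shift = np.correlate(a, b, mode="full").argmax() - (len(a) - 1)
    return (len(a) - 1) / 2.0 + shift / 2.0
```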

  16. Microsurgical scalp reconstruction after a mountain lion attack.

    PubMed

    Hazani, Ron; Buntic, Rudolf F; Brooks, Darrell

    2008-09-01

    Mountain lion attacks on humans are rare and potentially fatal. Although a few victims escape with minor injuries, permanent disfigurement and disability are common among survivors of these assaults. Since 1986, a steady number of mountain lion attacks has been noted in California. We report a recent attack by a cougar on a couple hiking in California's Prairie Creek Redwoods State Park. The victim sustained a significant scalp injury that led to a life-threatening soft-tissue infection. We present an analysis of the injury pattern as it relates to the bite marks, the resulting degloving injury, and the surgical reconstruction. We also offer a current survey of the pathogens often found in cats' and mountain lions' bite wounds and the appropriate antibiotic treatment. Given the infrequency at which clinicians encounter mountain lion injuries, we recommend that, after initial management and exclusion of life-threatening injuries, patients be transferred to a tertiary care facility capable of managing the various reconstructive challenges such as the one presented in this case.

  17. BLENDING ANALYSIS FOR RADIOACTIVE SALT WASTE PROCESSING FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S.

    2012-05-10

    Savannah River National Laboratory (SRNL) evaluated methods to mix and blend the contents of the blend tanks to ensure the contents are properly blended before they are transferred from a blend tank, such as Tank 21 or Tank 24, to the Salt Waste Processing Facility (SWPF) feed tank. The tank contents consist of three forms: dissolved salt solution, other waste salt solutions, and sludge containing settled solids. This paper focuses on developing the computational model and estimating the operation time of the submersible slurry pump required for the tank contents to be adequately blended prior to their transfer to the SWPF facility. A three-dimensional computational fluid dynamics approach was taken by using the full-scale configuration of an SRS Type-IV tank, Tank 21H. Major solid obstructions such as the tank wall boundary, the transfer pump column, and three slurry pump housings, including one active and two inactive pumps, were included in the mixing performance model. Basic flow pattern results predicted by the computational model were benchmarked against SRNL test results and literature data. Tank 21 is a waste tank that is used to prepare batches of salt feed for SWPF. The salt feed must be a homogeneous solution satisfying the acceptance criterion for solids entrainment during transfer operations. The work scope described here consists of two modeling areas: the steady-state flow pattern calculations before the addition of acid solution for the tank blending operation, and the transient mixing analysis during the miscible liquid blending operation. The transient blending calculations were performed by using the 95% homogeneity criterion for the entire liquid domain of the tank. The initial conditions for the entire modeling domain were based on the steady-state flow pattern results with zero second-phase concentration. The performance model was also benchmarked against SRNL test results and literature data.
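
    A simplified post-processing version of the 95% homogeneity criterion applied to the transient blending runs can be sketched as follows; in practice the per-cell concentrations would come from the CFD model, and the exact formulation used by SRNL may differ.

```python
import numpy as np

def blend_time(conc_history, times, tol=0.05):
    """Find when the tank contents meet a 95% homogeneity criterion:
    the first time every cell concentration lies within +/- tol (5%)
    of the instantaneous tank-average concentration.

    conc_history : (n_steps, n_cells) tracer concentration snapshots
    times        : (n_steps,) simulation times
    Returns the first qualifying time, or None if never reached.
    """
    c = np.asarray(conc_history, float)
    mean = c.mean(axis=1, keepdims=True)
    ok = (np.abs(c - mean) <= tol * mean).all(axis=1)
    return times[int(np.argmax(ok))] if ok.any() else None
```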

  18. Alloplastic adjuncts in breast reconstruction

    PubMed Central

    Cabalag, Miguel S.; Rostek, Marie; Miller, George S.; Chae, Michael P.; Quinn, Tam; Rozen, Warren M.

    2016-01-01

    Background There has been an increasing role of acellular dermal matrices (ADMs) and synthetic meshes in both single- and two-stage implant/expander breast reconstruction. Numerous alloplastic adjuncts exist, and these vary in material type, processing, storage, surgical preparation, level of sterility, available sizes and cost. However, there is little published data on most of them, posing a significant challenge to the reconstructive surgeon trying to compare and select the most suitable product. The aims of this systematic review were to identify, summarize and evaluate the outcomes of studies describing the use of alloplastic adjuncts for post-mastectomy breast reconstruction. The secondary aims were to determine their cost-effectiveness and analyze outcomes in patients who also underwent radiotherapy. Methods Using the PRISMA 2009 statement, a systematic review was conducted to find articles reporting on the outcomes of the use of alloplastic adjuncts in post-mastectomy breast reconstruction. Multiple databases were searched independently by three authors (Cabalag MS, Miller GS and Chae MP), including: Ovid MEDLINE (1950 to present), Embase (1980 to 2015), PubMed and Cochrane Database of Systematic Reviews. Results Current published literature on available alloplastic adjuncts is predominantly centered on ADMs, both allogeneic and xenogeneic, with few outcome studies available for synthetic meshes. Outcomes of the 89 articles which met the inclusion criteria were summarized and analyzed. The reported outcomes on alloplastic adjunct-assisted breast reconstruction were varied, with most data available on the use of ADMs, particularly AlloDerm® (LifeCell, Branchburg, New Jersey, USA). The use of ADMs in single-stage direct-to-implant breast reconstruction resulted in lower complication rates (infection, seroma, implant loss and late revision), and was more cost effective when compared to non-ADM, two-stage reconstruction. The majority of studies demonstrated

  19. Tracking implementation and (un)intended consequences: a process evaluation of an innovative peripheral health facility financing mechanism in Kenya

    PubMed Central

    Waweru, Evelyn; Goodman, Catherine; Kedenge, Sarah; Tsofa, Benjamin; Molyneux, Sassy

    2016-01-01

    In many African countries, user fees have failed to achieve intended access and quality of care improvements. Subsequent user fee reduction or elimination policies have often been poorly planned, without alternative sources of income for facilities. We describe early implementation of an innovative national health financing intervention in Kenya; the health sector services fund (HSSF). In HSSF, central funds are credited directly into a facility’s bank account quarterly, and facility funds are managed by health facility management committees (HFMCs) including community representatives. HSSF is therefore a finance mechanism with potential to increase access to funds for peripheral facilities, support user fee reduction and improve equity in access. We conducted a process evaluation of HSSF implementation based on a theory of change underpinning the intervention. Methods included interviews at national, district and facility levels, facility record reviews, a structured exit survey and a document review. We found impressive achievements: HSSF funds were reaching facilities; funds were being overseen and used in a way that strengthened transparency and community involvement; and health workers’ motivation and patient satisfaction improved. Challenges or unintended outcomes included: complex and centralized accounting requirements undermining efficiency; interactions between HSSF and user fees leading to difficulties in accessing crucial user fee funds; and some relationship problems between key players. Although user fees charged had not increased, national reduction policies were still not being adhered to. Finance mechanisms can have a strong positive impact on peripheral facilities, and HFMCs can play a valuable role in managing facilities. Although fiduciary oversight is essential, mechanisms should allow for local decision-making and ensure that unmanageable paperwork is avoided. There are also limits to what can be achieved with relatively small funds in

  20. GIS analysis of the siting criteria for the Mixed and Low-Level Waste Treatment Facility and the Idaho Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoskinson, R.L.

    1994-01-01

    This report summarizes a study conducted using the Arc/Info{reg_sign} geographic information system (GIS) to analyze the criteria used for site selection for the Mixed and Low-Level Waste Treatment Facility (MLLWTF) and the Idaho Waste Processing Facility (IWPF). The purpose of the analyses was to determine, based on predefined criteria, the areas on the INEL that best satisfied the criteria. The coverages used in this study were produced by importing the AutoCAD files that produced the maps for a pre-site-selection draft report into the GIS. The files were then converted to Arc/Info{reg_sign} GIS format. The initial analysis was made by considering all of the criteria as having equal importance in determining the areas of the INEL that would best satisfy the requirements. Another analysis emphasized four of the criteria as 'must' criteria which had to be satisfied. Additional analyses considered other criteria that were considered for, but not included in, the predefined criteria. This GIS analysis of the siting criteria for the IWPF and MLLWTF provides a logical, repeatable, and defensible approach to the determination of candidate locations for the facilities. The results of the analyses support the selection of the candidate locations.
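
    The two analysis modes, equal-importance overlay and hard 'must' constraints, can be sketched as a simple raster overlay. This is an illustrative sketch only, not the Arc/Info workflow; criterion names below are made up.

```python
import numpy as np

def site_suitability(criteria, must=()):
    """Combine binary criterion rasters into a suitability score.

    criteria : dict name -> 2-D array of 0/1 (cell satisfies criterion)
    must     : names of hard constraints; a cell failing any 'must'
               criterion scores 0 regardless of the other criteria
    The equal-importance analysis is the mean of all criteria; the
    'must' analysis masks that mean by the hard constraints.
    """
    stack = np.stack([np.asarray(c, float) for c in criteria.values()])
    score = stack.mean(axis=0)
    for name in must:
        score = score * np.asarray(criteria[name], float)
    return score
```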

  1. CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.F. Beesley

    The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative designmore » process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.« less

  2. Virtual head rotation reveals a process of route reconstruction from human vestibular signals

    PubMed Central

    Day, Brian L; Fitzpatrick, Richard C

    2005-01-01

    The vestibular organs can feed perceptual processes that build a picture of our route as we move about in the world. However, raw vestibular signals do not define the path taken because, during travel, the head can undergo accelerations unrelated to the route and also be orientated in any direction to vary the signal. This study investigated the computational process by which the brain transforms raw vestibular signals for the purpose of route reconstruction. We electrically stimulated the vestibular nerves of human subjects to evoke a virtual head rotation fixed in skull co-ordinates and measure its perceptual effect. The virtual head rotation caused subjects to perceive an illusory whole-body rotation that was a cyclic function of head-pitch angle. They perceived whole-body yaw rotation in one direction with the head pitched forwards, the opposite direction with the head pitched backwards, and no rotation with the head in an intermediate position. A model based on vector operations and the anatomy and firing properties of semicircular canals precisely predicted these perceptions. In effect, a neural process computes the vector dot product between the craniocentric vestibular vector of head rotation and the gravitational unit vector. This computation yields the signal of body rotation in the horizontal plane that feeds our perception of the route travelled. PMID:16002439
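
    The dot-product computation can be made concrete with a small numerical sketch. The axis choice and signs below are illustrative assumptions, not the paper's exact parameterization: a virtual rotation fixed along one skull axis is projected onto the gravitational unit vector expressed in head coordinates.

```python
import math

def perceived_yaw(omega_mag, pitch_deg):
    """Dot-product model sketch: a virtual head rotation of magnitude
    omega_mag, fixed along an assumed naso-occipital skull axis, is
    projected onto the gravitational unit vector. The result is the
    horizontal-plane body-rotation signal: zero with the head upright,
    opposite signs for forward vs backward pitch."""
    p = math.radians(pitch_deg)
    omega = (omega_mag, 0.0, 0.0)              # craniocentric rotation vector
    g_hat = (-math.sin(p), 0.0, math.cos(p))   # world-up in head coordinates
    return sum(a * b for a, b in zip(omega, g_hat))
```

    The sign flip between forward and backward pitch reproduces the cyclic dependence on head-pitch angle described in the abstract.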

  3. IRVE-3 Post-Flight Reconstruction

    NASA Technical Reports Server (NTRS)

    Olds, Aaron D.; Beck, Roger; Bose, David; White, Joseph; Edquist, Karl; Hollis, Brian; Lindell, Michael; Cheatwood, F. N.; Gsell, Valerie; Bowden, Ernest

    2013-01-01

    The Inflatable Re-entry Vehicle Experiment 3 (IRVE-3) was conducted from the NASA Wallops Flight Facility on July 23, 2012. Launched on a Black Brant XI sounding rocket, the IRVE-3 research vehicle achieved an apogee of 469 km, deployed and inflated a Hypersonic Inflatable Aerodynamic Decelerator (HIAD), re-entered the Earth's atmosphere at Mach 10 and achieved a peak deceleration of 20 g's before descending to splashdown roughly 20 minutes after launch. This paper presents the filtering methodology and results associated with the development of the Best Estimated Trajectory of the IRVE-3 flight test. The reconstructed trajectory is compared against project requirements and pre-flight predictions of entry state, aerodynamics, HIAD flexibility, and attitude control system performance.

  4. Health system reconstruction: Perspectives of Iraqi physicians

    PubMed Central

    Squires, A.; Sindi, A.; Fennie, K.

    2010-01-01

    In conflict or post-conflict situations, health system reconstruction becomes a critical component of ensuring stability. The purpose of this study was to determine the priorities for health system reconstruction among Iraqi physicians residing in the northern region of the country. A convenience sample of practicing male and female physicians residing in the Kurdish region completed a 13-item survey about health system reconstruction. A total of 1001 practitioners completed the survey with gender breakdown of 29% female and 71% male, all working in different specialty areas. Significant differences between the providers based on gender (p = 0.001), specialty (p = 0.001) and geographic location (p = 0.004) were found to affect the responses of the participants. This study demonstrates that input from healthcare professionals is important for health system reconstruction, but that gender, geography and medical specialty make the process complex. PMID:20155543

  5. The ATOVS and AVHRR product processing facility for EPS

    NASA Astrophysics Data System (ADS)

    Klaes, D.; Ackermann, J.; Schraidt, R.; Patterson, T.; Schlüssel, P.; Phillips, P.; Arriaga, A.; Grandell, J.

    The ATOVS/AVHRR Product Processing Facility (PPF) of the EPS (EUMETSAT Polar System) Core Ground Segment comprises the Level 1 processing of the data from the ATOVS sounding instruments AMSU-A, MHS and HIRS/4, and the imager AVHRR/3 into calibrated and navigated radiances. A second component includes the level 2 processing, which uses as input the level 1 products of the aforementioned instruments. The specification of the PPF is based on two well-known and well-established software packages, which have been used by the international community for some years: the AAPP (ATOVS and AVHRR Pre-processing Package) and ICI (Inversion Coupled with Imager). The PPF is able to process data from instruments flown on the Metop and NOAA satellites. For the level 1 processing of the sounding instruments' data (HIRS, AMSU-A and MHS), the basic functionality of AAPP has been kept; however, the individual chains for each instrument have been separated and additional functionality has been integrated. For HIRS, a global calibration, as performed by NOAA/NESDIS today, has been included. For AMSU-A and MHS, the moon contamination of the calibration space view can be corrected for. Additional functionality has also been included in the AVHRR processing. In particular, an enhanced navigation by landmark processing has been implemented to ensure accurate geo-location. Additionally, the PPF can digest and process the global AVHRR data either at full pixel resolution (1 km at nadir), which is the nominal mode for the Metop processing, or at the reduced resolution of the NOAA/GAC (Global Area Coverage) data (about 4 km resolution at nadir). For the level 2 processing, the ICI had to be modified to include the most recent improvement in fast radiative transfer modelling as included in RTTOV-7. As a first step towards the realisation of the PPF, a prototype has been generated to help specify the details of the PPF, and for verification of the latter by generation of

  6. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  7. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  8. 78 FR 69539 - Removal of Attestation Process for Facilities Using H-1A Registered Nurses

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... of Attestation Process for Facilities Using H-1A Registered Nurses AGENCY: Employment and Training... registered nurses under the H-1A visa program. These subparts became obsolete after the authorizing statute... nonimmigrant classification exclusively for the temporary admission and employment of registered nurses, which...

  9. [Reconstructive surgery of penile deformities and tissue deficiencies].

    PubMed

    Kelemen, Zsolt

    2009-05-31

    Penile deformities and tissue deficiencies can disturb sexual intercourse or make it impossible. The aim of the study is to summarize the different diseases according to their clinical appearance and pathological processes and to review operative methods and personal experiences. Surgical treatment of hypo- and epispadias is usually performed in childhood, but curvatures after unsuccessful operations may eventually demand reconstruction of the urethra, skin and corpora cavernosa. Peyronie's disease and curvature after penile fracture require reconstruction of the tunica albuginea. Plaque surgery used to be performed with dermal, tunica vaginalis or venous grafts, but the best results are obtained by a shortening procedure on the contralateral side according to the Heineke-Mikulicz principle. Tissue deficiencies and curvatures were observed after necrotic inflammatory processes, such as Fournier's gangrene or chronic dermatitis. Skin defects were treated with flaps and grafts. Abscesses of the penis, severe tissue defects and curvatures were also observed after intracavernous injections in cases of erectile dysfunction; the options for reconstruction in these cases are very limited. Oil granuloma of the penis presents a new task for penile reconstruction. The best results of skin replacement were achieved by temporary embedding of the penis in the scrotum.

  10. Migration of Beryllium via Multiple Exposure Pathways among Work Processes in Four Different Facilities.

    PubMed

    Armstrong, Jenna L; Day, Gregory A; Park, Ji Young; Stefaniak, Aleksandr B; Stanton, Marcia L; Deubner, David C; Kent, Michael S; Schuler, Christine R; Virji, M Abbas

    2014-01-01

    Inhalation of beryllium is associated with the development of sensitization; however, dermal exposure may also be important. The primary aim of this study was to elucidate relationships among exposure pathways in four different manufacturing and finishing facilities. Secondary aims were to identify jobs with increased levels of beryllium in air, on skin, and on surfaces; to identify potential discrepancies in exposure pathways; and to determine if these are related to jobs with previously identified risk. Beryllium was measured in air, on cotton gloves, and on work surfaces. Summary statistics were calculated and correlations among all three measurement types were examined at the facility and job level. Exposure ranking strategies were used to identify jobs with higher exposures. The highest air, glove, and surface measurements were observed in beryllium metal production and beryllium oxide ceramics manufacturing jobs that involved hot processes and handling powders. Two finishing and distribution facilities that handle solid alloy products had lower exposures than the primary production facilities, and there were differences observed among jobs. For all facilities combined, strong correlations were found between air-surface (rp ≥ 0.77), glove-surface (rp ≥ 0.76), and air-glove measurements (rp ≥ 0.69). In jobs where higher risk of beryllium sensitization or disease has been reported, exposure levels for all three measurement types were higher than in jobs with lower risk, though they were not the highest. Some jobs with low air concentrations had higher levels of beryllium on glove and surface wipe samples, suggesting a need to further evaluate the causes of the discrepant levels. Although such correlations provide insight on where beryllium is located throughout the workplace, they cannot identify the direction of the pathways between air, surface, or skin. Ranking strategies helped to identify jobs with the highest combined air, glove, and/or surface exposures
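
    The correlation and ranking analyses can be sketched as follows. This is a simple stand-in for the paper's exposure-ranking strategies; the job names and measurement values used in the example are made up for illustration.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation between two sets of paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

def rank_jobs(jobs, air, glove, surface):
    """Rank jobs by combined exposure: average the per-metric ranks
    (highest measurement -> rank 1) across air, glove, and surface
    samples, then sort jobs by the combined rank."""
    ranks = []
    for metric in (air, glove, surface):
        order = np.argsort(np.argsort(-np.asarray(metric, float)))
        ranks.append(order + 1)
    combined = np.mean(ranks, axis=0)
    return [jobs[i] for i in np.argsort(combined)]
```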

  11. Kinect Posture Reconstruction Based on a Local Mixture of Gaussian Process Models.

    PubMed

    Liu, Zhiguang; Zhou, Liuyang; Leung, Howard; Shum, Hubert P H

    2016-11-01

    Depth sensor based 3D human motion estimation hardware such as Kinect has made interactive applications more popular recently. However, it is still challenging to accurately recognize postures from a single depth camera due to the inherently noisy data derived from depth images and self-occluding actions performed by the user. In this paper, we propose a new real-time probabilistic framework to enhance the accuracy of live captured postures that belong to one of the action classes in the database. We adopt the Gaussian Process model as a prior to leverage the position data obtained from Kinect and marker-based motion capture system. We also incorporate a temporal consistency term into the optimization framework to constrain the velocity variations between successive frames. To ensure that the reconstructed posture resembles the accurate parts of the observed posture, we embed a set of joint reliability measurements into the optimization framework. A major drawback of Gaussian Processes is their cubic learning complexity when dealing with a large database, due to the inversion of the covariance matrix. To solve the problem, we propose a new method based on a local mixture of Gaussian Processes, in which Gaussian Processes are defined in local regions of the state space. Due to the significantly decreased sample size in each local Gaussian Process, the learning time is greatly reduced. At the same time, the prediction speed is enhanced as the weighted mean prediction for a given sample is determined by the nearby local models only. Our system also allows incrementally updating a specific local Gaussian Process in real time, which enhances the likelihood of adapting to run-time postures that are different from those in the database. Experimental results demonstrate that our system can generate high quality postures even under severe self-occlusion situations, which is beneficial for real-time applications such as motion-based gaming and sport training.
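
    The local-mixture idea can be sketched minimally: split the training data into local regions, fit one small GP per region (so each covariance inverse stays cheap), and predict with a distance-weighted mean of nearby local models. The region assignment below is a crude split along the first input axis, not the paper's state-space partition.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel matrix between row-vector sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

class LocalGPMixture:
    """Minimal local mixture of Gaussian Processes for 1-D targets."""

    def __init__(self, X, y, n_regions=2, ls=1.0, noise=1e-6):
        self.ls = ls
        order = np.argsort(X[:, 0])       # crude split along first axis
        self.models = []
        for idx in np.array_split(order, n_regions):
            Xi, yi = X[idx], y[idx]
            K = rbf(Xi, Xi, ls) + noise * np.eye(len(idx))
            # store training inputs, K^{-1} y, and the region center
            self.models.append((Xi, np.linalg.solve(K, yi), Xi.mean(axis=0)))

    def predict(self, x):
        x = np.atleast_2d(x)
        preds, weights = [], []
        for Xi, alpha, center in self.models:
            preds.append(rbf(x, Xi, self.ls) @ alpha)      # local GP mean
            d2 = ((x - center) ** 2).sum(-1)
            weights.append(np.exp(-0.5 * d2 / self.ls ** 2) + 1e-12)
        preds, weights = np.array(preds), np.array(weights)
        return (weights * preds).sum(0) / weights.sum(0)   # weighted mean
```

    Because the weights decay with distance to each region's center, far-away local models contribute negligibly, which is what keeps prediction fast.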

  12. Wavelet processing and digital interferometric contrast to improve reconstructions from X-ray Gabor holograms.

    PubMed

    Aguilar, Juan C; Misawa, Masaki; Matsuda, Kiyofumi; Suzuki, Yoshio; Takeuchi, Akihisa; Yasumoto, Masato

    2018-05-01

    This work shows how an undecimated wavelet transformation, applied together with digital interferometric contrast, improves the resulting reconstructions in a digital hard X-ray Gabor holographic microscope. Specifically, the starlet transform is used together with digital Zernike contrast. With this contrast, the results show that only a small set of scales from the hologram is, in effect, useful, and it is possible to enhance the details of the reconstruction.
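
    The starlet (undecimated, à trous) transform that underlies the scale selection can be sketched in 1-D; the paper works with 2-D holograms and adds the Zernike contrast step, which is not shown here.

```python
import numpy as np

def starlet(signal, n_scales=3):
    """1-D starlet (undecimated, a trous) decomposition with the
    B3-spline kernel. Returns [w1, ..., wJ, cJ]: detail (wavelet)
    planes plus the final smooth approximation. Their sum restores the
    input exactly, so a reconstruction can keep only the few scales
    that carry useful detail."""
    h0 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    c = np.asarray(signal, float)
    planes = []
    for j in range(n_scales):
        # dilate the kernel: insert 2**j - 1 zeros between taps
        h = np.zeros(4 * 2 ** j + 1)
        h[:: 2 ** j] = h0
        pad = len(h) // 2
        # circular smoothing at scale j via wrap-padding then trimming
        cs = np.convolve(np.pad(c, pad, mode="wrap"), h, "same")[pad:-pad]
        planes.append(c - cs)   # wavelet plane at scale j
        c = cs
    planes.append(c)
    return planes
```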

  13. Respiratory motion correction in emission tomography image reconstruction.

    PubMed

    Reyes, Mauricio; Malandain, Grégoire; Koulibaly, Pierre Malick; González Ballester, Miguel A; Darcourt, Jacques

    2005-01-01

    In Emission Tomography imaging, respiratory motion causes artifacts in reconstructed lung and cardiac images, which lead to misinterpretations and imprecise diagnosis. Solutions like respiratory gating, correlated dynamic PET techniques, list-mode data based techniques and others have been tested with improvements over the spatial activity distribution in lung lesions, but with the disadvantages of requiring additional instrumentation or discarding part of the projection data used for reconstruction. The objective of this study is to incorporate respiratory motion correction directly into the image reconstruction process, without any additional acquisition protocol consideration. To this end, we propose an extension to the Maximum Likelihood Expectation Maximization (MLEM) algorithm that includes a respiratory motion model, which takes into account the displacements and volume deformations produced by the respiratory motion during the data acquisition process. We present results from synthetic simulations incorporating real respiratory motion as well as from phantom and patient data.
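
    The idea of folding a motion model into MLEM can be sketched with small matrices. This is a toy illustration under assumed shapes (cyclic shifts as "warps"), not the paper's model, which also handles volume deformations: each phase's forward model is A @ W_m @ x, and the update back-projects through W_m^T so all counts are used without gating.

```python
import numpy as np

def mlem_motion(A, warps, sinograms, n_iter=500):
    """MLEM with a per-phase motion model in the system matrix.

    A         : (n_det, n_vox) system matrix
    warps     : list of (n_vox, n_vox) matrices W_m warping the
                reference image into respiratory phase m
    sinograms : list of phase-m measured data y_m
    Update: x <- x / s * sum_m W_m^T A^T (y_m / (A W_m x)),
    with sensitivity s = sum_m W_m^T A^T 1.
    """
    x = np.ones(A.shape[1])
    sens = sum(W.T @ A.T @ np.ones(A.shape[0]) for W in warps)
    for _ in range(n_iter):
        back = np.zeros_like(x)
        for W, y in zip(warps, sinograms):
            est = A @ W @ x
            back += W.T @ A.T @ (y / np.maximum(est, 1e-12))
        x = x / sens * back
    return x
```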

  14. Deep learning methods to guide CT image reconstruction and reduce metal artifacts

    NASA Astrophysics Data System (ADS)

    Gjesteby, Lars; Yang, Qingsong; Xi, Yan; Zhou, Ye; Zhang, Junping; Wang, Ge

    2017-03-01

    The rapidly-rising field of machine learning, including deep learning, has inspired applications across many disciplines. In medical imaging, deep learning has been primarily used for image processing and analysis. In this paper, we integrate a convolutional neural network (CNN) into the computed tomography (CT) image reconstruction process. Our first task is to monitor the quality of CT images during iterative reconstruction and decide when to stop the process according to an intelligent numerical observer instead of using a traditional stopping rule, such as a fixed error threshold or a maximum number of iterations. After training on ground truth images, the CNN was successful in guiding an iterative reconstruction process to yield high-quality images. Our second task is to improve a sinogram to correct for artifacts caused by metal objects. A large number of interpolation and normalization-based schemes were introduced for metal artifact reduction (MAR) over the past four decades. The NMAR algorithm is considered a state-of-the-art method, although residual errors often remain in the reconstructed images, especially in cases of multiple metal objects. Here we merge NMAR with deep learning in the projection domain to achieve additional correction in critical image regions. Our results indicate that deep learning can be a viable tool to address CT reconstruction challenges.
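
    The interpolation-based MAR family that NMAR refines can be sketched as follows; NMAR additionally normalizes the sinogram by a prior image's sinogram before interpolating, and the paper then adds a CNN correction in the projection domain, neither of which is shown here.

```python
import numpy as np

def interpolate_metal_trace(sinogram, metal_mask):
    """Classical interpolation-based metal artifact reduction step:
    for each projection angle, detector bins crossing metal are
    replaced by linear interpolation from the neighbouring clean bins.

    sinogram   : (n_angles, n_bins) projection data
    metal_mask : same shape, 1 where the ray passes through metal
    """
    out = np.asarray(sinogram, float).copy()
    bins = np.arange(out.shape[1])
    for i in range(out.shape[0]):
        bad = metal_mask[i].astype(bool)
        if bad.any() and not bad.all():
            out[i, bad] = np.interp(bins[bad], bins[~bad], out[i, ~bad])
    return out
```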

  15. Reconstruction of network topology using status-time-series data

    NASA Astrophysics Data System (ADS)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and the structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information about the network structure can help to devise control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available STS data using matrix analysis. The proposed method of network reconstruction is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method. Our method outperforms a compressed sensing theory (CST) based method of network reconstruction from STS data. Further, the same procedure is applied to weighted networks, where the ordering of the edges is identified with high accuracy.
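
    The structure-dynamics dependency the method exploits can be illustrated with a deterministic toy case: under threshold dynamics (a node is active at t+1 iff at least one neighbour was active at t), a snapshot pair with a single active seed j reveals exactly column j of the adjacency matrix. This is a didactic sketch, not the paper's matrix analysis, which handles stochastic SIS data.

```python
import numpy as np

def reconstruct_from_sts(snapshots):
    """Read the adjacency matrix off status-time-series (STS) pairs
    produced by deterministic threshold dynamics.

    snapshots : list of (s_t, s_t1) 0/1 state vectors at consecutive
                steps. From a single-seed snapshot with seed j, the
                nodes activated at t+1 are exactly j's neighbours,
                i.e. column j of A.
    """
    n = len(snapshots[0][0])
    A = np.zeros((n, n), dtype=int)
    for s_t, s_t1 in snapshots:
        s_t, s_t1 = np.asarray(s_t), np.asarray(s_t1)
        if s_t.sum() == 1:                       # single-seed snapshot
            j = int(np.argmax(s_t))
            A[:, j] |= ((s_t1 == 1) & (s_t == 0)).astype(int)
    return A
```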

  16. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  17. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  18. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  19. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  20. 40 CFR 60.110b - Applicability and designation of affected facility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... an attachment to the notification required by 40 CFR 65.5(b). [52 FR 11429, Apr. 8, 1987, as amended... designation of affected facility. (a) Except as provided in paragraph (b) of this section, the affected..., reconstruction, or modification is commenced after July 23, 1984. (b) This subpart does not apply to storage...

  1. Tracking implementation and (un)intended consequences: a process evaluation of an innovative peripheral health facility financing mechanism in Kenya.

    PubMed

    Waweru, Evelyn; Goodman, Catherine; Kedenge, Sarah; Tsofa, Benjamin; Molyneux, Sassy

    2016-03-01

    In many African countries, user fees have failed to achieve intended access and quality of care improvements. Subsequent user fee reduction or elimination policies have often been poorly planned, without alternative sources of income for facilities. We describe early implementation of an innovative national health financing intervention in Kenya: the health sector services fund (HSSF). In HSSF, central funds are credited directly into a facility's bank account quarterly, and facility funds are managed by health facility management committees (HFMCs) including community representatives. HSSF is therefore a finance mechanism with potential to increase access to funds for peripheral facilities, support user fee reduction and improve equity in access. We conducted a process evaluation of HSSF implementation based on a theory of change underpinning the intervention. Methods included interviews at national, district and facility levels, facility record reviews, a structured exit survey and a document review. We found impressive achievements: HSSF funds were reaching facilities; funds were being overseen and used in a way that strengthened transparency and community involvement; and health workers' motivation and patient satisfaction improved. Challenges or unintended outcomes included: complex and centralized accounting requirements undermining efficiency; interactions between HSSF and user fees leading to difficulties in accessing crucial user fee funds; and some relationship problems between key players. Although user fees charged had not increased, national reduction policies were still not being adhered to. Finance mechanisms can have a strong positive impact on peripheral facilities, and HFMCs can play a valuable role in managing facilities. Although fiduciary oversight is essential, mechanisms should allow for local decision-making and ensure that unmanageable paperwork is avoided. There are also limits to what can be achieved with relatively small funds in

  2. Data Management Facility Operations Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keck, Nicole N

    2014-06-30

    The Data Management Facility (DMF) is the data center that houses several critical Atmospheric Radiation Measurement (ARM) Climate Research Facility services, including first-level data processing for the ARM Mobile Facilities (AMFs), Eastern North Atlantic (ENA), North Slope of Alaska (NSA), Southern Great Plains (SGP), and Tropical Western Pacific (TWP) sites, as well as Value-Added Product (VAP) processing, development systems, and other network services.

  3. ACL reconstruction - discharge

    MedlinePlus

    Anterior cruciate ligament reconstruction - discharge; ACL reconstruction - discharge ... had surgery to reconstruct your anterior cruciate ligament (ACL). The surgeon drilled holes in the bones of ...

  4. Mines and mineral processing facilities in the vicinity of the March 11, 2011, earthquake in northern Honshu, Japan

    USGS Publications Warehouse

    Menzie, W. David; Baker, Michael S.; Bleiwas, Donald I.; Kuo, Chin

    2011-01-01

    U.S. Geological Survey data indicate that the area affected by the March 11, 2011, magnitude 9.0 earthquake and associated tsunami is home to nine cement plants, eight iodine plants, four iron and steel plants, four limestone mines, three copper refineries, two gold refineries, two lead refineries, two zinc refineries, one titanium dioxide plant, and one titanium sponge processing facility. These facilities have the capacity to produce the following percentages of the world's nonfuel mineral production: 25 percent of iodine, 10 percent of titanium sponge (metal), 3 percent of refined zinc, 2.5 percent of refined copper, and 1.4 percent of steel. In addition, the nine cement plants contribute about one-third of Japan's annual cement production. The iodine is a byproduct of natural gas production at the Minami Kanto gas field, east of Tokyo in Chiba Prefecture. Japan is the world's second leading producer of iodine (after Chile), which is processed in seven nearby facilities.

  5. 3D noise power spectrum applied on clinical MDCT scanners: effects of reconstruction algorithms and reconstruction filters

    NASA Astrophysics Data System (ADS)

    Miéville, Frédéric A.; Bolard, Gregory; Benkreira, Mohamed; Ayestaran, Paul; Gudinchet, François; Bochud, François; Verdun, Francis R.

    2011-03-01

    The noise power spectrum (NPS) is the reference metric for understanding the noise content in computed tomography (CT) images. To evaluate the noise properties of clinical multidetector CT (MDCT) scanners, local 2D and 3D NPSs were computed for different acquisition and reconstruction parameters. A 64-slice and a 128-slice MDCT scanner were employed. Measurements were performed on a water phantom in axial and helical acquisition modes. The CT dose index was identical for both installations. The influence of parameters such as the pitch, the reconstruction filter (soft, standard and bone) and the reconstruction algorithm (filtered back-projection (FBP), adaptive statistical iterative reconstruction (ASIR)) was investigated. Images were also reconstructed in the coronal plane using a reformat process. Then 2D and 3D NPS methods were computed. In axial acquisition mode, the 2D axial NPS showed an important magnitude variation as a function of the z-direction when measured at the phantom center. In helical mode, a directional dependency with a lobular shape was observed while the magnitude of the NPS was kept constant. Important effects of the reconstruction filter, pitch and reconstruction algorithm were observed on 3D NPS results for both MDCTs. With ASIR, a reduction of the NPS magnitude and a shift of the NPS peak to the low-frequency range were visible. The 2D coronal NPS obtained from the reformatted images was impacted by the interpolation when compared to the 2D coronal NPS obtained from 3D measurements. The noise properties of volumes measured on last-generation MDCT scanners were thus studied using the local 3D NPS metric; however, the impact of noise non-stationarity may need further investigation.

  6. 40 CFR 60.750 - Applicability, designation of affected facility, and delegation of authority.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards of Performance for Municipal Solid Waste Landfills § 60.750 Applicability, designation of affected facility, and delegation of authority. (a) The provisions of this subpart apply to each municipal solid waste landfill that commenced construction, reconstruction or modification on or after May 30, 1991...

  7. 40 CFR 60.750 - Applicability, designation of affected facility, and delegation of authority.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards of Performance for Municipal Solid Waste Landfills § 60.750 Applicability, designation of affected facility, and delegation of authority. (a) The provisions of this subpart apply to each municipal solid waste landfill that commenced construction, reconstruction or modification on or after May 30, 1991...

  8. A first look at reconstructed data from the GlueX detector

    NASA Astrophysics Data System (ADS)

    Taylor, Simon; GlueX Collaboration

    2015-10-01

    Construction of the GlueX detector in Hall D at the Thomas Jefferson National Accelerator Facility has recently been completed as part of the 12 GeV Upgrade to the facility. The detector consists of a barrel region containing devices for tracking charged particles and a lead-scintillator calorimeter for detecting photons, and a forward region consisting of two layers of scintillator paddles for time-of-flight measurements and a lead-glass electromagnetic calorimeter. The electron beam from the accelerator is converted into a photon beam by inserting a diamond radiator, thereby producing a coherent bremsstrahlung spectrum of photons impinging on a 30 cm-long LH2 target. The energy of the photon beam is determined using a tagging spectrometer. A commissioning run took place in Spring of 2015 during which all of the detector components were read out. Preliminary calibrations have been determined to a level sufficient to allow reconstruction of final states with several charged tracks and neutral particles. A first look at results of reconstruction of events using the GlueX detector will be presented. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics under Contract DE-AC05-06OR23177.

  9. High data volume and transfer rate techniques used at NASA's image processing facility

    NASA Technical Reports Server (NTRS)

    Heffner, P.; Connell, E.; Mccaleb, F.

    1978-01-01

    Data storage and transfer operations at a new image processing facility are described. The equipment includes high density digital magnetic tape drives and specially designed controllers to provide an interface between the tape drives and computerized image processing systems. The controller performs the functions necessary to convert the continuous serial data stream from the tape drive to a word-parallel blocked data stream which then goes to the computer-based system. With regard to the tape packing density, 1.8 × 10^10 data bits are stored on a reel of one-inch tape. System components and their operation are surveyed, and studies on advanced storage techniques are summarized.

  10. Sparsity-constrained PET image reconstruction with learned dictionaries

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Yang, Bao; Wang, Yanhua; Ying, Leslie

    2016-09-01

    PET imaging plays an important role in scientific and clinical measurement of biochemical and physiological processes. Model-based PET image reconstruction, such as the iterative expectation-maximization algorithm seeking the maximum-likelihood solution, leads to increased noise at higher iterations. The maximum a posteriori (MAP) estimate removes this divergence. However, a conventional smoothing prior or a total-variation (TV) prior in a MAP reconstruction algorithm causes over-smoothing or blocky artifacts in the reconstructed images. We propose to use dictionary learning (DL) based sparse signal representation in the formation of the prior for MAP PET image reconstruction. The dictionary used to sparsify the PET images in the reconstruction process is learned from various training images, including the corresponding MR structural image and a self-created hollow sphere. Using simulated and patient brain PET data with corresponding MR images, we study the performance of the DL-MAP algorithm and compare it quantitatively with a conventional MAP algorithm, a TV-MAP algorithm, and a patch-based algorithm. The DL-MAP algorithm achieves improved bias and contrast (or regional mean values) at noise comparable to what the other MAP algorithms acquire. The dictionary learned from the hollow sphere leads to results similar to those from the dictionary learned from the corresponding MR image. Achieving robust performance in various noise-level simulations and patient studies, the DL-MAP algorithm with a general dictionary demonstrates its potential in quantitative PET imaging.

  11. Outcomes of short-gap sensory nerve injuries reconstructed with processed nerve allografts from a multicenter registry study.

    PubMed

    Rinker, Brian D; Ingari, John V; Greenberg, Jeffrey A; Thayer, Wesley P; Safa, Bauback; Buncke, Gregory M

    2015-06-01

    Short-gap digital nerve injuries are a common surgical problem, but the optimal treatment modality is unknown. A multicenter database was queried and analyzed to determine the outcomes of nerve gap reconstructions between 5 and 15 mm with processed nerve allograft. The current RANGER registry is designed to continuously monitor and compile injury, repair, safety, and outcomes data. Centers followed their own standard of care for treatment and follow-up. The database was queried for digital nerve injuries with a gap between 5 and 15 mm reporting sufficient follow-up data to complete outcomes analysis. Available quantitative outcome measures were reviewed and reported. Meaningful recovery was defined by the Medical Research Council Classification (MRCC) scale at S3-S4 for sensory function. Sufficient follow-up data were available for 24 subjects (37 repairs) in the prescribed gap range. Mean age was 43 years (range, 23-81). Mean gap was 11 ± 3 (5-15) mm. Time to repair was 13 ± 42 (0-215) days. There were 25 lacerations, 8 avulsion/amputations, 2 gunshots, 1 crush injury, and 1 injury of unknown mechanism. Meaningful recovery, defined as S3-S4 on the MRCC scales, was reported in 92% of repairs. Sensory recovery of S3+ or S4 was observed in 84% of repairs. Static 2PD was 7.1 ± 2.9 mm (n = 19). Return to light touch was observed in 23 out of 32 repairs reporting Semmes-Weinstein monofilament outcomes (SWMF). There were no reported nerve adverse events. Sensory outcomes for processed nerve allografts were equivalent to historical controls for nerve autograft and exceed those of conduit. Processed nerve allografts provide an effective solution for short-gap digital nerve reconstructions.

  12. Wind reconstruction algorithm for Viking Lander 1

    NASA Astrophysics Data System (ADS)

    Kynkäänniemi, Tuomas; Kemppinen, Osku; Harri, Ari-Matti; Schmidt, Walter

    2017-06-01

    The wind measurement sensors of Viking Lander 1 (VL1) were only fully operational for the first 45 sols of the mission. We have developed an algorithm for reconstructing the wind measurement data after the wind measurement sensor failures. The algorithm for wind reconstruction enables the processing of wind data during the complete VL1 mission. The heater element of the quadrant sensor, which provided auxiliary measurement for wind direction, failed during the 45th sol of the VL1 mission. Additionally, one of the wind sensors of VL1 broke down during sol 378. Regardless of the failures, it was still possible to reconstruct the wind measurement data, because the failed components of the sensors did not prevent the determination of the wind direction and speed, as some of the components of the wind measurement setup remained intact for the complete mission. This article concentrates on presenting the wind reconstruction algorithm and methods for validating the operation of the algorithm. The algorithm enables the reconstruction of wind measurements for the complete VL1 mission. The amount of available sols is extended from 350 to 2245 sols.

  13. Current Sensor Fault Reconstruction for PMSM Drives

    PubMed Central

    Huang, Gang; Luo, Yi-Ping; Zhang, Chang-Fan; He, Jing; Huang, Yi-Shan

    2016-01-01

    This paper deals with a current sensor fault reconstruction algorithm for the torque closed-loop drive system of an interior PMSM. First, sensor faults are equated to actuator faults by a newly introduced state variable. Then, in αβ coordinates, based on the motor model with active flux linkage, a current observer is constructed with a specific sliding-mode equivalent control methodology to eliminate the effects of unknown disturbances, and the phase current sensor faults are reconstructed by means of an adaptive method. Finally, an αβ-axis current fault processing module is designed based on the reconstructed value. The feasibility and effectiveness of the proposed method are verified by simulation and experimental tests on the RT-LAB platform. PMID:26840317

  14. People, Process and Technology: Strategies for Assuring Sustainable Implementation of EMRs at Public-Sector Health Facilities in Kenya

    PubMed Central

    Kang’a, Samuel G.; Muthee, Veronica M.; Liku, Nzisa; Too, Diana; Puttkammer, Nancy

    2016-01-01

    The Ministry of Health (MoH) rollout of electronic medical record systems (EMRs) has continuously been embraced across health facilities in Kenya since 2012. This has been driven by a government-led process, supported by PEPFAR, that recommended standardized systems for facilities. Various strategies were deployed to assure meaningful and sustainable EMR implementation: sensitization of leadership; user training; formation of health facility-level multi-disciplinary teams; formation of county-level Technical Working Groups (TWGs); data migration; routine data quality assessments; point-of-care adoption; successive release of software upgrades; and power provision. Successes recorded include goodwill and leadership from county management (22 counties), growth in the number of EMR-trained users (2561 health care workers), collaboration in, among other things, data migration (90 health facilities completed) and establishment of county TWGs (13 TWGs). Sustained demand for EMRs across facilities is possible through county TWG oversight, timely resolution of users’ issues and provision of reliable power. PMID:28269864

  15. People, Process and Technology: Strategies for Assuring Sustainable Implementation of EMRs at Public-Sector Health Facilities in Kenya.

    PubMed

    Kang'a, Samuel G; Muthee, Veronica M; Liku, Nzisa; Too, Diana; Puttkammer, Nancy

    2016-01-01

    The Ministry of Health (MoH) rollout of electronic medical record systems (EMRs) has continuously been embraced across health facilities in Kenya since 2012. This has been driven by a government-led process, supported by PEPFAR, that recommended standardized systems for facilities. Various strategies were deployed to assure meaningful and sustainable EMR implementation: sensitization of leadership; user training; formation of health facility-level multi-disciplinary teams; formation of county-level Technical Working Groups (TWGs); data migration; routine data quality assessments; point-of-care adoption; successive release of software upgrades; and power provision. Successes recorded include goodwill and leadership from county management (22 counties), growth in the number of EMR-trained users (2561 health care workers), collaboration in, among other things, data migration (90 health facilities completed) and establishment of county TWGs (13 TWGs). Sustained demand for EMRs across facilities is possible through county TWG oversight, timely resolution of users' issues and provision of reliable power.

  16. A protocol for generating a high-quality genome-scale metabolic reconstruction.

    PubMed

    Thiele, Ines; Palsson, Bernhard Ø

    2010-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the last 10 years. These reconstructions represent structured knowledge bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates a myriad of computational biological studies, including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge bases. Here we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction, as well as the common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process.

  17. A protocol for generating a high-quality genome-scale metabolic reconstruction

    PubMed Central

    Thiele, Ines; Palsson, Bernhard Ø.

    2011-01-01

    Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have developed over the past 10 years. These reconstructions represent structured knowledge-bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates myriad computational biological studies including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics, and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge-bases. Here, we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction as well as common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process. PMID:20057383

  18. SPECT reconstruction using DCT-induced tight framelet regularization

    NASA Astrophysics Data System (ADS)

    Zhang, Jiahan; Li, Si; Xu, Yuesheng; Schmidtlein, C. R.; Lipson, Edward D.; Feiglin, David H.; Krol, Andrzej

    2015-03-01

    Wavelet transforms have been successfully applied in many fields of image processing. Yet, to our knowledge, they have never been directly incorporated into the objective function in Emission Computed Tomography (ECT) image reconstruction. Our aim has been to investigate whether the ℓ1-norm of non-decimated discrete cosine transform (DCT) coefficients of the estimated radiotracer distribution could be effectively used as the regularization term for penalized-likelihood (PL) reconstruction, where a regularizer is used to enforce image smoothness in the reconstruction. In this study, the ℓ1-norm of the 2D DCT wavelet decomposition was used as the regularization term. The Preconditioned Alternating Projection Algorithm (PAPA), which we proposed in earlier work to solve PL reconstruction with non-differentiable regularizers, was used to solve this optimization problem. The DCT wavelet decompositions were performed on the transaxial reconstructed images. We reconstructed Monte Carlo-simulated SPECT data obtained for a numerical phantom with Gaussian blobs as hot lesions and with a warm random lumpy background. Images reconstructed using the proposed method exhibited better noise suppression and improved lesion conspicuity, compared with images reconstructed using the expectation-maximization (EM) algorithm with a Gaussian post-filter (GPF). Also, the mean square error (MSE) was smaller, compared with EM-GPF. A critical and challenging aspect of this method was the selection of optimal parameters. In summary, our numerical experiments demonstrated that the ℓ1-norm DCT wavelet-frame regularizer shows promise for SPECT image reconstruction with the PAPA method.
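
    The building block of this regularizer, the ℓ1-norm of DCT coefficients of the current estimate, can be illustrated directly from the DCT-II definition. The sketch below uses a plain orthonormal 1D DCT rather than the paper's non-decimated 2D wavelet-frame decomposition, and the example signals are invented; it shows why the ℓ1 penalty favors smooth estimates.

```python
import math

def dct_ii(x):
    """Orthonormal DCT-II of a 1D signal, straight from the definition."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[m] * math.cos(math.pi * (m + 0.5) * k / n) for m in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def l1_dct_penalty(x):
    """The regularization term: sum of absolute DCT coefficients."""
    return sum(abs(c) for c in dct_ii(x))

# A constant (smooth) signal concentrates all energy in one coefficient,
# so its l1-norm equals its l2-norm; an oscillating signal spreads energy
# over several coefficients and pays a strictly larger penalty.
smooth = [1.0] * 8
noisy = [1.0, -1.0] * 4
```

    Minimizing this penalty therefore pushes the estimate toward signals that are sparse in the DCT domain, i.e. smooth images, which is exactly the role the regularizer plays inside the PL objective.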

  19. Fluids and Combustion Facility Acoustic Emissions Controlled by Aggressive Low-Noise Design Process

    NASA Technical Reports Server (NTRS)

    Cooper, Beth A.; Young, Judith A.

    2004-01-01

    The Fluids and Combustion Facility (FCF) is a dual-rack microgravity research facility that is being developed by Northrop Grumman Information Technology (NGIT) for the International Space Station (ISS) at the NASA Glenn Research Center. As an on-orbit test bed, FCF will host a succession of experiments in fluid and combustion physics. The Fluids Integrated Rack (FIR) and the Combustion Integrated Rack (CIR) must meet ISS acoustic emission requirements (ref. 1), which support speech communication and hearing-loss-prevention goals for ISS crew. To meet these requirements, the NGIT acoustics team implemented an aggressive low-noise design effort that incorporated frequent acoustic emission testing for all internal noise sources, larger-scale systems, and fully integrated racks (ref. 2). Glenn's Acoustical Testing Laboratory (ref. 3) provided acoustical testing services (see the following photograph) as well as specialized acoustical engineering support as part of the low-noise design process (ref. 4).

  20. Posttraumatic thumb reconstruction.

    PubMed

    Muzaffar, Arshad R; Chao, James J; Friedrich, Jeffrey B

    2005-10-01

    After reading this article, the reader should be able to: 1. Discuss the critical anatomic features of the thumb as they affect reconstructive decision making. 2. Define the goals of reconstruction. 3. Discuss an algorithm for thumb reconstruction according to the level of amputation. 4. Understand the role of prosthetics in thumb reconstruction. The function of the thumb is critical to overall hand function. Uniquely endowed with anatomic features that allow circumduction and opposition, the thumb enables activities of pinch, grasp, and fine manipulation that are essential in daily life. Destruction of the thumb secondary to trauma represents a much more significant loss than would result from loss of any other digit. Therefore, significant effort has been focused on thumb reconstruction. Numerous techniques have been described, ranging from simple osteoplastic techniques to complex microsurgical procedures. With an appreciation of the unique anatomic properties of the thumb, the hand surgeon is better able to understand the goals of thumb reconstruction and to develop an algorithm for thumb reconstruction. With such an understanding, an individualized reconstructive plan can be developed for each patient. A great many options are available for posttraumatic thumb reconstruction. Optimal results are obtained by pursuing an organized and logical approach to reconstruction based upon the level of tissue loss. Reconstruction methods depend on the location of the amputation and range from homodigital and heterodigital flaps to partial-toe transfer or a great-toe wrap-around flap to first-web-space deepening using Z-plasties, a dorsal rotation flap, or a distant flap, to distraction osteogenesis, lengthening of the thumb ray, spare parts from another injured digit in the acute setting for pollicization or heterotopic replantation, and microvascular toe transfer. Amputations in the distal third of the thumb are generally well-tolerated. The primary reconstructive

  1. Joint MR-PET reconstruction using a multi-channel image regularizer

    PubMed Central

    Koesters, Thomas; Otazo, Ricardo; Bredies, Kristian; Sodickson, Daniel K

    2016-01-01

    While current state-of-the-art MR-PET scanners enable simultaneous MR and PET measurements, the acquired data sets are still usually reconstructed separately. We propose a new multi-modality reconstruction framework using second-order Total Generalized Variation (TGV) as a dedicated multi-channel regularization functional that jointly reconstructs images from both modalities. In this way, information about the underlying anatomy is shared during the image reconstruction process while unique differences are preserved. Results from numerical simulations and in-vivo experiments using a range of accelerated MR acquisitions and different MR image contrasts demonstrate improved PET image quality, resolution, and quantitative accuracy. PMID:28055827

  2. ADART: an adaptive algebraic reconstruction algorithm for discrete tomography.

    PubMed

    Maestre-Deusto, F Javier; Scavello, Giovanni; Pizarro, Joaquín; Galindo, Pedro L

    2011-08-01

    In this paper we suggest an algorithm based on the Discrete Algebraic Reconstruction Technique (DART) which is capable of computing high quality reconstructions from substantially fewer projections than required for conventional continuous tomography. Adaptive DART (ADART) goes a step further than DART in reducing the number of unknowns of the associated linear system, achieving a significant reduction in the pixel error rate of reconstructed objects. The proposed methodology automatically adapts the border definition criterion at each iteration, resulting in a reduction of the number of pixels belonging to the border, and consequently of the number of unknowns in the general algebraic reconstruction linear system to be solved, with this reduction being especially important at the final stage of the iterative process. Experimental results show that reconstruction errors are considerably reduced using ADART when compared to the original DART, both in clean and noisy environments.

  3. Node 2 and Japanese Experimental Module (JEM) In Space Station Processing Facility

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Lining the walls of the Space Station Processing Facility at the Kennedy Space Center (KSC) are the launch-awaiting U.S. Node 2 (lower left) and the first pressurized module of the Japanese Experimental Module (JEM) (upper right), named 'Kibo' (Hope). Node 2, the 'utility hub' and second of three connectors between International Space Station (ISS) modules, was built in the Torino, Italy facility of Alenia Spazio, an international contractor based in Rome. Japan's major contribution to the station, the JEM, was built by the National Space Development Agency of Japan (NASDA) at the Tsukuba Space Center near Tokyo and will expand research capabilities aboard the station. Both were part of an agreement between NASA and the European Space Agency (ESA). The Node 2 will be the next pressurized module installed on the Station. Once the Japanese and European laboratories are attached to it, the roomier Station will expand from the equivalent space of a 3-bedroom house to that of a 5-bedroom house. The Marshall Space Flight Center in Huntsville, Alabama, manages the Node program for NASA.

  4. Photoacoustic image reconstruction: a quantitative analysis

    NASA Astrophysics Data System (ADS)

    Sperl, Jonathan I.; Zell, Karin; Menzenbach, Peter; Haisch, Christoph; Ketzer, Stephan; Marquart, Markus; Koenig, Hartmut; Vogel, Mika W.

    2007-07-01

    Photoacoustic imaging is a promising new way to generate unprecedented contrast in ultrasound diagnostic imaging. It differs from other medical imaging approaches in that it provides spatially resolved information about the optical absorption of targeted tissue structures. Because the data acquisition process deviates from standard clinical ultrasound, the choice of a proper image reconstruction method is crucial for successful application of the technique. Multiple approaches have been advocated in the literature, and the purpose of this paper is to compare four reconstruction techniques, focusing on resolution limits, stability, reconstruction speed, and SNR. We generated experimental and simulated data and reconstructed images of the pressure distribution using four different methods: delay-and-sum (DnS), circular backprojection (CBP), generalized 2D Hough transform (HTA), and Fourier transform (FTA). All methods were able to depict the point sources properly. DnS and CBP produce blurred images containing typical superposition artifacts. The HTA provides excellent SNR and allows good point-source separation. The FTA is the fastest and shows the best FWHM. In our study, we found the FTA to show the best overall performance: it allows a very fast and theoretically exact reconstruction. Only a hardware-implemented DnS might be faster and enable real-time imaging. A commercial system may also offer several methods to fully utilize the new contrast mechanism and guarantee optimal resolution and fidelity.
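Of the four methods compared, delay-and-sum is the simplest to state: for every image pixel, each sensor's time trace is sampled at the acoustic time-of-flight from the pixel to that sensor, and the samples are summed. The sketch below is a generic, idealized NumPy version under assumed names, not the authors' implementation.

```python
import numpy as np

def delay_and_sum(signals, sensor_xy, grid_xy, c, fs):
    """Naive delay-and-sum photoacoustic reconstruction.
    signals:   (n_sensors, n_samples) recorded pressure traces
    sensor_xy: (n_sensors, 2) sensor positions [m]
    grid_xy:   (n_pixels, 2) image pixel positions [m]
    c:         speed of sound [m/s], fs: sampling rate [Hz]."""
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for s in range(n_sensors):
        # time-of-flight from every pixel to this sensor, in samples
        dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)
        idx = np.clip(np.round(dist / c * fs).astype(int), 0, n_samples - 1)
        image += signals[s, idx]
    return image
```

With a point source at the array center, all sensor delays align there and the image peaks at the true source position; off-center pixels accumulate the blurred superposition artifacts the abstract mentions.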

  5. Patient involvement in the decision-making process improves satisfaction and quality of life in postmastectomy breast reconstruction.

    PubMed

    Ashraf, Azra A; Colakoglu, Salih; Nguyen, John T; Anastasopulos, Alexandra J; Ibrahim, Ahmed M S; Yueh, Janet H; Lin, Samuel J; Tobias, Adam M; Lee, Bernard T

    2013-09-01

    The patient-physician relationship has evolved from the paternalistic, physician-dominant model to the shared-decision-making and informed-consumerist model. The level of patient involvement in this decision-making process can potentially influence patient satisfaction and quality of life. In this study, patient-physician decision models are evaluated in patients undergoing postmastectomy breast reconstruction. All women who underwent breast reconstruction at an academic hospital from 1999-2007 were identified. Patients meeting inclusion criteria were mailed questionnaires at a minimum of 1 y postoperatively with questions about decision making, satisfaction, and quality of life. There were 707 women eligible for our study and 465 completed surveys (68% response rate). Patients were divided into one of three groups: paternalistic (n = 18), informed-consumerist (n = 307), shared (n = 140). There were differences in overall general satisfaction (P = 0.034), specifically comparing the informed group to the paternalistic group (66.7% versus 38.9%, P = 0.020) and the shared to the paternalistic group (69.3% versus 38.9%, P = 0.016). There were no differences in aesthetic satisfaction. There were differences found in the SF-12 physical component summary score across all groups (P = 0.033), and a difference was found between the informed and paternalistic groups (P < 0.05). There were no differences in the mental component score (P = 0.42). Women undergoing breast reconstruction predominantly used the informed model of decision making. Patients who adopted a more active role, whether using an informed or shared approach, had higher general patient satisfaction and physical component summary scores compared with patients whose decision making was paternalistic. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. 3D medical volume reconstruction using web services.

    PubMed

    Kooper, Rob; Shirk, Andrew; Lee, Sang-Chul; Lin, Amy; Folberg, Robert; Bajcsy, Peter

    2008-04-01

    We address the problem of 3D medical volume reconstruction using web services. The use of web services is motivated by the fact that 3D medical volume reconstruction requires significant computer resources and human expertise in both the medical and computer science areas. The web services are implemented as an additional layer to a dataflow framework called Data to Knowledge. In the collaboration between UIC and NCSA, pre-processed input images at NCSA are made accessible to medical collaborators for registration. Every time the UIC medical collaborators inspect images and select corresponding features for registration, the web service at NCSA is contacted and the registration processing query is executed using the Image to Knowledge library of registration methods. Co-registered frames are returned for verification by the medical collaborators in a new window. In this paper, we present the 3D volume reconstruction problem requirements and the architecture of the prototype system developed at http://isda.ncsa.uiuc.edu/MedVolume. We also explain the tradeoffs of our system design and provide experimental data to support our system implementation. The prototype system has been used for multiple 3D volume reconstructions of blood vessels and vasculogenic mimicry patterns in histological sections of uveal melanoma studied by fluorescent confocal laser scanning microscopy.

  7. Trauma pancreaticoduodenectomy for complex pancreaticoduodenal injury. Delayed reconstruction.

    PubMed

    Gupta, Vikas; Wig, Jai Dev; Garg, Harsh

    2008-09-02

    To assess the feasibility and safety of the delayed reconstruction approach in patients with complex pancreaticoduodenal injuries. Tertiary care center in Northern India. Five patients with complex pancreaticoduodenal injuries, three following blunt and two following penetrating injury. All patients underwent a pancreaticoduodenectomy. T-tube drainage of the common bile duct and external tube drainage of the pancreatic duct were established. A wide bore tube drain was left in the right upper abdomen. The postoperative course was uneventful in four patients. One patient died from coagulopathy on the 4th postoperative day. Delayed reconstruction was carried out in four patients. In one patient, a pancreaticojejunal anastomosis could not be performed. The postoperative period was uneventful and no patient had a biliary or a pancreatic leak. All four patients are well on follow-up. Delayed reconstruction in complex pancreaticoduodenal injuries is a feasible and viable option as was demonstrated by this study. Controlled external tube drainage of the bile and pancreatic ducts facilitates postoperative care and prevents on-going contamination of the peritoneal cavity with bile and pancreatic juice. Leaving behind the uncinate process shortens the operating time with less blood loss. Planned reconstruction is carried out once the inflammatory process has settled.

  8. The Chandra X-ray Observatory removed from its container in the Vertical Processing Facility

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Inside the Vertical Processing Facility (VPF), the overhead crane lifts the Chandra X-ray Observatory completely out of its protective container. While in the VPF, the telescope will undergo final installation of associated electronic components; it will also be tested, fueled and mated with the Inertial Upper Stage booster. A set of integrated tests will follow. Chandra is scheduled for launch July 9 aboard Space Shuttle Columbia, on mission STS-93. Formerly called the Advanced X-ray Astrophysics Facility, Chandra comprises three major elements: the spacecraft, the science instrument module (SIM), and the world's most powerful X-ray telescope. Chandra will allow scientists from around the world to see previously invisible black holes and high-temperature gas clouds, giving the observatory the potential to rewrite the books on the structure and evolution of our universe.

  9. The Chandra X-ray Observatory removed from its container in the Vertical Processing Facility

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Inside the Vertical Processing Facility (VPF), workers begin lifting the Chandra X-ray Observatory out of its protective container. While in the VPF, the telescope will undergo final installation of associated electronic components; it will also be tested, fueled and mated with the Inertial Upper Stage booster. A set of integrated tests will follow. Chandra is scheduled for launch July 9 aboard Space Shuttle Columbia, on mission STS-93. Formerly called the Advanced X-ray Astrophysics Facility, Chandra comprises three major elements: the spacecraft, the science instrument module (SIM), and the world's most powerful X-ray telescope. Chandra will allow scientists from around the world to see previously invisible black holes and high-temperature gas clouds, giving the observatory the potential to rewrite the books on the structure and evolution of our universe.

  10. [Design of an HACCP program for a cocoa processing facility].

    PubMed

    López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar

    2012-12-01

    The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agricultural Practices (GAP) audits of cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCPs) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, verification procedures, and documentation concerning all procedures and records appropriate to these principles and their application were established. Implementing and maintaining a HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separating the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as is validation of the winnowing step.

  11. The structure of reconstructed chalcopyrite surfaces

    NASA Astrophysics Data System (ADS)

    Thinius, Sascha; Islam, Mazharul M.; Bredow, Thomas

    2018-03-01

    Chalcopyrite (CuFeS2) surfaces are of major interest for copper exploitation in aqueous solution, called leaching. Since leaching is a surface process, knowledge of the surface structure, bonding pattern and oxidation states is important for improving its efficiency. At present such information is not available from experimental studies. Therefore a detailed computational study of chalcopyrite surfaces is performed. The structures of low-index stoichiometric chalcopyrite surfaces {hkl}, h, k, l ∈ {0, 1, 2}, have been studied with density functional theory (DFT) and global optimization strategies. We have applied ab initio molecular dynamics (MD) in combination with simulated annealing (SA) in order to explore possible reconstructions via a minima hopping (MH) algorithm. In almost all cases reconstruction involving substantial rearrangement has occurred, accompanied by a reduction of the surface energy. The analysis of the changes in the coordination sphere and of migration during reconstruction reveals that S-S dimers are formed on the surface. Further, it was observed that metal atoms near the surface move toward the bulk, forming metal alloys passivated by sulfur. The obtained surface energies of reconstructed surfaces are in the range of 0.53-0.95 J/m2.

  12. Bayesian reconstruction of projection reconstruction NMR (PR-NMR).

    PubMed

    Yoon, Ji Won

    2014-11-01

    Projection reconstruction nuclear magnetic resonance (PR-NMR) is a technique for generating multidimensional NMR spectra. A small number of projections from lower-dimensional NMR spectra are used to reconstruct the multidimensional NMR spectra. In our previous work, it was shown that multidimensional NMR spectra are efficiently reconstructed using peak-by-peak based reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. We propose an extended and generalized RJMCMC algorithm replacing a simple linear model with a linear mixed model to reconstruct close NMR spectra into true spectra. This statistical method generates samples in a Bayesian scheme. Our proposed algorithm is tested on a set of six projections derived from the three-dimensional 700 MHz HNCO spectrum of a protein HasA. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Feasibility Study for a Plasma Dynamo Facility to Investigate Fundamental Processes in Plasma Astrophysics. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forest, Cary B.

    The scientific equipment purchased on this grant was used on the Plasma Dynamo Prototype Experiment as part of Professor Forest's feasibility study for determining whether it would be worthwhile to propose building a larger plasma physics experiment to investigate various fundamental processes in plasma astrophysics. The initial research on the Plasma Dynamo Prototype Experiment was successful, so Professor Forest and Professor Ellen Zweibel at UW-Madison submitted an NSF Major Research Instrumentation proposal titled "ARRA MRI: Development of a Plasma Dynamo Facility for Experimental Investigations of Fundamental Processes in Plasma Astrophysics." They received funding for this project and the Plasma Dynamo Facility, also known as the "Madison Plasma Dynamo Experiment," was constructed. This experiment achieved its first plasma in the fall of 2012, and U.S. Dept. of Energy Grant No. DE-SC0008709, "Experimental Studies of Plasma Dynamos," now supports the research.

  14. Penile Reconstruction

    PubMed Central

    Salgado, Christopher J.; Chim, Harvey; Tang, Jennifer C.; Monstrey, Stan J.; Mardini, Samir

    2011-01-01

    A variety of surgical options exists for penile reconstruction. The key to success of therapy is holistic management of the patient, with attention to the psychological aspects of treatment. In this article, we review reconstructive modalities for various types of penile defects inclusive of partial and total defects as well as the buried penis, and also describe recent basic science advances, which may promise new options for penile reconstruction. PMID:22851914

  15. Parallel heterogeneous architectures for efficient OMP compressive sensing reconstruction

    NASA Astrophysics Data System (ADS)

    Kulkarni, Amey; Stanislaus, Jerome L.; Mohsenin, Tinoosh

    2014-05-01

    Compressive Sensing (CS) is a novel scheme in which a signal that is sparse in a known transform domain can be reconstructed using fewer samples. The signal reconstruction techniques are computationally intensive and have sluggish performance, which makes them impractical for real-time processing applications. This paper presents novel architectures for the Orthogonal Matching Pursuit (OMP) algorithm, one of the popular CS reconstruction algorithms. We show the implementation results of the proposed architectures on FPGA, ASIC and on a custom many-core platform. For the FPGA and ASIC implementations, a novel thresholding method is used to reduce the processing time for the optimization problem by at least 25%. For the custom many-core platform, efficient parallelization techniques are applied to reconstruct signals with varying signal lengths N and sparsity m. The algorithm is divided into three kernels. Each kernel is parallelized to reduce execution time, while efficient reuse of the matrix operators allows us to reduce area. Matrix operations are efficiently parallelized by taking advantage of blocked algorithms. For demonstration purposes, all architectures reconstruct a 256-length signal with a maximum sparsity of 8 using 64 measurements. The implementation on a Xilinx Virtex-5 FPGA requires 27.14 μs to reconstruct the signal using basic OMP, and 18 μs with the thresholding method. The ASIC implementation reconstructs the signal in 13 μs. Our custom many-core, operating at 1.18 GHz, takes 18.28 μs to complete. Our results show that, compared to previously published work on the same algorithm and matrix size, the proposed FPGA and ASIC architectures perform 1.3x and 1.8x faster, respectively. The proposed many-core implementation also performs 3000x faster than a CPU and 2000x faster than a GPU.
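The greedy loop that these architectures accelerate can be stated in a few lines. The following is a generic textbook NumPy version of OMP under assumed names, not the paper's FPGA/ASIC architecture or its thresholding variant: pick the dictionary column most correlated with the residual, then re-fit all selected coefficients by least squares.

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Orthogonal Matching Pursuit for y = Phi @ x with x sparse.
    Greedily selects the column of Phi most correlated with the
    current residual, then solves a least-squares problem over the
    selected support."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(Phi.shape[1])
    for _ in range(sparsity):
        # column with the largest correlation to the residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit all selected coefficients jointly
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x[support] = coef
    return x
```

The per-iteration work is exactly the matrix-vector products and small least-squares solves that the paper maps onto its three hardware kernels.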

  16. 10 CFR 95.17 - Processing facility clearance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... existing facility clearance granted by a current CSA and authorize possession of license or certificate... concerning the foreign intelligence threat, risk of unauthorized technology transfer, type and sensitivity of..., certificate holder, or other person must advise the NRC within 30 days of any significant events or changes...

  17. 10 CFR 95.17 - Processing facility clearance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... existing facility clearance granted by a current CSA and authorize possession of license or certificate... concerning the foreign intelligence threat, risk of unauthorized technology transfer, type and sensitivity of..., certificate holder, or other person must advise the NRC within 30 days of any significant events or changes...

  18. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    PubMed

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements, and the processing time to compute these large volumes may be considerable, so high performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors, in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
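SIRT, one of the two algorithms evaluated, updates all voxels simultaneously from whole-array expressions, which is exactly the access pattern that SIMD lanes handle well. A NumPy sketch of the SIRT iteration (ours; the paper's implementation uses hand-tuned processor SIMD extensions, not NumPy) is:

```python
import numpy as np

def sirt(A, b, n_iters=200, relax=1.0):
    """SIRT iteration: x <- x + relax * C A^T R (b - A x), where R and C
    hold the reciprocal row and column sums of the projection matrix A.
    Every voxel is updated at once via whole-array operations -- the
    vectorizable formulation, as opposed to per-element loops."""
    A = np.asarray(A, dtype=float)
    row_sum, col_sum = A.sum(axis=1), A.sum(axis=0)
    inv_row = np.divide(1.0, row_sum, out=np.zeros_like(row_sum), where=row_sum > 0)
    inv_col = np.divide(1.0, col_sum, out=np.zeros_like(col_sum), where=col_sum > 0)
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x += relax * inv_col * (A.T @ (inv_row * (b - A @ x)))
    return x
```

On a consistent system with full column rank the iteration converges to the exact solution; the relaxation factor must stay in (0, 2) for convergence.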

  19. Process Evaluation of Communitisation Programme in Public Sector Health Facilities, Mokokchung District, Nagaland, 2015.

    PubMed

    Tushi, Aonungdok; Kaur, Prabhdeep

    2017-01-01

    Public sector health facilities were poorly managed due to a history of conflict in Nagaland, India. The Government of Nagaland introduced the "Nagaland Communitisation of Public Institutions and Services Act" in 2002. The main objectives of the evaluation were to review the functioning of Health Center Managing Committees (HCMCs) and the delivery of health services in institutions managed by HCMCs, and to identify strengths as well as challenges perceived by HCMC members in the rural areas of Mokokchung district, Nagaland. The evaluation used input, process and output indicators. A doctor, the HCMC Chairman and one member from each of the three community health centers (CHCs) and four primary health centers (PHCs) were surveyed using a semi-structured questionnaire and an in-depth interview guide. Proportions were computed for the quantitative data and key themes were identified from the interviews. Overall, the infrastructure, equipment and outpatient/inpatient service availability were satisfactory. There was a lack of funds and a shortage of doctors, drugs and laboratory facilities. HCMCs were in place and carried out administrative activities. HCMCs felt ownership, mobilized community contributions and managed human resources. HCMC members had inadequate funds for their transport and training. They faced challenges in service delivery due to political interference and a lack of adequate human, material and financial resources. The communitisation program was operational in the district. HCMC members felt ownership of the health facilities. Administrative and political support and adequate funds from the government are needed for effective functioning of HCMCs and optimal service delivery in public sector facilities.

  20. Alpha image reconstruction (AIR): A new iterative CT image reconstruction approach using voxel-wise alpha blending

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, Christian; Sawall, Stefan; Knaup, Michael

    2014-06-15

    contrast factor for contrast-resolution plots. Furthermore, the authors calculate the contrast-to-noise ratio with the low contrast disks and the authors compare the agreement of the reconstructions with the ground truth by calculating the normalized cross-correlation and the root-mean-square deviation. To evaluate the clinical performance of the proposed method, the authors reconstruct patient data acquired with a Somatom Definition Flash dual source CT scanner (Siemens Healthcare, Forchheim, Germany). Results: The results of the simulation study show that among the compared algorithms AIR achieves the highest resolution and the highest agreement with the ground truth. Compared to the reference FBP reconstruction AIR is able to reduce the relative pixel noise by up to 50% and at the same time achieve a higher resolution by maintaining the edge information from the basis images. These results can be confirmed with the patient data. Conclusions: To evaluate the AIR algorithm simulated and measured patient data of a state-of-the-art clinical CT system were processed. It is shown that generating CT images through the reconstruction of weighting coefficients has the potential to improve the resolution-noise trade-off and thus to improve the dose usage in clinical CT.

  1. Pan-sharpening via compressed superresolution reconstruction and multidictionary learning

    NASA Astrophysics Data System (ADS)

    Shi, Cheng; Liu, Fang; Li, Lingling; Jiao, Licheng; Hao, Hongxia; Shang, Ronghua; Li, Yangyang

    2018-01-01

    In recent compressed sensing (CS)-based pan-sharpening algorithms, performance is affected by two key problems. One is that there are always errors between the high-resolution panchromatic (HRP) image and the linearly weighted high-resolution multispectral (HRM) image, resulting in a loss of spatial and spectral information. The other is that the dictionary construction process depends on non-truth training samples. These problems have limited the application of CS-based pan-sharpening algorithms. To solve them, we propose a pan-sharpening algorithm via compressed superresolution reconstruction and multidictionary learning. Through a two-stage implementation, the compressed superresolution reconstruction model effectively reduces the error between the HRP and the linearly weighted HRM images. Meanwhile, a multidictionary with ridgelets and curvelets is learned for both stages of the superresolution reconstruction process. Since ridgelets and curvelets better capture structural and directional characteristics, a better reconstruction result can be obtained. Experiments are done on QuickBird and IKONOS satellite images. The results indicate that the proposed algorithm is competitive with recent CS-based pan-sharpening methods and other well-known methods.

  2. Corrosion Testing of Monofrax K-3 Refractory in Defense Waste Processing Facility (DWPF) Alternate Reductant Feeds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.; Jantzen, C.; Burket, P.

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) uses a combination of reductants and oxidants while converting high level waste (HLW) to a borosilicate waste form. A reducing flowsheet is maintained to retain radionuclides in their reduced oxidation states, which promotes their incorporation into borosilicate glass. For the last 20 years of processing, the DWPF has used formic acid as the main reductant and nitric acid as the main oxidant. During reaction in the Chemical Process Cell (CPC), formate and formic acid release measurably significant H2 gas, which requires monitoring of certain vessels' vapor spaces. A switch from the nitric-formic (NF) flowsheet to a nitric acid-glycolic acid (NG) flowsheet is desired, as the NG flowsheet releases considerably less H2 gas upon decomposition. This would greatly simplify DWPF processing from a safety standpoint, as close monitoring of the H2 gas concentration could become less critical. In terms of the waste glass melter vapor space flammability, the switch from the NF flowsheet to the NG flowsheet also showed a reduction of H2 gas production from the vitrification process. Given the positive impact of the switch to glycolic acid on the flammability issues, the other impacts of glycolic acid on the facility must be examined.

  3. Modifications in SIFT-based 3D reconstruction from image sequence

    NASA Astrophysics Data System (ADS)

    Wei, Zhenzhong; Ding, Boshen; Wang, Wei

    2014-11-01

    In this paper, we aim to reconstruct 3D points of a scene from related images. The Scale Invariant Feature Transform (SIFT), a feature extraction and matching algorithm, has been proposed and improved over the years and is widely used in image alignment and stitching, image recognition and 3D reconstruction. Because of the robustness and reliability of SIFT's feature extraction and matching, we use it to find correspondences between images. Hence, we describe a SIFT-based method to reconstruct sparse 3D points from ordered images. In the matching stage, we modify the process of finding correct correspondences and obtain a satisfying matching result: rejecting "questioned" points before the initial matching makes the final matching more reliable. Given that SIFT is invariant to image scale, rotation, and environmental changes, we propose a way to delete the duplicated reconstructed points that occur in the sequential reconstruction procedure, which improves the accuracy of the reconstruction. By removing the duplicated points, we avoid the possible collapse caused by inexact initialization or error accumulation. The limitation, present in some approaches, that all reprojected points must be visible at all times also does not apply in our setting. Small gains in precision can make a big difference as the number of images increases. The paper contrasts the modified algorithm with the unmodified one. Moreover, we present an approach to evaluate the reconstruction by comparing reconstructed angles and length ratios with their actual values using a calibration target in the scene. The proposed evaluation method is easy to carry out and broadly applicable; even without Internet image datasets, we can evaluate our own results. The whole algorithm has been tested on several image sequences, both from the Internet and from our own shots.
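The kind of match pre-filtering described above, rejecting ambiguous correspondences before they enter the reconstruction, is commonly done with Lowe's ratio test on SIFT descriptors. A minimal NumPy sketch (generic; the paper's exact rejection criterion is not spelled out in the abstract) is:

```python
import numpy as np

def ratio_test_match(desc1, desc2, ratio=0.8):
    """Lowe's ratio test: accept a correspondence only when the
    nearest neighbour in desc2 is clearly closer than the
    second-nearest, i.e. d1 < ratio * d2 in descriptor space.
    desc1: (n1, d), desc2: (n2, d); returns a list of (i, j) pairs."""
    # pairwise squared distances via ||a-b||^2 = ||a||^2 - 2ab + ||b||^2
    d2 = (desc1**2).sum(1)[:, None] - 2 * desc1 @ desc2.T + (desc2**2).sum(1)[None, :]
    matches = []
    for i, row in enumerate(d2):
        j1, j2 = np.argsort(row)[:2]
        # compare squared distances, so the ratio is squared too
        if row[j1] < ratio**2 * row[j2]:
            matches.append((i, int(j1)))
    return matches
```

A descriptor that is nearly equidistant to its two best candidates is rejected as "questioned", while unambiguous descriptors survive to the geometric stage.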

  4. Gains in efficiency and scientific potential of continental climate reconstruction provided by the LRC LacCore Facility, University of Minnesota

    NASA Astrophysics Data System (ADS)

    Noren, A.; Brady, K.; Myrbo, A.; Ito, E.

    2007-12-01

    , and stores metadata and analytical data for all cores processed at the facility. Any researcher may submit sample requests for material in archived cores. Supplies for field (e.g., polycarbonate pipe, endcaps), lab (e.g., sample containers, pollen sample spike), and curation (e.g., D-tubes) are sold at cost. In collaboration with facility users, staff continually develop new equipment, supplies, and procedures as needed in order to provide the best and most comprehensive set of services to the research community.

  5. Parametric boundary reconstruction algorithm for industrial CT metrology application.

    PubMed

    Yin, Zhye; Khare, Kedar; De Man, Bruno

    2009-01-01

    High-energy X-ray computed tomography (CT) systems have recently been used to produce high-resolution images in various nondestructive testing and evaluation (NDT/NDE) applications. The accuracy of the dimensional information extracted from CT images is rapidly approaching the accuracy achieved with a coordinate measuring machine (CMM), the conventional approach for acquiring metrology information directly. CT systems, on the other hand, generate a sinogram, which is transformed mathematically into pixel-based images; the dimensional information of the scanned object is extracted afterwards by performing edge detection on the reconstructed CT images. The dimensional accuracy of this approach is limited by the grid size of the pixel-based representation of CT images, since the edge detection is performed on the pixel grid. Moreover, reconstructed CT images usually display various artifacts due to the underlying physical process, and object boundaries resulting from the edge detection fail to represent the true boundaries of the scanned object. In this paper, a novel algorithm to reconstruct the boundaries of an object with uniform material composition and uniform density is presented. There are three major benefits to the proposed approach. First, since the boundary parameters are reconstructed instead of image pixels, the complexity of the reconstruction algorithm is significantly reduced; the iterative approach, which can be computationally intensive, becomes practical with parametric boundary reconstruction. Second, the object of interest in metrology can be represented more directly and accurately by the boundary parameters instead of the image pixels. By eliminating the extra edge detection step, the overall dimensional accuracy and process time can be improved. Third, since the parametric reconstruction approach shares the boundary representation with other conventional metrology modalities such as CMM, boundary information from other modalities can be directly

  6. A Guide for Developing Standard Operating Job Procedures for the Screening & Grinding Process Wastewater Treatment Facility. SOJP No. 1.

    ERIC Educational Resources Information Center

    Deal, Gerald A.; Montgomery, James A.

    This guide describes standard operating job procedures for the screening and grinding process of wastewater treatment facilities. The objective of this process is the removal of coarse materials from the raw waste stream for the protection of subsequent equipment and processes. The guide gives step-by-step instructions for safety inspection,…

  7. A Guide for Developing Standard Operating Job Procedures for the Sludge Thickening Process Wastewater Treatment Facility. SOJP No. 9.

    ERIC Educational Resources Information Center

    Schwing, Carl M.

    This guide describes standard operating job procedures for the sludge thickening process of wastewater treatment facilities. The objective of this process is to concentrate sludge, reducing the volume to be handled by subsequent treatment units. The guide gives step-by-step instructions for safety inspection,…

  8. Adjustment of automatic control systems of production facilities at coal processing plants using multivariant physico- mathematical models

    NASA Astrophysics Data System (ADS)

    Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.

    2016-10-01

    The structure of multivariant physical and mathematical models of a control system is presented, together with its application to the adjustment of automatic control systems (ACS) of production facilities, using a coal processing plant as an example.

  9. Fast dictionary-based reconstruction for diffusion spectrum imaging.

    PubMed

    Bilgic, Berkin; Chatnuntawech, Itthi; Setsompop, Kawin; Cauley, Stephen F; Yendiki, Anastasia; Wald, Lawrence L; Adalsteinsson, Elfar

    2013-11-01

    Diffusion spectrum imaging reveals detailed local diffusion properties at the expense of substantially long imaging times. It is possible to accelerate acquisition by undersampling in q-space, followed by image reconstruction that exploits prior knowledge of the diffusion probability density functions (pdfs). Previously proposed methods impose this prior in the form of sparsity under wavelet and total variation transforms, or under adaptive dictionaries trained on example datasets to maximize the sparsity of the representation. These compressed sensing (CS) methods require full-brain processing times on the order of hours using MATLAB running on a workstation. This work presents two dictionary-based reconstruction techniques that use analytical solutions and are two orders of magnitude faster than the previously proposed dictionary-based CS approach. The first method generates a dictionary from the training data using principal component analysis (PCA) and performs the reconstruction in the PCA space. The second method applies reconstruction using a pseudoinverse with Tikhonov regularization with respect to a dictionary; this dictionary can either be obtained using the K-SVD algorithm, or it can simply be the training dataset of pdfs without any training. All of the proposed methods achieve reconstruction times on the order of seconds per imaging slice, with reconstruction quality comparable to that of the dictionary-based CS algorithm.
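The Tikhonov-regularized pseudoinverse has a simple closed form: with encoding operator Φ and dictionary D, the coefficients are ĉ = (AᵀA + λI)⁻¹Aᵀy for A = ΦD, so reconstruction reduces to one precomputable matrix multiply per voxel. A hedged numpy sketch with synthetic sizes and random stand-in matrices (not the authors' code or data):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 16           # pdf length, q-space samples kept, dictionary atoms

D = rng.standard_normal((n, k))    # dictionary of example pdfs (stand-in)
Phi = rng.standard_normal((m, n))  # undersampling/encoding operator (stand-in)
c_true = rng.standard_normal(k)
pdf_true = D @ c_true
y = Phi @ pdf_true                 # undersampled measurements

# Tikhonov-regularized pseudoinverse w.r.t. the dictionary:
#   c_hat = argmin_c ||A c - y||^2 + lam ||c||^2, with A = Phi @ D
A = Phi @ D
lam = 1e-6
c_hat = np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ y)
pdf_hat = D @ c_hat

rel_err = np.linalg.norm(pdf_hat - pdf_true) / np.linalg.norm(pdf_true)
print(rel_err)  # near zero: A has full column rank here, so recovery is near-exact
```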

  10. Statistical image reconstruction from correlated data with applications to PET

    PubMed Central

    Alessio, Adam; Sauer, Ken; Kinahan, Paul

    2008-01-01

    Most statistical reconstruction methods for emission tomography are designed for data modeled as conditionally independent Poisson variates. In reality, due to scanner detectors, electronics and data processing, correlations are introduced into the data resulting in dependent variates. In general, these correlations are ignored because they are difficult to measure and lead to computationally challenging statistical reconstruction algorithms. This work addresses the second concern, seeking to simplify the reconstruction of correlated data and provide a more precise image estimate than the conventional independent methods. In general, correlated variates have a large non-diagonal covariance matrix that is computationally challenging to use as a weighting term in a reconstruction algorithm. This work proposes two methods to simplify the use of a non-diagonal covariance matrix as the weighting term by (a) limiting the number of dimensions in which the correlations are modeled and (b) adopting flexible, yet computationally tractable, models for correlation structure. We apply and test these methods with simple simulated PET data and data processed with the Fourier rebinning algorithm which include the one-dimensional correlations in the axial direction and the two-dimensional correlations in the transaxial directions. The methods are incorporated into a penalized weighted least-squares 2D reconstruction and compared with a conventional maximum a posteriori approach. PMID:17921576
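For a linear system model A, the penalized weighted least-squares estimate with a non-diagonal data covariance Σ has the closed form x̂ = (AᵀΣ⁻¹A + βR)⁻¹AᵀΣ⁻¹y, which is exactly where a large correlated Σ becomes computationally awkward. A small illustrative numpy sketch with a banded covariance modeling one-dimensional (e.g. axial) correlations; all sizes and matrices are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 120, 40

A = rng.standard_normal((m, n))   # system (projection) matrix, stand-in
x_true = rng.random(n)

# Banded covariance: neighbouring measurements are correlated.
Sigma = 0.01 * (np.eye(m) + 0.4 * (np.eye(m, k=1) + np.eye(m, k=-1)))
L = np.linalg.cholesky(Sigma)
y = A @ x_true + L @ rng.standard_normal(m)  # correlated noise

W = np.linalg.inv(Sigma)          # non-diagonal weighting term
beta, R = 1e-3, np.eye(n)         # quadratic penalty (identity roughness here)

# Penalized weighted least-squares estimate in closed form.
x_hat = np.linalg.solve(A.T @ W @ A + beta * R, A.T @ W @ y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```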

  11. [Evaluating the activity of the Italian Mental Health Services inpatient and residential facilities: the PRISM (Process Indicator System for Mental health) indicators].

    PubMed

    Picardi, Angelo; Tarolla, Emanuele; de Girolamo, Giovanni; Gigantesco, Antonella; Neri, Giovanni; Rossi, Elisabetta; Biondi, Massimo

    2014-01-01

    This article describes the activities of a project aimed at developing a system of process and process/outcome indicators suitable for monitoring over time the quality of psychiatric care in Italian inpatient and residential psychiatric facilities. This system, named PRISM (Process Indicator System for Mental health), was developed by means of a standardized evaluation made by a panel of experts and a subsequent pilot study in 17 inpatient and 13 residential psychiatric facilities. A total of 28 indicators were selected from a set of 251 candidate indicators developed by the most relevant and qualified Italian and international authorities. These indicators are derived from data in medical records and information about the characteristics of facilities, and they cover processes of care, operational equipment of facilities, staff training and working, relationships with external agencies, and sentinel events. The procedure followed for the development of the indicator system was reliable and innovative. The data collected from the pilot study suggested a favourable balance between the workload associated with regular use of the indicators in the context of daily clinical activities and the advantages related to the information gathered. In conclusion, the PRISM system provides additional information about healthcare processes beyond that gathered via routine information systems, and it might prove useful both for continuous quality improvement programs and for health services research.

  12. Ripple scalings in geothermal facilities, a key to understand the scaling process

    NASA Astrophysics Data System (ADS)

    Köhl, Bernhard; Grundy, James; Baumann, Thomas

    2017-04-01

    Scalings are a widespread problem among geothermal plants exploiting the Malm aquifer in the Bavarian Molasse Basin, affecting the technical and economic efficiency of these plants. The majority of the scalings observed at such facilities are carbonates, formed by a disruption of the lime-carbonic-acid equilibrium during production caused by degassing of CO2. These scalings are found in the production pipes, at the pumps, and at filters, and can be described well using existing hydrogeochemical models. This study proposes a second mechanism for the formation of scalings in ground-level facilities. We investigated scalings which accumulated at the inlet to the heat exchanger. Interestingly, the scalings were recovered after the ground-level facilities had been cleaned. The scalings showed distinct ripple structures, which are likely a result of solid particle deposition. From the ripple features, the flow conditions during their formation were calculated based on empirical equations (Soulsby, 2012). The calculations suggest that the deposits were formed during maintenance works. Thin-section images of the sediments indicate a two-step process: deposition of sediment grains, followed by stabilization with a calcite layer; the latter likely occurred during maintenance. To prevent this type of scaling from blocking the heat exchangers, the maintenance procedure has to be revised. Reference: Soulsby, R. L.; Whitehouse, R. J. S.; Marten, K. V.: Prediction of time-evolving sand ripples in shelf seas. Continental Shelf Research 2012, 38, 47-62.

  13. Three-dimensional reconstruction of indoor whole elements based on mobile LiDAR point cloud data

    NASA Astrophysics Data System (ADS)

    Gong, Yuejian; Mao, Wenbo; Bi, Jiantao; Ji, Wei; He, Zhanjun

    2014-11-01

    Ground-based LiDAR is one of the most effective city modeling tools at present and has been widely used for three-dimensional reconstruction of outdoor objects. For indoor objects, however, there are technical bottlenecks due to the lack of a GPS signal. In this paper, based on high-precision indoor point cloud data obtained by an advanced indoor mobile LiDAR measuring system, high-precision models were built for all indoor ancillary facilities. The point cloud data also contain a color feature, extracted by fusion with CCD images; the data thus carry both spatial geometric features and spectral information, which can be used for constructing object surfaces and restoring the color and texture of the geometric model. Based on the Autodesk CAD platform, with the help of the PointSense plug-in, three-dimensional reconstruction of indoor whole elements was realized. Specifically, Pointools Edit Pro was adopted to edit the point cloud, and different types of indoor point cloud data were processed, including data format conversion, outline extraction, and texture mapping of the point cloud model. Finally, three-dimensional visualization of the real-world indoor scene was completed. Experimental results showed that high-precision 3D point cloud data obtained by indoor mobile measuring equipment can be used for 3D reconstruction of indoor whole elements and that the methods proposed in this paper can efficiently realize such reconstruction. Moreover, the modeling precision could be controlled within 5 cm, which proved to be a satisfactory result.

  14. 75 FR 71733 - Requirements for Measurement Facilities Used for the Royalty Valuation of Processed Natural Gas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... measurement of inlet production, residue gas, fuel gas, flare gas, condensate, natural gas liquids, or any... governing gas and liquid hydrocarbon production measurement. We have recently completed the first phase of... Requirements for Measurement Facilities Used for the Royalty Valuation of Processed Natural Gas AGENCY: Bureau...

  15. A Guide for Developing Standard Operating Job Procedures for the Digestion Process Wastewater Treatment Facility. SOJP No. 10.

    ERIC Educational Resources Information Center

    Schwing, Carl M.

    This guide describes standard operating job procedures for the digestion process of wastewater treatment facilities. This process reduces the volume of sludge to be treated in subsequent units and reduces the volatile content of the sludge. The guide gives step-by-step instructions for pre-startup, startup, continuous operating, shutdown,…

  16. Mars Entry Atmospheric Data System Trajectory Reconstruction Algorithms and Flight Results

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.; Kutty, Prasad; Schoenenberger, Mark; Shidner, Jeremy; Munk, Michelle

    2013-01-01

    The Mars Entry Atmospheric Data System is part of the Mars Science Laboratory Entry, Descent, and Landing Instrumentation project. It comprises seven pressure transducers linked to ports on the entry vehicle forebody to record the pressure distribution during atmospheric entry. These measured surface pressures are used to generate estimates of atmospheric quantities based on modeled surface pressure distributions. Specifically, angle of attack, angle of sideslip, dynamic pressure, Mach number, and freestream atmospheric properties are reconstructed from the measured pressures. Such data allow the aerodynamics to be decoupled from the assumed atmospheric properties, enabling enhanced trajectory reconstruction and performance analysis as well as an aerodynamic reconstruction, which has not been possible in past Mars entry reconstructions. This paper provides details of the data processing algorithms utilized for this purpose. These include two approaches that have commonly been used in past planetary entry trajectory reconstruction, and a new approach for this application that makes use of the pressure measurements. The paper describes assessments of data quality and preprocessing, and results of the flight data reduction from atmospheric entry, which occurred on August 5th, 2012.
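A greatly simplified version of the idea: if each port's pressure is modeled as p_i = q·cos²(θ_i − α), a toy stand-in for the modeled surface pressure distribution, then angle of attack α and dynamic pressure q can be recovered from the measured port pressures by least squares. A sketch under those toy assumptions only (port angles and values are invented; this is not the MEADS forward model):

```python
import numpy as np

theta = np.deg2rad([-40.0, -20.0, 0.0, 20.0, 40.0])  # port clock angles (made up)
alpha_true, q_true = np.deg2rad(5.0), 450.0           # truth for the synthetic test
p_meas = q_true * np.cos(theta - alpha_true) ** 2     # noise-free "measurements"

def fit(theta, p):
    """Grid over alpha; for each candidate, q has a closed-form LS solution."""
    best = (np.inf, 0.0, 0.0)
    for a in np.deg2rad(np.linspace(-10, 10, 2001)):  # 0.01-degree grid
        b = np.cos(theta - a) ** 2
        q = (b @ p) / (b @ b)                         # argmin_q ||q*b - p||^2
        best = min(best, (np.sum((q * b - p) ** 2), a, q))
    return best[1], best[2]

alpha_hat, q_hat = fit(theta, p_meas)
print(np.rad2deg(alpha_hat), q_hat)  # ~5.0 deg, ~450.0
```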

  17. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum †

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches within the image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781

  18. Facility for orbital material processing

    NASA Astrophysics Data System (ADS)

    Starodubov, D.; McCormick, K.; Dellosa, M.; Erdelyi, E.; Volfson, L.

    2018-05-01

    Sustainable orbital manufacturing with commercially viable and profitable operation has tremendous potential for driving the space exploration industry and human expansion into outer space. This highly challenging task has never been accomplished before, and the currently high cost of delivering materials poses a business challenge to the value proposition of making products in space. The FOMS Inc. team identified fluoride optical fiber manufacturing in space as an opportunity that can lead to the first commercial production on orbit. To support continued cost-effective International Space Station (ISS) operations, FOMS Inc. has developed and demonstrated, for the first time, a fully operational space facility for orbital remote manufacturing with up to 50 km of fiber fabrication capability and strong commercial potential for manufacturing operations on board the ISS.

  19. Carbon nanotubes/magnetite hybrids prepared by a facile synthesis process and their magnetic properties

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Ni, Qing-Qing; Natsuki, Toshiaki; Fu, Yaqin

    2009-07-01

    In this paper, a facile synthesis process is proposed to prepare multiwalled carbon nanotube/magnetite (MWCNT/Fe3O4) hybrids. The process involves two steps: (1) water-soluble CNTs are synthesized by one-pot modification using potassium persulfate (KPS) as the oxidant; (2) Fe3O4 is assembled along the treated CNTs by a facile hydrothermal process in the presence of hydrazine hydrate as the mineralizer. The treated CNTs can be easily dispersed in aqueous solvent. Moreover, X-ray photoelectron spectroscopy (XPS) analysis reveals that several functional groups such as potassium carboxylate (-COOK), carbonyl (-C=O) and hydroxyl (-C-OH) groups are formed on the nanotube surfaces. The MWCNT/Fe3O4 hybrids are characterized with respect to crystal structure, morphology, elemental composition and magnetic properties by X-ray diffraction (XRD), transmission electron microscopy (TEM), XPS and a superconducting quantum interference device (SQUID) magnetometer. XRD and TEM results show that Fe3O4 nanoparticles with diameters in the range of 20-60 nm were firmly assembled on the nanotube surface. The magnetic property investigation indicated that the CNT/Fe3O4 hybrids exhibit ferromagnetic behavior and possess a saturation magnetization of 32.2 emu/g. Further investigation indicates that the size of the assembled Fe3O4 nanoparticles can be tuned by varying experimental factors. Finally, a probable growth mechanism for the formation of the CNT/Fe3O4 hybrids is discussed.

  20. Parallel image reconstruction for 3D positron emission tomography from incomplete 2D projection data

    NASA Astrophysics Data System (ADS)

    Guerrero, Thomas M.; Ricci, Anthony R.; Dahlbom, Magnus; Cherry, Simon R.; Hoffman, Edward T.

    1993-07-01

    The problem of excessive computational time in 3D positron emission tomography (3D PET) reconstruction is defined, and we present an approach to solving it through the construction of an inexpensive parallel processing system and the adoption of the FAVOR algorithm. Currently, the 3D reconstruction of the 610 images of a total-body procedure would require 80 hours, and the 3D reconstruction of the 620 images of a dynamic study would require 110 hours. An inexpensive parallel processing system for 3D PET reconstruction was constructed by integrating board-level products from multiple vendors. The system achieves its computational performance through the use of 6U VME four-processor i860 boards; processor boards from five manufacturers are discussed from our perspective. The new 3D PET reconstruction algorithm FAVOR (FAst VOlume Reconstructor), which promises a substantial speed improvement, is adopted. Preliminary results from parallelizing FAVOR are used to formulate architectural improvements for this problem. In summary, we address the problem of excessive computational time in 3D PET image reconstruction through the construction of an inexpensive parallel processing system and the parallelization of a 3D reconstruction algorithm that uses the incomplete data set produced by current PET systems.

  1. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, a CAD data file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.

  2. Three-dimensional reconstruction of neutron, gamma-ray, and x-ray sources using spherical harmonic decomposition

    NASA Astrophysics Data System (ADS)

    Volegov, P. L.; Danly, C. R.; Fittinghoff, D.; Geppert-Kleinrath, V.; Grim, G.; Merrill, F. E.; Wilde, C. H.

    2017-11-01

    Neutron, gamma-ray, and x-ray imaging are important diagnostic tools at the National Ignition Facility (NIF) for measuring the two-dimensional (2D) size and shape of the neutron-producing region, for probing the remaining ablator, and for measuring the extent of the DT plasma during the stagnation phase of inertial confinement fusion implosions. Due to the difficulty and expense of building these imagers, at most only a few 2D projection images will be available to reconstruct the three-dimensional (3D) sources. In this paper, we present a technique developed for the 3D reconstruction of neutron, gamma-ray, and x-ray sources from a minimal number of 2D projections using spherical harmonic decomposition. We present the detailed algorithms used for this characterization and the results of reconstructed sources from experimental neutron and x-ray data collected at OMEGA and NIF.
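The decomposition step can be illustrated in miniature: expand an angular source profile in spherical harmonics and fit the coefficients by linear least squares from sampled values. A toy sketch using only the axisymmetric (m = 0) harmonics, written out explicitly; this is not the diagnostic's actual forward model:

```python
import numpy as np

rng = np.random.default_rng(2)
phi = rng.uniform(0, np.pi, 500)   # polar-angle samples

# Axisymmetric (m = 0) real spherical harmonics, written out explicitly.
def Y0(ph): return np.full_like(ph, 0.5 / np.sqrt(np.pi))
def Y1(ph): return np.sqrt(3.0 / (4.0 * np.pi)) * np.cos(ph)
def Y2(ph): return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(ph) ** 2 - 1.0)

# Synthetic "source shape": a P2-type asymmetry on top of a monopole term.
f = 1.0 * Y0(phi) + 0.3 * Y2(phi)

# Fit the harmonic coefficients by linear least squares from the samples.
basis = np.column_stack([Y0(phi), Y1(phi), Y2(phi)])
coef, *_ = np.linalg.lstsq(basis, f, rcond=None)
print(coef)  # ~ [1.0, 0.0, 0.3]
```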

  3. EUPHORE: Research facility to study tropospheric transformation processes

    NASA Astrophysics Data System (ADS)

    Wirtz, K.

    2003-04-01

    The EUPHORE simulation chamber consists of two half-spherical Teflon bags, each with a volume of 200 m³ and a base diameter of 9.2 m. The FEP Teflon has a transmission of about 75% at 280 nm and of more than 80% above 300 nm. Purified and dried ambient air is used to fill the chamber and flush it between experiments. The humidity in the chamber is measured by a dew point hygrometer, and the temperature is monitored by several thermocouples located at different positions inside the chamber. The solar flux is monitored with spectral resolution in the photochemically active spectral region. The simulation chamber is equipped with a number of analytical instruments for the measurement of single VOC components, NO, NO₂, O₃ and other species. In-situ measurements in the ppb range are performed using long-path absorption spectroscopy, in the UV/VIS by DOAS and in the IR by FT-IR. A GC-MS system is used for the sensitive analysis of a variety of reaction products. A newly installed LIF technique allows the in situ measurement of OH and HO₂ radicals during the reaction processes. The technological concept and the organisational structure of the EUPHORE facility will be presented. The integration of quality control measures is an obvious and necessary second step for the successful exploitation of the technically advanced outdoor smog chamber EUPHORE as a research tool. This will underline the leadership of the European scientific community in the important research areas of investigating transformation processes in the troposphere and tracking the influence of human activities on photooxidant formation and its interaction with processes related to global change. In the coming years the main scientific focus will be on testing chemical mechanisms in order to improve the models which describe the atmospheric processes of complex chemical systems. The collaborative work at the EUPHORE outdoor simulation chamber will provide all the users of the installation with a basic

  4. The emergence of care facilities in Thailand for older German-speaking people: structural backgrounds and facility operators as transnational actors.

    PubMed

    Bender, Désirée; Hollstein, Tina; Schweppe, Cornelia

    2017-12-01

    This paper presents findings from an ethnographic study of old age care facilities for German-speaking people in Thailand. It analyses the conditions and processes behind the development and specific designs of such facilities. It first looks at the intertwinement, at the socio-structural level, of different transborder developments in which the facilities' emergence is embedded. Second, it analyses the processes that accompany the emergence, development and organisation of these facilities at the local level. In this regard, it points out the central role of the facility operators as transnational actors who mediate between different frames of reference and groups of actors involved in these facilities. It concludes that the processes of mediation and intertwining are an important and distinctive feature of the emergence of these facilities, necessitated by the fact that, although the facilities are located in Thailand, their 'markets' are in the German-speaking countries of their target groups.

  5. Generalized Fourier slice theorem for cone-beam image reconstruction.

    PubMed

    Zhao, Shuang-Ren; Jiang, Dazong; Yang, Kevin; Yang, Kang

    2015-01-01

    Cone-beam reconstruction theory was developed by Kirillov in 1961, Tuy in 1983, Feldkamp in 1984, Smith in 1985, and Pierre Grangeat in 1990. The Fourier slice theorem was proposed by Bracewell in 1956 and leads to the Fourier image reconstruction method for parallel-beam geometry; it was extended to fan-beam geometry by Zhao in 1993 and 1995. By combining the cone-beam image reconstruction theory and the fan-beam Fourier slice theory mentioned above, a Fourier slice theorem for cone-beam geometry was proposed by Zhao in 1995 in a short conference publication. This article offers the details of the derivation and implementation of this Fourier slice theorem for cone-beam geometry. In particular, the problem of reconstruction from the Fourier domain has been overcome, namely that the value at the origin of Fourier space is of the indeterminate form 0/0; this 0/0-type limit is properly handled. As examples, implementation results for the single-circle and two-perpendicular-circle source orbits are shown. If an interpolation process is included in the cone-beam reconstruction, the number of calculations for the generalized Fourier slice theorem algorithm is O(N^4), close to that of the filtered back-projection method, where N is the one-dimensional image size. However, the interpolation process can be avoided, in which case the number of calculations is O(N^5).
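The parallel-beam Fourier slice theorem itself is easy to verify numerically: the 1D Fourier transform of a projection equals the central slice of the object's 2D Fourier transform. A minimal numpy check for a 0° projection, where the central slice is simply the zero-frequency row of the 2D FFT:

```python
import numpy as np

rng = np.random.default_rng(3)
img = rng.random((64, 64))          # arbitrary 2D "object"

projection = img.sum(axis=0)        # parallel-beam projection at 0 degrees
slice_1d = np.fft.fft(projection)   # 1D Fourier transform of the projection

F2 = np.fft.fft2(img)               # 2D Fourier transform of the object
central_slice = F2[0, :]            # zero-frequency row = central slice at 0 deg

print(np.allclose(slice_1d, central_slice))  # True
```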

  6. The use of multimedia as an adjunct to the informed consent process for ankle ligament reconstruction surgery.

    PubMed

    Batuyong, Eldridge; Birks, Christopher; Beischer, Andrew D

    2012-06-01

    Obtaining "informed consent" is an integral aspect of surgery that can be fraught with difficulty. This study assessed the efficacy of a multimedia education tool in improving patients' understanding when used as an adjunct to the traditional verbal consent process for ankle lateral ligament reconstruction surgery. A total of 56 patients (28 males and 28 females) were recruited, with a mean age of 36 years. A standardized verbal discussion regarding surgical treatment was provided to each patient, and understanding was then assessed using a knowledge questionnaire. Subsequently, each patient viewed a multimedia educational program, after which the knowledge questionnaire was repeated. Additional supplementary questions were then given regarding ease of understanding and satisfaction with the two methods of education delivery. The patients answered 75% of the questions correctly before the multimedia module compared with 88% after it (P < .001). Patients rated the ease of understanding and the amount of information provided by the module highly (9.5 cm and 9.0 cm on a 10-cm Visual Analogue Scale, respectively), and 61% of patients considered that the multimedia tool performed as well as the treating surgeon. Multimedia tools used in sequence after verbal consent resulted in improved patient understanding of pertinent information regarding ankle lateral ligament reconstruction surgery. Therapeutic Level II.

  7. Statistical reconstruction for cosmic ray muon tomography.

    PubMed

    Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J

    2007-08-01

    Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm² per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictate differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and to events departing from the statistical model.
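The ML/EM update borrowed from emission tomography has the familiar multiplicative form x ← (x / Aᵀ1) · Aᵀ(y / Ax). A minimal sketch on synthetic noise-free data (the muon technique replaces this emission statistical model with one for multiple scattering, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 80, 30

A = rng.random((m, n))          # nonnegative system matrix (stand-in)
x_true = rng.random(n) + 0.1
y = A @ x_true                  # noise-free "measured" counts

x = np.ones(n)                  # nonnegative initial image
sens = A.T @ np.ones(m)         # sensitivity image A^T 1

err0 = np.linalg.norm(A @ x - y)
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / sens   # MLEM multiplicative update

err = np.linalg.norm(A @ x - y)
print(err < err0)  # True: the data fit improves (likelihood increases monotonically)
```

The multiplicative form keeps the image nonnegative at every iteration, a property MLEM shares with its emission-tomography origins.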

  8. Exploiting Mirrors in 3d Reconstruction of Small Artefacts

    NASA Astrophysics Data System (ADS)

    Kontogianni, G.; Thomaidis, A. T.; Chliverou, R.; Georgopoulos, A.

    2018-05-01

    3D reconstruction of small artefacts is very important in order to capture the details of the whole object, irrespective of the documentation method used (range-based or image-based). Sometimes this is very difficult to achieve because of hidden parts, occlusions, and obstructions of the object. Hence, more data are necessary to digitise the whole artefact in 3D, increasing the time needed for collecting and consequently processing the data. A methodology is therefore needed to reduce data collection and processing time, especially in cases of mass digitisation. In this paper, the use of mirrors, in particular high-quality mirrors, in the data acquisition phase for the 3D reconstruction of small artefacts is investigated. Two case studies of 3D reconstruction are presented: the first concerns range-based modelling, utilising a time-of-flight laser scanner, while in the second an image-based modelling technique is implemented.

  9. The Reconstruction and Failure Analysis of the Space Shuttle Columbia

    NASA Technical Reports Server (NTRS)

    Russell, Richard; Mayeaux, Brian; McDanels, Steven; Piascik, Robert; Shah, Sandeep; Jerman, Greg; Collins, Thomas; Woodworth, Warren

    2009-01-01

    Several days after the Columbia accident, a team formed and began planning for the reconstruction of Columbia. A hangar at the Kennedy Space Center was selected for this effort due to its size, its available technical workforce and materials science laboratories, and its access to the vehicle ground processing infrastructure. The reconstruction team established processes for receiving, handling, decontaminating, tracking, identifying, cleaning, and assessing the debris. Initially, a 2-dimensional reconstruction of the Orbiter outer mold line was developed. As the investigation progressed, fixtures were developed that allowed a 3-dimensional reconstruction of the forward portions of the left wing's leading edge. To support the reconstructions and forensic analyses, a Materials and Processes (M&P) team was formed. This M&P team established processes for recording factual observations, debris cleaning, and engineering analysis. Fracture surfaces and thermal effects of selected airframe debris were assessed, and process flows for both nondestructive and destructive sampling and evaluation of debris were developed. The team also assessed left-hand airframe components believed to be associated with a structural breach of Columbia. A major portion of this analysis was the evaluation of metallic deposits that were prevalent on left wing leading edge components. Extensive evaluation of the visual, metallurgical, and chemical nature of the deposits provided conclusions consistent with the visual assessments and interpretations of the NASA lead teams and the findings of the Columbia Accident Investigation Board. Analytical data collected by the M&P team showed that a significant thermal event occurred at the left wing leading edge in the proximity of LH RCC Panels 8-9, and a correlation was formed between the deposits and overheating in these areas of the wing leading edge components. The analysis of deposits also showed exposure to temperatures in excess of 1649 C

  10. Advanced Distributed Measurements and Data Processing at the Vibro-Acoustic Test Facility, GRC Space Power Facility, Sandusky, Ohio - an Architecture and an Example

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.; Evans, Richard K.

    2009-01-01

    A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design that utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.

  11. Orbital reconstruction in the dog, cat, and horse.

    PubMed

    Wallin-Håkansson, Nils; Berggren, Karin

    2017-07-01

    To describe an adaptable method for reconstruction of the orbit following partial orbitectomy. One horse, one cat, and four dogs. Following partial orbitectomy for removal of bone and soft tissue affected by pathologic processes, reconstruction was achieved. Cerclage wires were used to reconstitute the orbital rim and other salient facial contours involved in excisions. These wires were then covered with a prolene mesh, first inside the orbit and then outwards over the affected extraorbital areas. Thereafter, a collagen sheet was placed over the mesh. Finally, subcutis and skin were closed over the construct. All operated eyes remained visual with normal position, direction, and mobility. Eyelid function, tear production, and nasolacrimal function were preserved. Side effects were mild and temporary, but animals requiring a lateral-posterior surgical approach experienced concavity to the side of the head posterior to the orbital ligament region. One bone tumor out of three recurred. The reconstruction method presented offers excellent results tectonically, cosmetically, and functionally, even following extensive orbitectomy. By adapted application of three reconstruction steps using readily available materials, large defects may be surgically repaired. Once orbitectomy is mastered, reconstruction requires no additional specialized techniques or equipment. © 2016 American College of Veterinary Ophthalmologists.

  12. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
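    The coarse-to-fine idea behind a multigrid inversion can be sketched in a few lines: solve the problem on a downsampled grid first, then use the upsampled result as the initial guess at the next finer resolution, so that far fewer iterations are needed at full resolution. The sketch below uses a trivial placeholder solver (gradient descent on a quadratic misfit) rather than the authors' photoacoustic forward model, which is an assumption for illustration only.

    ```python
    import numpy as np

    def solve_at_resolution(target, x0, n_iter=50, step=0.1):
        """Placeholder single-resolution solver: gradient descent on ||x - target||^2.
        In the real application this would be the fluence/inverse-problem solver."""
        x = x0.copy()
        for _ in range(n_iter):
            x -= step * 2.0 * (x - target)
        return x

    def multigrid_inversion(y, levels=3):
        """Coarse-to-fine: solve on a coarsened grid, upsample as the next initial guess."""
        pyramid = [y]
        for _ in range(levels - 1):                 # factor-of-2 averaging
            c = pyramid[-1]
            pyramid.append(0.5 * (c[::2] + c[1::2]))
        x = np.zeros_like(pyramid[-1])              # start at the coarsest level
        for level in reversed(range(levels)):
            x = solve_at_resolution(pyramid[level], x)
            if level > 0:                           # upsample as next initial guess
                x = np.repeat(x, 2)
        return x

    y = np.linspace(0.0, 1.0, 16)
    x = multigrid_inversion(y)
    ```

    The fixed-point structure of each level mirrors the per-resolution iteration scheme described in the abstract; only the solver and the averaging/upsampling operators are stand-ins.
    
    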

  13. Fact Sheet - Final Air Toxics Rule for Steel Pickling and HCI Process Facilities and Hydrochloric Acid Regeneration Plants

    EPA Pesticide Factsheets

    Fact Sheet summarizing the main points of the national emissions standard for hazardous air pollutants (NESHAP) for Steel Pickling - HCl Process Facilities and Hydrochloric Acid Regeneration Plants as promulgated on June 22, 1999.

  14. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., CAD object model) in order to evaluate quantitatively the object. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze in particular the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables assessment of the quality of the 3D reconstruction, as illustrated by the shown experimental results.
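    The step of deriving confidence intervals from the fit of a geometric feature can be illustrated with a least-squares line fit to noisy edge points, where the residual variance yields standard errors on the line parameters. This is a generic sketch of that idea, not the paper's exact error model; the data here are synthetic.

    ```python
    import numpy as np

    def fit_line_with_uncertainty(x, y):
        """Least-squares fit y = a*x + b; parameter standard errors are
        estimated from the fit residuals, the quantity one would use to
        build confidence intervals for the segmentation step."""
        A = np.column_stack([x, np.ones_like(x)])
        coef, residuals, _, _ = np.linalg.lstsq(A, y, rcond=None)
        n, p = len(x), 2
        sigma2 = residuals[0] / (n - p)          # residual variance
        cov = sigma2 * np.linalg.inv(A.T @ A)    # parameter covariance
        return coef, np.sqrt(np.diag(cov))

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 100)
    y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)   # noisy "edge points"
    coef, stderr = fit_line_with_uncertainty(x, y)
    ```

    Propagating such per-feature covariances through the calibration, matching, and triangulation steps is what yields the final position uncertainty discussed above.
    
    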

  15. Parallel ptychographic reconstruction

    DOE PAGES

    Nashed, Youssef S. G.; Vine, David J.; Peterka, Tom; ...

    2014-12-19

    Ptychography is an imaging method whereby a coherent beam is scanned across an object, and an image is obtained by iterative phasing of the set of diffraction patterns. It can be used to image extended objects at a resolution limited by the scattering strength of the object and detector geometry, rather than at an optics-imposed limit. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes, yet at the same time there is also a need to deliver reconstructed images immediately so that one can evaluate the next steps to take in an experiment. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs) and then employs novel techniques to merge sub-datasets into a single complex phase and amplitude image. Results are shown on a simulated specimen and a real dataset from an X-ray experiment conducted at a synchrotron light source.

  16. Fluorescence molecular tomography reconstruction via discrete cosine transform-based regularization

    NASA Astrophysics Data System (ADS)

    Shi, Junwei; Liu, Fei; Zhang, Jiulou; Luo, Jianwen; Bai, Jing

    2015-05-01

    Fluorescence molecular tomography (FMT) as a noninvasive imaging modality has been widely used for biomedical preclinical applications. However, FMT reconstruction suffers from severe ill-posedness, especially when a limited number of projections are used. In order to improve the quality of FMT reconstruction results, a discrete cosine transform (DCT) based reweighted L1-norm regularization algorithm is proposed. In each iteration of the reconstruction process, different reweighted regularization parameters are adaptively assigned according to the values of DCT coefficients to suppress the reconstruction noise. In addition, the permission region of the reconstructed fluorophores is adaptively constructed to increase the convergence speed. In order to evaluate the performance of the proposed algorithm, physical phantom and in vivo mouse experiments with a limited number of projections are carried out. For comparison, different L1-norm regularization strategies are employed. By quantifying the signal-to-noise ratio (SNR) of the reconstruction results in the phantom and in vivo mouse experiments with four projections, the proposed DCT-based reweighted L1-norm regularization shows higher SNR than other L1-norm regularizations employed in this work.
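    The core of a reweighted L1 scheme in a DCT domain can be sketched as follows: transform to DCT coefficients, soft-threshold each coefficient with its own adaptive weight, transform back, and update the weights from the new coefficient magnitudes so that small (noise-like) coefficients are penalized harder on the next pass. This is a generic sketch of DCT-based reweighted L1, not the authors' exact FMT update; the 1/(|c| + eps) weighting rule and the test signal are assumptions for illustration.

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II matrix (so M @ M.T = I)."""
        k = np.arange(n)[:, None]
        i = np.arange(n)[None, :]
        M = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
        M[0] /= np.sqrt(2.0)
        return M

    def reweighted_l1_dct_step(x, weights, M, threshold=0.1):
        """One reweighting iteration in the DCT domain."""
        c = M @ x
        # Soft-threshold with per-coefficient adaptive weights.
        c = np.sign(c) * np.maximum(np.abs(c) - threshold * weights, 0.0)
        # Small coefficients get larger weights (stronger suppression) next time.
        weights = 1.0 / (np.abs(c) + 1e-3)
        return M.T @ c, weights

    n = 64
    M = dct_matrix(n)
    x = np.cos(np.linspace(0.0, np.pi, n)) + 0.05 * np.random.default_rng(1).normal(size=n)
    w = np.ones(n)
    for _ in range(3):
        x, w = reweighted_l1_dct_step(x, w, M)
    ```

    After a few passes the noise coefficients are suppressed while the dominant DCT component of the signal survives, which is the noise-suppression mechanism the abstract describes.
    
    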

  17. The PRISM (Pliocene Palaeoclimate) reconstruction: Time for a paradigm shift

    USGS Publications Warehouse

    Dowsett, Harry J.; Robinson, Marci M.; Stoll, Danielle K.; Foley, Kevin M.; Johnson, Andrew L. A.; Williams, Mark; Riesselman, Christina

    2013-01-01

    Global palaeoclimate reconstructions have been invaluable to our understanding of the causes and effects of climate change, but single-temperature representations of the oceanic mixed layer for data–model comparisons are outdated, and the time for a paradigm shift in marine palaeoclimate reconstruction is overdue. The new paradigm in marine palaeoclimate reconstruction stems the loss of valuable climate information and instead presents a holistic and nuanced interpretation of multi-dimensional oceanographic processes and responses. A wealth of environmental information is hidden within the US Geological Survey's Pliocene Research, Interpretation and Synoptic Mapping (PRISM) marine palaeoclimate reconstruction, and we introduce here a plan to incorporate all valuable climate data into the next generation of PRISM products. Beyond the global approach and focus, we plan to incorporate regional climate dynamics with emphasis on processes, integrating multiple environmental proxies wherever available in order to better characterize the mixed layer, and developing a finer time slice within the Mid-Piacenzian Age of the Pliocene, complemented by underused proxies that offer snapshots into environmental conditions. The result will be a proxy-rich, temporally nested, process-oriented approach in a digital format - a relational database with geographic information system capabilities comprising a three-dimensional grid representing the surface layer, with a plethora of data in each cell.

  18. Higher order total variation regularization for EIT reconstruction.

    PubMed

    Gong, Bo; Schullcke, Benjamin; Krueger-Ziolek, Sabine; Zhang, Fan; Mueller-Lisse, Ullrich; Moeller, Knut

    2018-01-08

    Electrical impedance tomography (EIT) attempts to reveal the conductivity distribution of a domain based on the electrical boundary condition. This is an ill-posed inverse problem; its solution is very unstable. Total variation (TV) regularization is one of the techniques commonly employed to stabilize reconstructions. However, it is well known that TV regularization induces staircase effects, which are not realistic in clinical applications. To reduce such artifacts, modified TV regularization terms considering a higher order differential operator were developed in several previous studies. One of them is called total generalized variation (TGV) regularization. TGV regularization has been successfully applied in image processing in a regular grid context. In this study, we adapted TGV regularization to the finite element model (FEM) framework for EIT reconstruction. Reconstructions using simulation and clinical data were performed. First results indicate that, in comparison to TV regularization, TGV regularization promotes more realistic images. Graphical abstract: Reconstructed conductivity changes located on selected vertical lines. For each of the reconstructed images as well as the ground truth image, conductivity changes located along the selected left and right vertical lines are plotted. In these plots, the notation GT in the legend stands for ground truth, TV stands for total variation method, and TGV stands for total generalized variation method. Reconstructed conductivity distributions from the GREIT algorithm are also demonstrated.

  19. Fast Dictionary-Based Reconstruction for Diffusion Spectrum Imaging

    PubMed Central

    Bilgic, Berkin; Chatnuntawech, Itthi; Setsompop, Kawin; Cauley, Stephen F.; Yendiki, Anastasia; Wald, Lawrence L.; Adalsteinsson, Elfar

    2015-01-01

    Diffusion Spectrum Imaging (DSI) reveals detailed local diffusion properties at the expense of substantially long imaging times. It is possible to accelerate acquisition by undersampling in q-space, followed by image reconstruction that exploits prior knowledge on the diffusion probability density functions (pdfs). Previously proposed methods impose this prior in the form of sparsity under wavelet and total variation (TV) transforms, or under adaptive dictionaries that are trained on example datasets to maximize the sparsity of the representation. These compressed sensing (CS) methods require full-brain processing times on the order of hours using Matlab running on a workstation. This work presents two dictionary-based reconstruction techniques that use analytical solutions, and are two orders of magnitude faster than the previously proposed dictionary-based CS approach. The first method generates a dictionary from the training data using Principal Component Analysis (PCA), and performs the reconstruction in the PCA space. The second proposed method applies reconstruction using pseudoinverse with Tikhonov regularization with respect to a dictionary. This dictionary can either be obtained using the K-SVD algorithm, or it can simply be the training dataset of pdfs without any training. All of the proposed methods achieve reconstruction times on the order of seconds per imaging slice, and have reconstruction quality comparable to that of the dictionary-based CS algorithm. PMID:23846466
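    The two ingredients named above, a PCA dictionary learned from training data and a closed-form Tikhonov-regularized pseudoinverse, can be sketched together on toy data. Everything here (the random "training pdfs", the every-4th-sample undersampling mask, the dimensions) is an assumption for illustration, not the paper's acquisition scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # "Training pdfs" (rows = examples); PCA basis via SVD of the centered data.
    train = rng.normal(size=(200, 64))
    mean = train.mean(axis=0)
    _, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
    D = Vt[:10].T                       # 64 x 10 dictionary of principal components

    # Undersampled measurement y = S @ x: keep every 4th q-space sample.
    S = np.eye(64)[::4]                 # 16 x 64 sampling matrix
    x_true = mean + D @ rng.normal(size=10)   # signal that lies in the PCA model
    y = S @ x_true

    # Closed-form Tikhonov solution for the dictionary coefficients:
    # a = argmin ||S D a - (y - S mean)||^2 + lam ||a||^2
    A = S @ D
    lam = 1e-3
    a = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ (y - S @ mean))
    x_rec = mean + D @ a
    ```

    Because the coefficient solve is a single small linear system per signal, this kind of analytical reconstruction is what makes per-slice times of seconds plausible compared with iterative CS solvers.
    
    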

  20. Strategic facility planning improves capital decision making.

    PubMed

    Reeve, J R

    2001-03-01

    A large, Midwestern IDS undertook a strategic facility-planning process to evaluate its facility portfolio and determine how best to allocate future investments in facility development. The IDS assembled a facility-planning team, which initiated the planning process with a market analysis to determine future market demands and identify service areas that warranted facility expansion. The team then analyzed each of the IDS's facilities from the perspective of uniform capacity measurements, highest and best use compared with needs, building condition and investment-worthiness, and facility growth and site development opportunities. Based on results of the analysis, the strategy adopted entailed, in part, shifting some space from inpatient care to ambulatory care services and demolishing and replacing the 11 percent of facilities deemed to be in the worst condition.

  1. Development of a GNSS water vapour tomography system using algebraic reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Bender, Michael; Dick, Galina; Ge, Maorong; Deng, Zhiguo; Wickert, Jens; Kahle, Hans-Gert; Raabe, Armin; Tetzlaff, Gerd

    2011-05-01

    A GNSS water vapour tomography system developed to reconstruct spatially resolved humidity fields in the troposphere is described. The tomography system was designed to process the slant path delays of about 270 German GNSS stations in near real-time with a temporal resolution of 30 min, a horizontal resolution of 40 km and a vertical resolution of 500 m or better. After a short introduction to the GPS slant delay processing, the framework of the GNSS tomography is described in detail. Different implementations of the iterative algebraic reconstruction techniques (ART) used to invert the linear inverse problem are discussed. It was found that the multiplicative techniques (MART) provide the best results with the least processing time, i.e., a tomographic reconstruction of about 26,000 slant delays on an 8280-cell grid can be obtained in less than 10 min. Different iterative reconstruction techniques are compared with respect to their convergence behaviour and some numerical parameters. The inversion can be considerably stabilized by using additional non-GNSS observations and implementing various constraints. Different strategies for initialising the tomography and utilizing extra information are discussed. Finally, an example of a reconstructed field of the wet refractivity is presented and compared to the corresponding distribution of the integrated water vapour, an analysis of a numerical weather model (COSMO-DE) and some radiosonde profiles.
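    The MART update singled out above can be sketched on a toy grid: for each slant observation, every cell on that ray is scaled by a power of the ratio of observed to predicted delay. This is the generic multiplicative ART scheme, not the paper's operational implementation; the 2x2 grid and ray geometry are invented for illustration.

    ```python
    import numpy as np

    def mart(A, y, n_sweeps=20, relax=1.0):
        """Multiplicative ART: for each observation i, scale the cells along
        that ray by (observed/predicted) raised to a relaxation exponent."""
        x = np.ones(A.shape[1])                   # strictly positive start
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                pred = A[i] @ x
                if pred > 0 and y[i] > 0:
                    x *= (y[i] / pred) ** (relax * A[i] / A[i].max())
        return x

    # Toy 2x2 "grid" seen by 4 rays (2 rows and 2 columns of cells).
    A = np.array([[1.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0, 1.0]])
    x_true = np.array([1.0, 2.0, 3.0, 4.0])
    y = A @ x_true                    # simulated slant observations
    x_rec = mart(A, y)
    ```

    Note that with this ray geometry the system is underdetermined, so MART converges to a solution consistent with all observations rather than necessarily to `x_true`; this is exactly why the paper stresses stabilizing the inversion with constraints and non-GNSS observations.
    
    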

  2. Evaluating Fidelity to a Modified NIATx Process Improvement Strategy for Improving HIV Services in Correctional Facilities.

    PubMed

    Pankow, Jennifer; Willett, Jennifer; Yang, Yang; Swan, Holly; Dembo, Richard; Burdon, William M; Patterson, Yvonne; Pearson, Frank S; Belenko, Steven; Frisman, Linda K

    2018-04-01

    In a study aimed at improving the quality of HIV services for inmates, an organizational process improvement strategy using change teams was tested in 14 correctional facilities in 8 US states and Puerto Rico. Data to examine fidelity to the process improvement strategy consisted of quantitative ratings of the structural and process components of the strategy and qualitative notes that explicate challenges in maintaining fidelity to the strategy. Fidelity challenges included (1) lack of communication and leadership within change teams, (2) instability in team membership, and (3) issues with data utilization in decision-making to implement improvements to services delivery.

  3. Phased Demolition of an Occupied Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brede, Lawrence M.; Lauterbach, Merl J.; Witt, Brandon W.

    2008-01-15

    The U.S. government constructed the K-1401 facility in the late 1940s as a support building for various projects supporting the uranium gaseous diffusion process. In 2004 the U.S. Department of Energy authorized Bechtel Jacobs Company, LLC (BJC) to decontaminate and demolish the facility. The K-1401 facility was used for a variety of industrial purposes supporting the gaseous diffusion process. Many different substances were used to support these processes over the years, and as a result different parts of the facility were contaminated with fluorine, chlorine trifluoride, uranium and technetium radiological contamination, asbestos, and mercury. The total facility area is 46,015 m² (495,000 sf), including a 6,800 m² (73,200 sf) basement. In addition to the contamination areas in the facility, a large portion was leased to businesses for re-industrialization when the D and D activities began. The work scope associated with the facility included purging and steam cleaning the former fluorine and chlorine trifluoride systems, decontaminating loose radiologically contaminated and mercury spill areas, dismantling former radiological lines contaminated with uranium oxide compounds and technetium, abating all asbestos-containing material, and demolishing the facility. These various situations contributed to the challenge of successfully conducting D and D tasks on the facility. The normal approach of decontaminating the facility of hazardous materials and then conducting demolition in series would have required a project schedule of five years, which was not cost effective in its use of the work force, demolition equipment, and waste hauling trucks. The entire project was planned with continuous demolition as the goal end state. As a result, the first activities, Phase 1, required to prepare sections for demolition, including steam cleaning fluorine and chlorine trifluoride process lines in the basement and facility asbestos abatement, were

  4. Bessel Fourier Orientation Reconstruction (BFOR): An Analytical Diffusion Propagator Reconstruction for Hybrid Diffusion Imaging and Computation of q-Space Indices

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Wu, Yu-Chien; Alexander, Andrew L.

    2012-01-01

    The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents. The EAP can thus provide richer information about complex tissue microstructure properties than the orientation distribution function (ODF), an angular feature of the EAP. Recently, several analytical EAP reconstruction schemes for multiple q-shell acquisitions have been proposed, such as diffusion propagator imaging (DPI) and spherical polar Fourier imaging (SPFI). In this study, a new analytical EAP reconstruction method is proposed, called Bessel Fourier orientation reconstruction (BFOR), whose solution is based on heat equation estimation of the diffusion signal for each shell acquisition, and is validated on both synthetic and real datasets. A significant portion of the paper is dedicated to comparing BFOR, SPFI, and DPI using hybrid, non-Cartesian sampling for multiple b-value acquisitions. Ways to mitigate the effects of Gibbs ringing on EAP reconstruction are also explored. In addition to analytical EAP reconstruction, the aforementioned modeling bases can be used to obtain rotationally invariant q-space indices of potential clinical value, an avenue which has not yet been thoroughly explored. Three such measures are computed: zero-displacement probability (Po), mean squared displacement (MSD), and generalized fractional anisotropy (GFA). PMID:22963853

  5. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    PubMed Central

    Pereira, N F; Sitek, A

    2011-01-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated. PMID:20736496
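    The maximum likelihood expectation maximization (MLEM) algorithm used for both the mesh-based and regular-grid reconstructions has a compact multiplicative update. The sketch below shows that update on a tiny dense system; the 3x3 matrix and noiseless data are assumptions for illustration, not the paper's tetrahedral system matrix.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=500):
        """MLEM for emission tomography:
        x <- x * A^T(y / (A x)) / (A^T 1), elementwise, keeping x nonnegative."""
        x = np.ones(A.shape[1])
        sens = A.sum(axis=0)                        # sensitivity image A^T 1
        for _ in range(n_iter):
            ratio = y / np.maximum(A @ x, 1e-12)    # measured / predicted projections
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
        return x

    # Tiny noiseless example; for consistent data the updates drive A @ x toward y.
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 2.0]])
    x_true = np.array([1.0, 0.5, 2.0])
    x_rec = mlem(A, A @ x_true)
    ```

    The same update applies unchanged whether the basis functions are voxels or tetrahedral mesh elements; only the system matrix `A` differs, which is why the point-placement strategy dominates the comparison in the study.
    
    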

  6. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    NASA Astrophysics Data System (ADS)

    Pereira, N. F.; Sitek, A.

    2010-09-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  7. Hydrogen Production in Radioactive Solutions in the Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CRAWFORD, CHARLES L.

    2004-05-26

    In the radioactive slurries and solutions to be processed in the Defense Waste Processing Facility (DWPF), hydrogen will be produced continuously by radiolysis. This production results from alpha, beta, and gamma rays from decay of radionuclides in the slurries and solutions interacting with the water. More than 1000 research reports have published data concerning this radiolytic production. The results of these studies have been reviewed in a comprehensive monograph. Information about radiolytic hydrogen production from the different process tanks is necessary to determine the air purge rates required to prevent flammable mixtures from accumulating in the vapor spaces above these tanks. Radiolytic hydrogen production rates are usually presented in terms of G values, or molecules of hydrogen produced per 100 eV of radioactive decay energy absorbed by the slurry or solution. With the G value for hydrogen production, G(H2), for a particular slurry and the concentrations of radioactive species in that slurry, the rate of H2 production for that slurry can be calculated. An earlier investigation estimated that the maximum rate at which hydrogen could be produced from the sludge slurry stream to the DWPF corresponds to a G value of 0.45 molecules per 100 eV of radioactive decay energy absorbed by the slurry.
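    The calculation described above, converting a G value and an absorbed decay power into a hydrogen production rate, is a short unit conversion. The G value of 0.45 molecules per 100 eV comes from the abstract; the 100 W absorbed power is an invented figure for illustration, not a DWPF safety-basis value.

    ```python
    # Radiolytic H2 generation rate from a G value (molecules per 100 eV absorbed).
    AVOGADRO = 6.022e23            # molecules/mol
    EV_PER_JOULE = 1.0 / 1.602e-19 # eV deposited per joule (i.e., per watt-second)

    def h2_rate_mol_per_s(g_value, absorbed_power_watts):
        """g_value: molecules H2 per 100 eV absorbed.
        absorbed_power_watts: decay energy absorbed by the slurry, in W."""
        molecules_per_s = (g_value / 100.0) * absorbed_power_watts * EV_PER_JOULE
        return molecules_per_s / AVOGADRO

    # Bounding G = 0.45 molecules/100 eV and an assumed 100 W of absorbed power:
    rate = h2_rate_mol_per_s(0.45, 100.0)   # roughly 5e-6 mol H2 per second
    ```

    Multiplying such a rate by the molar volume of hydrogen gives the volumetric generation rate that sizes the air purge for the tank vapor space.
    
    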

  8. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in
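    The three-stage execution-time model described above (data transfer, queue wait, computation) can be sketched as a simple additive cost function used to rank candidate sites. The parameter names and all numbers below are illustrative assumptions, not values from the paper.

    ```python
    def workflow_time(data_gb, bandwidth_gbps, queue_wait_s, n_slices, s_per_slice, n_nodes):
        """Additive three-stage model: transfer + queue wait + (parallel) compute."""
        transfer = data_gb * 8.0 / bandwidth_gbps       # seconds to move the dataset
        compute = n_slices * s_per_slice / n_nodes      # reconstruction, assumed perfectly parallel
        return transfer + queue_wait_s + compute

    # Two hypothetical sites for the same 500 GB job:
    # A has a fast network but a busy queue; B has a slow network but an idle queue.
    site_a = workflow_time(500, 10.0, 3600.0, 2048, 30.0, 128)
    site_b = workflow_time(500, 1.0, 60.0, 2048, 30.0, 128)
    best = min(site_a, site_b)
    ```

    Estimating each stage per resource and picking the minimum total is the kind of optimum-resource selection the experimental evaluation refers to; a real model would also have to estimate queue wait and throughput from monitoring data rather than treat them as constants.
    
    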

  9. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can

  10. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in

  11. Low Gravity Freefall Facilities

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Composite of Marshall Space Flight Center's Low-Gravity Free Fall Facilities. These facilities include a 100-meter drop tower and a 100-meter drop tube. The drop tower simulates in-flight microgravity conditions for up to 4.2 seconds for containerless processing experiments, immiscible fluids and materials research, pre-flight hardware design test and flight experiment simulation. The drop tube simulates in-flight microgravity conditions for up to 4.6 seconds and is used extensively for ground-based microgravity convection research in which extremely small samples are studied. The facility can provide deep undercooling for containerless processing experiments that require materials to remain in a liquid phase when cooled below the normal solidification temperature.

  12. Workflows and the Role of Images for Virtual 3d Reconstruction of no Longer Extant Historic Objects

    NASA Astrophysics Data System (ADS)

    Münster, S.

    2013-07-01

    3D reconstruction technologies have gained importance as tools for the research and visualization of no longer extant historic objects during the last decade. Within such reconstruction processes, visual media assume several important roles: as the most important sources, especially for the reconstruction of no longer extant objects; as a tool for communication and cooperation within the production process; and as a means of communicating and visualizing results. While there are many discourses about theoretical issues of depiction as sources and as visualization outcomes of such projects, there is no systematic, empirically grounded research on the importance of depiction during the 3D reconstruction process. Moreover, from a methodological perspective, it would be necessary to understand which roles visual media play during the production process and how these are affected by disciplinary boundaries and challenges specific to historic topics. The research includes an analysis of published work and case studies investigating reconstruction projects. This study uses methods taken from the social sciences to gain a grounded view of how production processes take place in practice and which functions and roles images play within them. For the investigation of these topics, a content analysis of 452 conference proceedings and journal articles related to 3D reconstruction modeling in the field of humanities has been completed. Most of the projects described in those publications dealt with data acquisition and model building for existing objects. Only a small number of projects focused on structures that no longer or never existed physically. Especially that type of project seems to be interesting for a study of the importance of pictures as sources and as tools for interdisciplinary cooperation during the production process. In the course of the examination the authors of this paper applied a qualitative content analysis for a sample of 26 previously

  13. The Paris to Lexington Road Reconstruction Project.

    DOT National Transportation Integrated Search

    2001-09-01

    This report summarizes the effort to provide the Kentucky Transportation Cabinet with an evaluation of the results obtained for the Paris to Lexington Road Reconstruction Project from 1997 to 2001. A unique pre-qualification process was used for the ...

  14. Ifcwall Reconstruction from Unstructured Point Clouds

    NASA Astrophysics Data System (ADS)

    Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.

    2018-05-01

    The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry, as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments show that the method is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the use of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.

  15. Virtual Surgical Planning for Inferior Alveolar Nerve Reconstruction.

    PubMed

    Miloro, Michael; Markiewicz, Michael R

    2017-11-01

    The purpose of this study was to assess the outcomes after preoperative virtual surgical planning (VSP) for inferior alveolar nerve (IAN) reconstruction in ablative mandibular surgery. We performed a retrospective evaluation of consecutive surgical cases using standard VSP for hard tissue resection and reconstructive surgery in addition to IAN VSP performed simultaneously during surgery. Cases were assessed regarding the planning time, additional costs involved, surgeon's subjective impression of the process, accuracy of the prediction during surgery, and operative time during surgery compared with cases performed without VSP. The study sample was composed of 5 cases of mandibular resection for benign disease, with bony, soft tissue, and neural reconstruction with the use of VSP. The addition of IAN reconstruction to the VSP session added no additional expense to the planning session but resulted in an additional 22.5 minutes (±7.5 minutes) for the webinar session. From a subjective standpoint, IAN VSP provided the surgeon with a discrete plan for surgery. From an objective standpoint, IAN VSP provided the exact length and diameter of nerve graft required for surgery, facilitated the surgeon's ability to visualize the actual nerve graft procedure, and limited the additional time required for simultaneous nerve reconstruction. Despite perceived prejudice against simultaneous IAN reconstruction with complex mandibular resection and reconstruction, the use of IAN VSP may facilitate the actual surgical procedure and result in considerably improved patient outcomes without considerable additional time or cost associated with this protocol. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    PubMed

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several giga voxels), this computational burden prevents their actual breakthrough. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.
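
    The memory argument in this abstract is easy to make concrete. The sketch below compares the footprint of a full-resolution volume with a hypothetical multiresolution layout (a full-resolution region of interest plus a coarsely downsampled full volume); the specific shapes and the two-level scheme are assumptions for illustration, not the paper's actual decomposition.

```python
import numpy as np

def volume_memory_gb(shape, dtype=np.float32):
    """Memory needed to hold a reconstruction volume of the given shape."""
    return float(np.prod(shape)) * np.dtype(dtype).itemsize / 1e9

# Full-resolution volume: 2048^3 voxels (~8.6 giga-voxels) in float32.
full_gb = volume_memory_gb((2048, 2048, 2048))

# Hypothetical two-level multiresolution scheme: a 512^3 region of interest
# kept at full resolution plus the whole volume downsampled 4x per dimension
# (also 512^3 voxels).
multires_gb = volume_memory_gb((512, 512, 512)) + volume_memory_gb((512, 512, 512))
```

    Under these assumptions the multiresolution layout needs roughly 1 GB instead of ~34 GB, which is the kind of reduction that makes iterative algorithms feasible on a single GPU.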

  17. Bayesian Inference for Source Reconstruction: A Real-World Application

    PubMed Central

    Yee, Eugene; Hoffman, Ian; Ungar, Kurt

    2014-01-01

    This paper applies a Bayesian probabilistic inferential methodology for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope production facility) using a small number of activity concentration measurements of a noble gas (Xenon-133) obtained from three stations that form part of the International Monitoring System radionuclide network. The sampling of the resulting posterior distribution of the source parameters is undertaken using a very efficient Markov chain Monte Carlo technique that utilizes a multiple-try differential evolution adaptive Metropolis algorithm with an archive of past states. It is shown that the principal difficulty in the reconstruction lay in the correct specification of the model errors (both scale and structure) for use in the Bayesian inferential methodology. In this context, two different measurement models for incorporation of the model error of the predicted concentrations are considered. The performance of both of these measurement models with respect to their accuracy and precision in the recovery of the source parameters is compared and contrasted. PMID:27379292
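
    The inferential setup (a forward dispersion model, a likelihood over a few concentration measurements, and MCMC sampling of source location and emission rate) can be sketched in miniature. Everything below is invented for illustration: the 1-D stand-in dispersion model, the flat priors, and the plain random-walk Metropolis sampler, where the paper uses a real atmospheric transport model and a far more efficient multiple-try differential evolution adaptive Metropolis algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x_src, q, x_det):
    """Stand-in dispersion model: concentration ~ q / (1 + distance^2).
    The real study uses an atmospheric transport model; this is invented."""
    return q / (1.0 + (x_det - x_src) ** 2)

# Synthetic measurements from a "true" source at x = 3.0 with rate q = 5.0.
x_det = np.array([0.0, 2.0, 6.0])
obs = forward(3.0, 5.0, x_det) + rng.normal(0.0, 0.05, size=3)

def log_post(theta):
    x_src, q = theta
    if not (0.0 < x_src < 10.0 and 0.0 < q < 50.0):  # flat priors
        return -np.inf
    resid = obs - forward(x_src, q, x_det)
    return -0.5 * np.sum(resid ** 2) / 0.05 ** 2     # Gaussian likelihood

# Plain random-walk Metropolis sampler over (x_src, q).
theta, chain = np.array([5.0, 1.0]), []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.1, size=2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
post = np.array(chain)[5000:]                        # discard burn-in
```

    The posterior samples concentrate near the true source parameters; the paper's central point, that the model-error specification in the likelihood is the hard part, corresponds to the choice of the residual scale hard-coded here.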

  18. Bioprosthetic Mesh in Abdominal Wall Reconstruction

    PubMed Central

    Baumann, Donald P.; Butler, Charles E.

    2012-01-01

    Mesh materials have undergone a considerable evolution over the last several decades. There has been enhancement of biomechanical properties, improvement in manufacturing processes, and development of antiadhesive laminate synthetic meshes. The evolution of bioprosthetic mesh materials has markedly changed our indications and methods for complex abdominal wall reconstruction. The authors review the optimal properties of bioprosthetic mesh materials, their evolution over time, and their indications for use. The techniques to optimize outcomes are described using bioprosthetic mesh for complex abdominal wall reconstruction. Bioprosthetic mesh materials clearly have certain advantages over other implantable mesh materials in select indications. Appropriate patient selection and surgical technique are critical to the successful use of bioprosthetic materials for abdominal wall repair. PMID:23372454

  19. Parallel hyperspectral image reconstruction using random projections

    NASA Astrophysics Data System (ADS)

    Sevilla, Jorge; Martín, Gabriel; Nascimento, José M. P.

    2016-10-01

    Spaceborne sensor systems are characterized by scarce onboard computing and storage resources and by communication links with reduced bandwidth. Random projection techniques have been demonstrated to be an effective and very lightweight way to reduce the number of measurements in hyperspectral data, thus reducing the data to be transmitted to the Earth station. However, the reconstruction of the original data from the random projections may be computationally expensive. SpeCA is a blind hyperspectral reconstruction technique that exploits the fact that hyperspectral vectors often belong to a low-dimensional subspace, and it has shown promising results in recovering hyperspectral data from a reduced number of random measurements. In this manuscript we focus on the implementation of the SpeCA algorithm for graphics processing units (GPUs) using the compute unified device architecture (CUDA). Experimental results obtained with synthetic and real hyperspectral datasets on an NVIDIA GeForce GTX 980 GPU reveal that the use of GPUs can provide real-time reconstruction. The achieved speedup is up to 22 times compared with the processing time of SpeCA running on one core of an Intel i7-4790K CPU (3.4 GHz, 32 GByte memory).
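
    The core idea, that spectra living in a low-dimensional subspace can be recovered from a handful of random projections, reduces to a small least-squares problem once the subspace is known. The sketch below assumes the subspace basis is given (SpeCA itself estimates it blindly); all dimensions are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1000 hyperspectral pixels with 200 bands lying in a 5-dim subspace.
bands, k, n_pixels = 200, 5, 1000
E = np.linalg.qr(rng.normal(size=(bands, k)))[0]   # orthonormal subspace basis
X = E @ rng.normal(size=(k, n_pixels))             # true spectra

m = 20                                             # random measurements/pixel
A = rng.normal(size=(m, bands)) / np.sqrt(m)       # onboard projection matrix
Y = A @ X                                          # data sent to the station

# Ground-side reconstruction: recover subspace coefficients, then spectra.
coeff, *_ = np.linalg.lstsq(A @ E, Y, rcond=None)
X_rec = E @ coeff
rel_err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
```

    With m = 20 measurements against k = 5 subspace dimensions the small system is overdetermined and the recovery is essentially exact, even though only a tenth of the 200 bands was transmitted.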

  20. Reconstruction of Canine Mandibular Bone Defects Using a Bone Transport Reconstruction Plate

    PubMed Central

    Elsalanty, Mohammed E.; Zakhary, Ibrahim; Akeel, Sara; Benson, Byron; Mulone, Timothy; Triplett, Gilbert R.; Opperman, Lynne A.

    2010-01-01

    Objectives Reconstruction of mandibular segmental bone defects is a challenging task. This study tests a new device used for reconstructing mandibular defects based on the principle of bone transport distraction osteogenesis. Methods Thirteen beagle dogs were divided into control and experimental groups. In all animals, a 3 cm defect was created on one side of the mandible. In eight control animals, the defect was stabilized with a reconstruction plate without further reconstruction and the animals were sacrificed two to three months after surgery. The remaining five animals were reconstructed with a bone transport reconstruction plate (BTRP), comprising a reconstruction plate with attached intraoral transport unit, and were sacrificed after one month of consolidation. Results Clinical evaluation, cone-beam CT densitometry, three-dimensional histomorphometry, and docking site histology revealed significant new bone formation within the defect in the distracted group. Conclusion The physical dimensions and architectural parameters of the new bone were comparable to the contralateral normal bone. Bone union at the docking site remains a problem. PMID:19770704

  1. 77 FR 6681 - Approval and Promulgation of State Plans for Designated Facilities and Pollutants; State of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... tons per day of municipal solid waste (MSW). This action corrects an error in the regulatory language... per day of municipal solid waste (MSW), and for which construction, reconstruction, or modification... Municipal Waste Combustor (LMWC) Emissions From Existing Facilities; Correction AGENCY: Environmental...

  2. IMPACTS OF ANTIFOAM ADDITIONS AND ARGON BUBBLING ON DEFENSE WASTE PROCESSING FACILITY REDUCTION/OXIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C.; Johnson, F.

    2012-06-05

    During melting of HLW glass, the REDOX of the melt pool cannot be measured. Therefore, the Fe2+/ΣFe ratio in the glass poured from the melter must be related to melter feed organic and oxidant concentrations to ensure production of a high quality glass without impacting production rate (e.g., foaming) or melter life (e.g., metal formation and accumulation). A production facility such as the Defense Waste Processing Facility (DWPF) cannot wait until the melt or waste glass has been made to assess its acceptability, since by then no further changes to the glass composition and acceptability are possible. Therefore, the acceptability decision is made on the upstream process, rather than on the downstream melt or glass product. That is, it is based on 'feed forward' statistical process control (SPC) rather than statistical quality control (SQC). In SPC, the feed composition to the melter is controlled prior to vitrification. Use of the DWPF REDOX model has controlled the balance of feed reductants and oxidants in the Sludge Receipt and Adjustment Tank (SRAT). Once the alkali/alkaline earth salts (both reduced and oxidized) are formed during reflux in the SRAT, the REDOX can only change (1) if additional reductants or oxidants are added to the SRAT, the Slurry Mix Evaporator (SME), or the Melter Feed Tank (MFT) or (2) if the melt pool is bubbled with an oxidizing gas or sparging gas that imposes a different REDOX target than the chemical balance set during reflux in the SRAT.

  3. Photogrammetry for rapid prototyping: development of noncontact 3D reconstruction technologies

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.

    2002-04-01

    An important stage of rapid prototyping technology is generating computer 3D model of an object to be reproduced. Wide variety of techniques for 3D model generation exists beginning with manual 3D models generation and finishing with full-automated reverse engineering system. The progress in CCD sensors and computers provides the background for integration of photogrammetry as an accurate 3D data source with CAD/CAM. The paper presents the results of developing photogrammetric methods for non-contact spatial coordinates measurements and generation of computer 3D model of real objects. The technology is based on object convergent images processing for calculating its 3D coordinates and surface reconstruction. The hardware used for spatial coordinates measurements is based on PC as central processing unit and video camera as image acquisition device. The original software for Windows 9X realizes the complete technology of 3D reconstruction for rapid input of geometry data in CAD/CAM systems. Technical characteristics of developed systems are given along with the results of applying for various tasks of 3D reconstruction. The paper describes the techniques used for non-contact measurements and the methods providing metric characteristics of reconstructed 3D model. Also the results of system application for 3D reconstruction of complex industrial objects are presented.
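
    The core photogrammetric operation described here, computing 3D coordinates from convergent images, can be illustrated with linear (DLT) triangulation from two calibrated views. The camera intrinsics, poses, and point below are hypothetical; this is the textbook two-view case, not the system's actual multi-image pipeline.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two convergent images;
    P1 and P2 are 3x4 projection matrices, uv1/uv2 pixel coordinates."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    X = np.linalg.svd(A)[2][-1]        # null vector of A (homogeneous point)
    return X[:3] / X[3]

def project(P, X):
    h = P @ np.append(X, 1.0)
    return h[:2] / h[2]

# Hypothetical calibrated pair: identity pose plus a 1 m baseline along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

    Repeating this per matched feature point yields the spatial coordinates from which the surface model is reconstructed.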

  4. PRISM3 DOT1 Atlantic Basin Reconstruction

    USGS Publications Warehouse

    Dowsett, Harry; Robinson, Marci; Dwyer, Gary S.; Chandler, Mark; Cronin, Thomas

    2006-01-01

    PRISM3 DOT1 (Pliocene Research, Interpretation and Synoptic Mapping 3, Deep Ocean Temperature 1) provides a three-dimensional temperature reconstruction for the mid-Pliocene Atlantic basin, the first of several regional data sets that will comprise a global mid-Pliocene reconstruction. DOT1 is an alteration of modern temperature values for the Atlantic Ocean in 4 degree x 5 degree cells in 13 depth layers for December 1 based on Mg/Ca-derived BWT estimates from seventeen DSDP and ODP Sites and SST estimates from the PRISM2 reconstruction (Dowsett et al., 1999). DOT1 reflects a vaguely modern circulation system, assuming similar processes of deep-water formation; however, North Atlantic Deep Water (NADW) production is increased, and Antarctic Bottom Water (AABW) production is decreased. Pliocene NADW was approximately 2 degreesC warmer than modern temperatures, and Pliocene AABW was approximately 0.3 degreesC warmer than modern temperatures.

  5. Methods of reconstruction of multi-particle events in the new coordinate-tracking setup

    NASA Astrophysics Data System (ADS)

    Vorobyev, V. S.; Shutenko, V. V.; Zadeba, E. A.

    2018-01-01

    At the Unique Scientific Facility NEVOD (MEPhI), a large coordinate-tracking detector based on drift chambers is being developed for investigations of muon bundles generated by ultrahigh-energy primary cosmic rays. One of the main characteristics of a bundle is muon multiplicity. Three methods of reconstruction of multiple events were investigated: the sequential search method, the straight-line finding method, and the histogram method. The last method determines the number of tracks with the same zenith angle in the event. It is the most suitable for determining muon multiplicity because, owing to the large distance to the point where the muons are generated, their trajectories are quasi-parallel. The paper presents results of applying the three reconstruction methods to experimental data, as well as first results of the detector operation.
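
    The histogram method reads the muon multiplicity off the most populated zenith-angle bin, exploiting the quasi-parallelism of the bundle. The sketch below uses synthetic angles and an invented bin width; it illustrates the idea only, not the detector's actual reconstruction code.

```python
import numpy as np

rng = np.random.default_rng(2)

# Reconstructed zenith angles (degrees): a bundle of 12 quasi-parallel muons
# near 31 degrees plus 5 unrelated background tracks (synthetic data).
bundle = rng.normal(31.0, 0.25, size=12)
background = rng.uniform(0.0, 60.0, size=5)
angles = np.concatenate([bundle, background])

# Histogram method: the multiplicity is taken as the population of the most
# occupied zenith-angle bin (2-degree bins, an assumed bin width).
counts, edges = np.histogram(angles, bins=np.arange(0.0, 61.0, 2.0))
multiplicity = int(counts.max())
```

    The isolated background tracks spread over many bins, while the bundle piles into one, so the peak bin count is a robust multiplicity estimate.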

  6. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  7. Expansion method in secondary total ear reconstruction for undesirable reconstructed ear.

    PubMed

    Liu, Tun; Hu, Jintian; Zhou, Xu; Zhang, Qingguo

    2014-09-01

    Ear reconstruction by autologous costal cartilage grafting is the most widely applied technique and has few complications. However, an undesirable reconstructed ear brings additional problems to plastic surgeons. Some authors resort to free flaps or to an osseointegration technique with a prosthetic ear. In this article, we introduce a secondary total ear reconstruction method using an expanded skin flap. From July 2010 to April 2012, 7 cases of undesirable ear reconstruction were repaired by the tissue expansion method. Procedures including removal of the previous cartilage framework, soft tissue expander insertion, and a second stage of cartilage framework insertion were performed for each case according to its local conditions. The follow-up time ranged from 6 months to 2.5 years. All of the cases recovered well with good 3-dimensional form, a symmetrical auriculocephalic angle, and stable fixation. This evidence shows that the expansion method is safe, stable, and less traumatic for secondary total ear reconstruction. With a sufficient expanded skin flap and a refabricated cartilage framework, a lifelike reconstructed ear can be achieved without causing additional injury.

  8. Image Reconstruction is a New Frontier of Machine Learning.

    PubMed

    Wang, Ge; Ye, Jong Chul; Mueller, Klaus; Fessler, Jeffrey A

    2018-06-01

    Over the past several years, machine learning, or more generally artificial intelligence, has generated overwhelming research interest and attracted unprecedented public attention. As tomographic imaging researchers, we share the excitement from our imaging perspective [item 1) in the Appendix], and organized this special issue dedicated to the theme of "Machine learning for image reconstruction." This special issue is a sister issue of the special issue published in May 2016 of this journal with the theme "Deep learning in medical imaging" [item 2) in the Appendix]. While the previous special issue targeted medical image processing/analysis, this special issue focuses on data-driven tomographic reconstruction. These two special issues are highly complementary, since image reconstruction and image analysis are two of the main pillars for medical imaging. Together we cover the whole workflow of medical imaging: from tomographic raw data/features to reconstructed images and then extracted diagnostic features/readings.

  9. Image processing and reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chartrand, Rick

    2012-06-15

    This talk will examine some mathematical methods for image processing and the solution of underdetermined, linear inverse problems. The talk will have a tutorial flavor, mostly accessible to undergraduates, while still presenting research results. The primary approach is the use of optimization problems. We will find that relaxing the usual assumption of convexity will give us much better results.
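
    The claim that relaxing convexity improves sparse recovery can be illustrated with thresholding operators. Soft thresholding is the proximal operator of the convex l1 penalty; the "p-shrinkage" rule below is a simple nonconvex alternative in the spirit of this line of work, though its exact form here should be treated as an assumption for illustration.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the convex l1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def p_shrink(x, lam, p=0.5):
    """Nonconvex 'p-shrinkage': sign(x) * max(|x| - lam^(2-p) |x|^(p-1), 0).
    Illustrative operator form, assumed here rather than taken from the talk."""
    return np.sign(x) * np.maximum(
        np.abs(x) - lam ** (2 - p) * np.abs(x) ** (p - 1), 0.0)

# Both rules zero out the small (noise) entries, but soft thresholding biases
# every surviving entry toward zero by lam, while the nonconvex rule leaves
# large entries nearly untouched.
noisy = np.array([5.0, -4.0, 0.05, -0.08, 0.02])
l1 = soft_threshold(noisy, 0.3)     # 5.0 -> 4.7 (biased)
lp = p_shrink(noisy, 0.3)           # 5.0 -> ~4.93 (nearly unbiased)
```

    This reduced bias on large coefficients is one intuition for why nonconvex penalties can outperform l1 in underdetermined inverse problems.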

  10. Simple model for the reconstruction of radionuclide concentrations and radiation exposures along the Techa River

    NASA Technical Reports Server (NTRS)

    Vorobiova, M. I.; Degteva, M. O.; Neta, M. O. (Principal Investigator)

    1999-01-01

    The Techa River (Southern Urals, Russia) was contaminated in 1949-1956 by liquid radioactive wastes from the Mayak complex, the first Russian facility for the production of plutonium. The measurements of environmental contamination were started in 1951. A simple model describing radionuclide transport along the free-flowing river and the accumulation of radionuclides by bottom sediments is presented. This model successfully correlates the rates of radionuclide releases as reconstructed by the Mayak experts, hydrological data, and available environmental monitoring data for the early period of contamination (1949-1951). The model was developed to reconstruct doses for people who lived in the riverside communities during the period of the releases and who were chronically exposed to external and internal irradiation. The model fills the data gaps and permits reconstruction of external gamma-exposure rates in air on the river bank and radionuclide concentrations in river water used for drinking and other household needs in 1949-1951.
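
    A minimal stand-in for this kind of river-transport model is first-order removal of activity from the water column to bottom sediments, giving an exponential decline of concentration with distance downstream. The functional form, source term, removal length, and distances below are all illustrative assumptions, not the authors' actual model or data.

```python
import numpy as np

def river_concentration(c0, distance_km, removal_length_km):
    """Concentration in river water versus distance downstream, assuming
    first-order transfer of activity to bottom sediments (illustrative)."""
    return c0 * np.exp(-np.asarray(distance_km, dtype=float) / removal_length_km)

# Hypothetical source term of 1000 (arbitrary activity units) and an
# effective removal length of 40 km; the distances are invented and do not
# correspond to the actual Techa riverside communities.
distances_km = np.array([7.0, 33.0, 78.0, 148.0, 237.0])
conc = river_concentration(1000.0, distances_km, 40.0)
```

    Calibrated against monitoring data, a model of this general shape can interpolate concentrations at unmonitored locations, which is how such reconstructions fill data gaps for dose assessment.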

  11. Optimization of Stereo Matching in 3D Reconstruction Based on Binocular Vision

    NASA Astrophysics Data System (ADS)

    Gai, Qiyang

    2018-01-01

    Stereo matching is one of the key steps of 3D reconstruction based on binocular vision. In order to improve the convergence speed and accuracy of 3D reconstruction based on binocular vision, this paper combines the epipolar constraint with an ant colony algorithm. The epipolar line constraint is used to reduce the search range, and an ant colony algorithm is then used to optimize the stereo matching feature search function within that range. Through the establishment of a stereo matching optimization process analysis model for the ant colony algorithm, a globally optimized solution of stereo matching in 3D reconstruction based on binocular vision is realized. The simulation results show that, by combining the advantages of the epipolar constraint and the ant colony algorithm, the stereo matching search range is simplified, and the convergence speed and accuracy of the stereo matching process are improved.
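
    The benefit of the epipolar constraint is easiest to see on a rectified pair, where it collapses the 2-D correspondence search to a 1-D search along one image row. The sketch below uses a brute-force SSD search over candidate disparities on synthetic data; the paper replaces this exhaustive search with an ant colony optimizer, and all data and window sizes here are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic rectified pair: the right image is the left image shifted by a
# constant disparity of 7 pixels, so true matches lie on the same row.
true_disp = 7
left = rng.uniform(size=(40, 120))
right = np.roll(left, -true_disp, axis=1)

def match_ssd(left, right, row, col, win=4, max_disp=15):
    """Exhaustive SSD search along one epipolar line (same row)."""
    patch = left[row - win:row + win + 1, col - win:col + win + 1]
    costs = [np.sum((patch - right[row - win:row + win + 1,
                                   col - d - win:col - d + win + 1]) ** 2)
             for d in range(max_disp + 1)]
    return int(np.argmin(costs))

disparity = match_ssd(left, right, row=20, col=60)
```

    Even this naive matcher only evaluates max_disp + 1 candidates per pixel instead of a full 2-D window; the ant colony algorithm's role in the paper is to search this (much larger, in practice) space more efficiently than brute force.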

  12. L'Aquila's reconstruction challenges: has Italy learned from its previous earthquake disasters?

    PubMed

    Ozerdem, Alpaslan; Rufini, Gianni

    2013-01-01

    Italy is an earthquake-prone country and its disaster emergency response experiences over the past few decades have varied greatly, with some being much more successful than others. Overall, however, its reconstruction efforts have been criticised for being ad hoc, delayed, ineffective, and untargeted. In addition, while the emergency relief response to the L'Aquila earthquake of 6 April 2009-the primary case study in this evaluation-seems to have been successful, the reconstruction initiative got off to a very problematic start. To explore the root causes of this phenomenon, the paper argues that, owing to the way in which Italian Prime Minister Silvio Berlusconi has politicised the process, the L'Aquila reconstruction endeavour is likely to suffer problems with local ownership, national/regional/municipal coordination, and corruption. It concludes with a set of recommendations aimed at addressing the pitfalls that may confront the L'Aquila reconstruction process over the next few years. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  13. Ohio Senator John Glenn tours the Space Station Processing Facility at KSC

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Ohio Senator John Glenn, at right, enjoys a tour of the Space Station Processing Facility at Kennedy Space Center. With Senator Glenn is Stephen Francois, director, Space Station and Shuttle Payloads, NASA. Senator Glenn arrived at KSC on Jan. 20 to tour KSC operational areas and to view the launch of STS-89. Glenn, who made history in 1962 as the first American to orbit the Earth, completing three orbits in a five-hour flight aboard Friendship 7, will fly his second space mission aboard Space Shuttle Discovery this October. Glenn is retiring from the Senate at the end of this year and will be a payload specialist aboard STS-95.

  14. Image reconstruction for PET/CT scanners: past achievements and future challenges

    PubMed Central

    Tong, Shan; Alessio, Adam M; Kinahan, Paul E

    2011-01-01

    PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, TOF imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions. PMID:21339831
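
    Among the iterative methods this overview covers, the classic example is maximum-likelihood expectation maximization (MLEM) for Poisson-distributed projection data. The sketch below runs the standard MLEM multiplicative update on a tiny made-up system matrix; a real PET system matrix encodes scanner geometry and physics, so everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Tiny toy system: 30 detector bins viewing 10 image pixels, with a random
# nonnegative "system matrix" standing in for the real geometric model.
A = rng.uniform(0.0, 1.0, size=(30, 10))
x_true = rng.uniform(0.5, 2.0, size=10)
y = rng.poisson(A @ x_true * 500) / 500.0     # Poisson-noisy projections

# MLEM iteration: x <- x * [A^T (y / A x)] / [A^T 1].
x = np.ones(10)
sensitivity = A.sum(axis=0)
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / sensitivity

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

    The multiplicative form keeps the estimate nonnegative at every iteration, one reason MLEM and its accelerated variant OSEM became the clinical workhorses over analytic filtered backprojection.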

  15. Microfocal X-ray computed tomography post-processing operations for optimizing reconstruction volumes of stented arteries during 3D computational fluid dynamics modeling.

    PubMed

    Ladisa, John F; Olson, Lars E; Ropella, Kristina M; Molthen, Robert C; Haworth, Steven T; Kersten, Judy R; Warltier, David C; Pagel, Paul S

    2005-08-01

    Restenosis caused by neointimal hyperplasia (NH) remains an important clinical problem after stent implantation. Restenosis varies with stent geometry, and idealized computational fluid dynamics (CFD) models have indicated that geometric properties of the implanted stent may differentially influence NH. However, 3D studies capturing the in vivo flow domain within stented vessels have not been conducted at a resolution sufficient to detect subtle alterations in vascular geometry caused by the stent and the subsequent temporal development of NH. We present the details and limitations of a series of post-processing operations used in conjunction with microfocal X-ray CT imaging and reconstruction to generate geometrically accurate flow domains within the localized region of a stent several weeks after implantation. Microfocal X-ray CT reconstruction volumes were subjected to an automated program to perform arterial thresholding, spatial orientation, and surface smoothing of stented and unstented rabbit iliac arteries several weeks after antegrade implantation. A transfer function was obtained for the current post-processing methodology containing reconstructed 16 mm stents implanted into rabbit iliac arteries for up to 21 days after implantation and resolved at circumferential and axial resolutions of 32 and 50 μm, respectively. The results indicate that the techniques presented are sufficient to resolve distributions of wall shear stress (WSS) with 80% accuracy in segments containing 16 surface perturbations over a 16 mm stented region. These methods will be used to test the hypothesis that reductions in normalized WSS and increases in the spatial disparity of WSS immediately after stent implantation may spatially correlate with the temporal development of NH within the stented region.

  16. Socket welds in nuclear facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, P.A.; Torres, L.L.

    1995-12-31

    Socket welds are easier and faster to make than are butt welds. However, they are often not used in nuclear facilities because the crevices between the pipes and the socket sleeves may be subject to crevice corrosion. If socket welds can be qualified for wider use in facilities that process nuclear materials, the radiation exposures to welders can be significantly reduced. The current tests at the Idaho Chemical Processing Plant (ICPP) are designed to determine if socket welds can be qualified for use in the waste processing system at a nuclear fuel processing plant.

  17. An analysis of workplace exposures to benzene over four decades at a petrochemical processing and manufacturing facility (1962-1999).

    PubMed

    Sahmel, J; Devlin, K; Burns, A; Ferracini, T; Ground, M; Paustenbach, D

    2013-01-01

    Benzene, a known carcinogen, can be generated as a by-product during the use of petroleum-based raw materials in chemical manufacturing. The aim of this study was to analyze a large data set of benzene air concentration measurements collected over nearly 40 years during routine employee exposure monitoring at a petrochemical manufacturing facility. The facility used ethane, propane, and natural gas as raw materials in the production of common commercial materials such as polyethylene, polypropylene, waxes, adhesives, alcohols, and aldehydes. In total, 3607 benzene air samples were collected at the facility from 1962 to 1999. Of these, 2359 long-term (>1 h) personal exposure samples for benzene were collected during routine operations at the facility between 1974 and 1999. These samples were analyzed by division, department, and job title to establish employee benzene exposures in different areas of the facility over time. Sampling data were also analyzed by key events over time, including changes in the occupational exposure limits (OELs) for benzene and key equipment process changes at the facility. Although mean benzene concentrations varied according to operation, in nearly all cases measured benzene concentrations were below the OEL in place at the time for benzene (10 ppm for 1974-1986 and 1 ppm for 1987-1999). Decreases in mean benzene air concentrations were also found when data were evaluated according to 7- to 10-yr periods following key equipment process changes. Further, an evaluation of mortality rates for a retrospective employee cohort (n = 3938) demonstrated that the average personal benzene exposures at this facility (0.89 ppm for the period 1974-1986 and 0.125 ppm for the period 1987-1999) did not result in increased standardized mortality ratios (SMRs) for diseases or malignancies of the lymphatic system.
The robust nature of this data set provides comprehensive exposure information that may be useful for assessing human benzene exposures at

  18. Autologous microtia reconstruction combined with ancillary procedures: a comprehensive reconstructive approach.

    PubMed

    Cugno, S; Farhadieh, R D; Bulstrode, N W

    2013-11-01

    Autologous microtia reconstruction is generally performed in two stages. The second stage presents a unique opportunity to carry out other complementary procedures. The present study describes our approach to microtia reconstruction, wherein the second stage of reconstruction is combined with final refinements to the ear construct and/or additional procedures to enhance facial contour and symmetry. Retrospective analysis of patients who underwent two-stage microtia reconstruction by a single surgeon (NWB) was conducted in order to ascertain those that had ancillary procedures at the time of the second stage. Patient and operative details were collected. Thirty-four patients (15 male; median age at second stage, 11 years; range, 10-18 years) who had complementary procedures executed during the second stage of auricular reconstruction were identified. Collectively, these included centralizing genioplasty (n = 1), fat transfer (n = 22), ear piercing (n = 7), and contralateral prominauris correction (n = 7). Six patients had correction for unilateral isolated microtia; the remaining 28 patients underwent auricular reconstruction for microtia associated with a named syndrome. All patients reported a high rate of satisfaction with the result achieved, and the majority (85%) reported no perceived need for additional surgical refinements to the ear or procedure(s) to achieve further facial symmetry. No peri- or post-operative complications were noted. Combining the final stage of autologous microtia reconstruction with other ancillary procedures affords a superior aesthetic outcome and decreased patient morbidity. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons. All rights reserved.

  19. Joint reconstruction of PET-MRI by exploiting structural similarity

    NASA Astrophysics Data System (ADS)

    Ehrhardt, Matthias J.; Thielemans, Kris; Pizarro, Luis; Atkinson, David; Ourselin, Sébastien; Hutton, Brian F.; Arridge, Simon R.

    2015-01-01

    Recent advances in technology have enabled the combination of positron emission tomography (PET) with magnetic resonance imaging (MRI). These PET-MRI scanners simultaneously acquire functional PET and anatomical or functional MRI data. As function and anatomy are not independent of one another, the images to be reconstructed are likely to have shared structures. We aim to exploit this inherent structural similarity by reconstructing from both modalities in a joint reconstruction framework. The structural similarity between two modalities can be modelled in two different ways: edges are more likely to be at similar positions and/or to have similar orientations. We analyse the diffusion process generated by minimizing priors that encapsulate these different models. It turns out that the class of parallel level set priors always corresponds to anisotropic diffusion, which is sometimes forward and sometimes backward diffusion. We perform numerical experiments where we jointly reconstruct from blurred Radon data with Poisson noise (PET) and under-sampled Fourier data with Gaussian noise (MRI). Our results show that both modalities benefit from each other in areas of shared edge information. The joint reconstructions have fewer artefacts and sharper edges compared to separate reconstructions, and the ℓ2-error can be reduced in all of the considered cases of under-sampling.
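    The parallel level set idea can be illustrated with a minimal numpy sketch (an illustration under stated assumptions, not the authors' actual prior): a coupling term that vanishes where the gradients of the two images are parallel and grows where their edge orientations disagree. The function name and the toy images are hypothetical.

```python
import numpy as np

def parallel_level_set_prior(u, v, eps=1e-6):
    """Toy coupling term penalizing misaligned gradients between two images.

    Zero where the gradients of u and v are parallel (shared edges),
    positive where their orientations diverge."""
    gux, guy = np.gradient(u)
    gvx, gvy = np.gradient(v)
    # 2D cross product of the gradient fields: zero when edges align.
    cross = gux * gvy - guy * gvx
    norm = np.sqrt((gux**2 + guy**2) * (gvx**2 + gvy**2)) + eps
    return np.sum(np.abs(cross) / norm)

x = np.linspace(0, 1, 32)
u = np.outer(x, x)
v = 2.0 * u + 1.0          # same edge geometry, different contrast
w = np.outer(x, x[::-1])   # different edge geometry

assert parallel_level_set_prior(u, v) < parallel_level_set_prior(u, w)
```

In 2D the cross product of the two gradient fields is zero exactly when the edges align, which is what this family of priors rewards.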

  20. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    PubMed

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P < 0.05). Adaptive statistical iterative reconstruction-V 90% showed superior LCD and had the highest CNR in the liver, aorta, and pancreas, measuring 7.32 ± 3.22, 11.60 ± 4.25, and 4.60 ± 2.31, respectively, compared with the next best series of ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P < 0.0001). Veo 3.0 and ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.
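    CNR in comparisons of this kind is commonly computed as the absolute mean difference between a region of interest and adjacent background, divided by the background noise (standard deviation). The exact definition varies by study, so the sketch below, with hypothetical Hounsfield-unit samples, is an assumption rather than this paper's formula.

```python
import numpy as np

def cnr(roi, background):
    """Contrast-to-noise ratio: absolute mean difference between a region
    of interest and background, divided by background noise (sample SD)."""
    roi = np.asarray(roi, dtype=float)
    background = np.asarray(background, dtype=float)
    return abs(roi.mean() - background.mean()) / background.std(ddof=1)

# Hypothetical Hounsfield-unit samples for a liver ROI and adjacent fat.
liver = [55, 60, 58, 57, 61]
fat = [-100, -95, -105, -98, -102]
print(f"CNR = {cnr(liver, fat):.2f}")
```

Lower-noise reconstructions raise this ratio even at identical contrast, which is why iterative algorithms score higher CNR than FBP.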

  1. Migration of Beryllium via Multiple Exposure Pathways among Work Processes in Four Different Facilities

    PubMed Central

    Armstrong, Jenna L.; Day, Gregory A.; Park, Ji Young; Stefaniak, Aleksandr B.; Stanton, Marcia L.; Deubner, David C.; Kent, Michael S.; Schuler, Christine R.; Virji, M. Abbas

    2016-01-01

    Inhalation of beryllium is associated with the development of sensitization; however, dermal exposure may also be important. The primary aim of this study was to elucidate relationships among exposure pathways in four different manufacturing and finishing facilities. Secondary aims were to identify jobs with increased levels of beryllium in air, on skin, and on surfaces; to identify potential discrepancies in exposure pathways; and to determine if these are related to jobs with previously identified risk. Beryllium was measured in air, on cotton gloves, and on work surfaces. Summary statistics were calculated, and correlations among all three measurement types were examined at the facility and job level. Exposure ranking strategies were used to identify jobs with higher exposures. The highest air, glove, and surface measurements were observed in beryllium metal production and beryllium oxide ceramics manufacturing jobs that involved hot processes and handling powders. Two finishing and distribution facilities that handle solid alloy products had lower exposures than the primary production facilities, and there were differences observed among jobs. For all facilities combined, strong correlations were found between air-surface (rp ≥ 0.77), glove-surface (rp ≥ 0.76), and air-glove measurements (rp ≥ 0.69). In jobs where higher risk of beryllium sensitization or disease has been reported, exposure levels for all three measurement types were higher than in jobs with lower risk, though they were not the highest. Some jobs with low air concentrations had higher levels of beryllium on glove and surface wipe samples, suggesting a need to further evaluate the causes of the discrepant levels. Although such correlations provide insight on where beryllium is located throughout the workplace, they cannot identify the direction of the pathways between air, surface, or skin. Ranking strategies helped to identify jobs with the highest combined air, glove, and/or surface exposures.
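    The pairwise correlations reported above (Pearson rp among air, glove, and surface levels) can be sketched with synthetic data; the shared `source` term below is a hypothetical stand-in for a common exposure source, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical job-level exposures (log scale) driven by a shared
# underlying source term, so air, glove, and surface levels co-vary.
source = rng.normal(size=40)
air = source + 0.3 * rng.normal(size=40)
glove = source + 0.4 * rng.normal(size=40)
surface = source + 0.3 * rng.normal(size=40)

def pearson(x, y):
    """Pearson correlation coefficient r_p between two exposure series."""
    return np.corrcoef(x, y)[0, 1]

for name, (x, y) in {"air-surface": (air, surface),
                     "glove-surface": (glove, surface),
                     "air-glove": (air, glove)}.items():
    print(f"{name}: r_p = {pearson(x, y):.2f}")
```

As the abstract notes, a strong rp says the pathways are linked but not which direction the transfer runs.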

  2. Aeropropulsion facilities configuration control: Procedures manual

    NASA Technical Reports Server (NTRS)

    Lavelle, James J.

    1990-01-01

    Lewis Research Center senior management directed that the aeropropulsion facilities be put under configuration control. A Configuration Management (CM) program was established by the Facilities Management Branch of the Aeropropulsion Facilities and Experiments Division. Under the CM program, a support service contractor was engaged to staff and implement the program. The Aeronautics Directorate has over 30 facilities at Lewis of various sizes and complexities. Under the program, a Facility Baseline List (FBL) was established for each facility, listing which systems and their documents were to be placed under configuration control. A Change Control System (CCS) was established, requiring that any proposed changes to FBL systems or their documents be processed per the CCS. Limited access control of the FBL master drawings was implemented, and an audit system was established to ensure all facility changes are properly processed. This procedures manual sets forth the policy and responsibilities to ensure all key documents constituting a facility's configuration are kept current, modified as needed, and verified to reflect any proposed change. This is the essence of the CM program.

  3. First principles investigation of the initial stage of H-induced missing-row reconstruction of Pd(110) surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padama, Allan Abraham B.; Kasai, Hideaki, E-mail: kasai@dyn.ap.eng.osaka-u.ac.jp; Center for Atomic and Molecular Technologies, Osaka University, Suita, Osaka 565-0871

    2014-06-28

    The pathway of H diffusion that induces the migration of a Pd atom is investigated by employing first principles calculations based on density functional theory to explain the origin of the missing-row reconstruction of Pd(110). The calculated activation barrier and the H-induced reconstruction energy reveal that the long bridge-to-tetrahedral configuration is the energetically favored process for the initial stage of the reconstruction phenomenon. While the H diffusion triggers the migration of the Pd atom, it is the latter process that significantly contributes to the activated missing-row reconstruction of Pd(110). Nonetheless, the strong interaction between the diffusing H and the Pd atoms dictates the occurrence of the reconstructed surface.

  4. Breast reconstruction - natural tissue

    MedlinePlus

    ... flap; TUG; Mastectomy - breast reconstruction with natural tissue; Breast cancer - breast reconstruction with natural tissue ... it harder to find a tumor if your breast cancer comes back. The advantage of breast reconstruction with ...

  5. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  6. Optimal contact definition for reconstruction of contact maps.

    PubMed

    Duarte, Jose M; Sathyapriya, Rajagopal; Stehr, Henning; Filippis, Ioannis; Lappe, Michael

    2010-05-27

    Contact maps have been extensively used as a simplified representation of protein structures. They capture most important features of a protein's fold, being preferred by a number of researchers for the description and study of protein structures. Inspired by the model's simplicity, many groups have dedicated a considerable amount of effort towards contact prediction as a proxy for protein structure prediction. However, a contact map's biological interest is subject to the availability of reliable methods for the 3-dimensional reconstruction of the structure. We use an implementation of the well-known distance geometry protocol to build realistic protein 3-dimensional models from contact maps, performing an extensive exploration of many of the parameters involved in the reconstruction process. We try to address the questions: a) to what accuracy does a contact map represent its corresponding 3D structure, b) what is the best contact map representation with regard to reconstructability, and c) what is the effect of partial or inaccurate contact information on the 3D structure recovery. Our results suggest that contact maps derived from the application of a distance cutoff of 9 to 11 Å around the Cβ atoms constitute the most accurate representation of the 3D structure. The reconstruction process does not provide a single solution to the problem but rather an ensemble of conformations that are within 2 Å RMSD of the crystal structure and with lower values for the pairwise average ensemble RMSD. Interestingly, it is still possible to recover a structure with partial contact information, although wrong contacts can lead to dramatic loss in reconstruction fidelity. Thus contact maps represent a valid approximation to the structures with an accuracy comparable to that of experimental methods. The optimal contact definitions constitute key guidelines for methods based on contact maps such as structure prediction through contacts and structural alignments based on maximum
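    The forward direction, deriving a contact map from coordinates, takes only a few lines; this sketch applies a plain distance cutoff (the 9-11 Å range discussed above, nominally around Cβ atoms) to a toy straight chain rather than a real structure.

```python
import numpy as np

def contact_map(coords, cutoff=9.0):
    """Binary contact map: residue pairs whose representative atoms
    (e.g. C-beta) lie within `cutoff` angstroms of each other."""
    coords = np.asarray(coords, dtype=float)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    cmap = dist <= cutoff
    np.fill_diagonal(cmap, False)  # a residue is not its own contact
    return cmap

# Toy chain of 5 "residues" spaced 3.8 A apart along a straight line.
chain = np.array([[3.8 * i, 0.0, 0.0] for i in range(5)])
cm = contact_map(chain, cutoff=9.0)
print(cm.astype(int))
```

The inverse problem, recovering 3D coordinates from such a map, is the hard step the paper addresses with distance geometry.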

  7. Parallel programming of gradient-based iterative image reconstruction schemes for optical tomography.

    PubMed

    Hielscher, Andreas H; Bartel, Sebastian

    2004-02-01

    Optical tomography (OT) is a fast-developing novel imaging modality that uses near-infrared (NIR) light to obtain cross-sectional views of optical properties inside the human body. A major challenge remains the time-consuming, computationally intensive image reconstruction problem that converts NIR transmission measurements into cross-sectional images. To increase the speed of the iterative image reconstruction schemes that are commonly applied for OT, we have developed and implemented several parallel algorithms on a cluster of workstations. Static process distribution as well as dynamic load balancing schemes suitable for heterogeneous clusters and varying machine performances are introduced and tested. The resulting algorithms are shown to accelerate the reconstruction process to various degrees, substantially reducing the computation times for clinically relevant problems.
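    Dynamic load balancing of the kind described can be sketched with a shared task queue from which idle workers pull the next pending task, so faster machines naturally take on more work. This is a generic thread-based illustration, not the authors' cluster implementation.

```python
import queue
import threading

def dynamic_balance(tasks, n_workers, work):
    """Dynamic load balancing: each idle worker pulls the next pending
    task from a shared queue until the queue is drained."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()
            except queue.Empty:
                return  # no tasks left; this worker is done
            r = work(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Toy "forward solves" of varying cost, e.g. one per source-detector pair.
out = dynamic_balance(range(8), n_workers=3, work=lambda i: i * i)
print(sorted(out))
```

Static distribution would instead assign a fixed slice of tasks to each worker up front, which wastes time on heterogeneous clusters.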

  8. CLARA: CLAS12 Reconstruction and Analysis Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA presents an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  9. Containerless Processing in Reduced Gravity Using the TEMPUS Facility during MSL-1 and MSL-1R

    NASA Technical Reports Server (NTRS)

    Rogers, Jan R.

    1998-01-01

    Containerless processing provides a high-purity environment for the study of high-temperature, very reactive materials. It is an important method that provides access to the metastable state of an undercooled melt. In the absence of container walls, the nucleation rate is greatly reduced, and undercooling up to (Tm − Tn)/Tm ≈ 0.2 can be obtained, where Tm and Tn are the melting and nucleation temperatures, respectively. Electromagnetic levitation represents a method particularly well suited for the study of metallic melts. The TEMPUS (Tiegelfreies ElektroMagnetisches Prozessieren Unter Schwerelosigkeit) facility is a research instrument designed to perform electromagnetic levitation studies in reduced gravity. TEMPUS is a joint undertaking between DARA, the German Space Agency, and the Microgravity Science and Applications Division of NASA. The George C. Marshall Space Flight Center provides the leadership for the scientific and management efforts that support the four US PI teams that performed experiments in the TEMPUS facility. The facility is sensitive to accelerations in the 1-10 Hz range, as became evident during the MSL-1 mission. Analysis of accelerometer and video data indicated that loss of sample control occurred during crew exercise periods, which created disturbances in this frequency range. Prior to the MSL-1R flight, the TEMPUS team, the accelerometer support groups, and the mission operations team developed a strategy to operate the facility without such disturbances. The successful implementation of this plan led to the highly successful operation of the facility during MSL-1R.

  10. A very efficient RCS data compression and reconstruction technique, volume 4

    NASA Technical Reports Server (NTRS)

    Tseng, N. Y.; Burnside, W. D.

    1992-01-01

    A very efficient compression and reconstruction scheme for RCS measurement data was developed. The compression is done by isolating the scattering mechanisms on the target and recording their individual responses in the frequency and azimuth scans. The reconstruction, which is an inverse process of the compression, is guaranteed by the sampling theorem. Two sets of data, the corner reflectors and the F-117 fighter model, were processed, and the results were shown to be convincing. The compression ratio can be as large as several hundred, depending on the target's geometry and scattering characteristics.
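    The reconstruction guarantee comes from the sampling theorem: a band-limited response can be rebuilt exactly from sufficiently dense uniform samples. A generic Whittaker-Shannon interpolation sketch (not the authors' RCS code; the tone, rates, and times are hypothetical):

```python
import numpy as np

def sinc_reconstruct(samples, fs, t):
    """Whittaker-Shannon interpolation: rebuild a band-limited signal at
    times t from uniform samples taken at rate fs."""
    n = np.arange(len(samples))
    # Sum of shifted sinc kernels weighted by the sample values.
    return np.sum(samples[None, :] * np.sinc(fs * t[:, None] - n[None, :]),
                  axis=1)

fs = 10.0                                 # sampling rate
n = np.arange(256)
x = np.sin(2 * np.pi * 1.5 * n / fs)      # 1.5 Hz tone, below Nyquist (5 Hz)
t = np.linspace(10.0, 15.0, 101)          # interior times, away from edges
x_hat = sinc_reconstruct(x, fs, t)
err = np.max(np.abs(x_hat - np.sin(2 * np.pi * 1.5 * t)))
print(f"max interpolation error: {err:.4f}")
```

With a finite record the sinc sum is truncated, so accuracy is best well inside the sampled window, which is why the interior times are used here.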

  11. Impact of reconstructive transplantation on the future of plastic and reconstructive surgery.

    PubMed

    Siemionow, Maria

    2012-10-01

    This article summarizes the current knowledge on the new developing field of reconstructive transplantation. A brief outline of vascularized composite allografts (VCA) such as human hand, face, larynx, and abdominal wall transplants is provided. The clinical applications and indications for these new reconstructive transplantation procedures are outlined. The advantages, disadvantages, and complications and concerns surrounding clinical VCA are discussed. Finally, the impact of reconstructive transplantation on the future of plastic and reconstructive surgery is presented. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. State Requirements for Educational Facilities, 1999.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Office of Educational Facilities.

    This updated, two-volume document provides guidance for those involved in the educational facilities procurement process, and includes recent legislative changes affecting the state of Florida's building code. The first volume is organized by the sequence of steps required in the facilities procurement process and presents state requirements for…

  13. 3D reconstruction of the magnetic vector potential using model based iterative reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, K. C.; Aditya Mohan, K.; Phatak, Charudatta

    Lorentz transmission electron microscopy (LTEM) observations of magnetic nanoparticles contain information on the magnetic and electrostatic potentials. Vector field electron tomography (VFET) can be used to reconstruct electromagnetic potentials of the nanoparticles from their corresponding LTEM images. The VFET approach is based on the conventional filtered back projection approach to tomographic reconstructions, and the availability of only an incomplete set of measurements due to experimental limitations means that the reconstructed vector fields exhibit significant artifacts. In this paper, we outline a model-based iterative reconstruction (MBIR) algorithm to reconstruct the magnetic vector potential of magnetic nanoparticles. We combine a forward model for image formation in TEM experiments with a prior model to formulate the tomographic problem as a maximum a-posteriori probability (MAP) estimation problem. The MAP cost function is minimized iteratively to determine the vector potential. A comparative reconstruction study of simulated as well as experimental data sets shows that the MBIR approach yields quantifiably better reconstructions than the VFET approach.
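    The MAP estimation loop can be sketched for a toy quadratic case: gradient descent on a data-fidelity term plus a smoothness prior. The forward matrix, prior weight, and step rule below are illustrative assumptions, not the paper's TEM forward model.

```python
import numpy as np

def mbir_map(A, b, lam=0.05, iters=500):
    """Toy MAP estimate: gradient descent on ||A x - b||^2 + lam ||D x||^2,
    where A stands in for the forward model and D is a first-difference
    prior operator favouring smooth reconstructions."""
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)       # first-difference operator
    H = A.T @ A + lam * D.T @ D          # Hessian of the quadratic cost
    step = 1.0 / np.linalg.norm(H, 2)    # safe step size (1 / Lipschitz)
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + lam * D.T @ (D @ x)
        x -= step * grad
    return x

rng = np.random.default_rng(1)
x_true = np.array([0.0, 1.0, 1.0, 1.0, 0.0])
A = rng.normal(size=(12, 5))                 # stand-in forward model
b = A @ x_true + 0.01 * rng.normal(size=12)  # noisy "measurements"
x_hat = mbir_map(A, b)
print(np.round(x_hat, 2))
```

The prior term is what lets MBIR-style methods suppress the artifacts that a direct filtered-back-projection inverse leaves behind when measurements are incomplete.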

  14. Long-term functional outcome of mandibular reconstruction with stainless steel AO reconstruction plates.

    PubMed

    van Minnen, B; Nauta, J M; Vermey, A; Bos, R R M; Roodenburg, J L N

    2002-04-01

    Mandibular continuity defects are usually reconstructed with bone grafts. However, factors associated with the tumour and the patient can still be reasons to choose reconstruction plates. The aim of this study was to assess the results of mandibular reconstructions with stainless steel AO reconstruction plates after a long follow-up period. The records of 36 patients were reviewed for personal data and the history of disease, treatment, and complications. Patients with failed reconstructions were compared with those in whom the procedure had been successful. Patients and surgeons gave their opinions on the functional and cosmetic results. The mean follow-up was 39 months (range 4-99); 4 patients were withdrawn because they developed early recurrent disease, and in 17 patients the reconstruction failed. We found no significant differences between the successful and the failed groups. Fourteen patients could be evaluated for functional outcome, 10 of whom were totally or satisfactorily rehabilitated. Therefore, stainless steel reconstruction plates can be used in patients when other options are inappropriate.

  15. 3D reconstruction of the magnetic vector potential using model based iterative reconstruction

    DOE PAGES

    Prabhat, K. C.; Aditya Mohan, K.; Phatak, Charudatta; ...

    2017-07-03

    Lorentz transmission electron microscopy (LTEM) observations of magnetic nanoparticles contain information on the magnetic and electrostatic potentials. Vector field electron tomography (VFET) can be used to reconstruct electromagnetic potentials of the nanoparticles from their corresponding LTEM images. The VFET approach is based on the conventional filtered back projection approach to tomographic reconstructions, and the availability of only an incomplete set of measurements due to experimental limitations means that the reconstructed vector fields exhibit significant artifacts. In this paper, we outline a model-based iterative reconstruction (MBIR) algorithm to reconstruct the magnetic vector potential of magnetic nanoparticles. We combine a forward model for image formation in TEM experiments with a prior model to formulate the tomographic problem as a maximum a-posteriori probability (MAP) estimation problem. The MAP cost function is minimized iteratively to determine the vector potential. A comparative reconstruction study of simulated as well as experimental data sets shows that the MBIR approach yields quantifiably better reconstructions than the VFET approach.

  16. The Learning Reconstruction of Particle System and Linear Momentum Conservation in Introductory Physics Course

    NASA Astrophysics Data System (ADS)

    Karim, S.; Saepuzaman, D.; Sriyansyah, S. P.

    2016-08-01

    This study was motivated by the low achievement of prospective teachers in understanding concepts in an introductory physics course. A problem was identified: students could not develop the thinking skills required for building physics concepts. This study therefore reconstructs the learning process with an emphasis on building physics concepts, and its outcome is a set of physics lesson plans for the concepts of particle systems and linear momentum conservation. A descriptive analysis method was used to investigate the process of learning reconstruction carried out by the students, whose conceptual understanding was evaluated using essay tests on particle systems and linear momentum conservation. The result shows that the learning reconstruction successfully supported the students' understanding of the physics concepts.

  17. Spectral Reconstruction for Obtaining Virtual Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Perez, G. J. P.; Castro, E. C.

    2016-12-01

    Hyperspectral sensors have demonstrated their capabilities in identifying materials and detecting processes in a satellite scene. However, the availability of hyperspectral images is limited due to the high development cost of these sensors; currently, most readily available data come from multi-spectral instruments. Spectral reconstruction is an alternative method to address the need for hyperspectral information. The technique has been shown to provide quick and accurate detection of defects in integrated circuits, to recover damaged parts of frescoes, and to aid in converting a microscope into an imaging spectrometer. By using several spectral bands together with a spectral library, a spectrum acquired by a sensor can be expressed as a linear superposition of elementary signals. In this study, spectral reconstruction is used to estimate the spectra of different surfaces imaged by Landsat 8. Atmospherically corrected surface reflectances from four Landsat 8 bands, three visible (499 nm, 585 nm, 670 nm) and one near-infrared (872 nm), are used together with a spectral library of ground elements acquired from the United States Geological Survey (USGS). The spectral library is limited to the 420-1020 nm spectral range and is interpolated at one-nanometer resolution. Singular Value Decomposition (SVD) is used to calculate the basis spectra, which are then applied to reconstruct the spectrum. The spectral reconstruction is applied to test cases within the library consisting of vegetation communities. The technique was successful in reconstructing a hyperspectral signal with an error of less than 12% for most of the test cases. Hence, this study demonstrates the potential of simulating information at any desired wavelength, creating a virtual hyperspectral sensor without the need for additional satellite bands.
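    The pipeline, an SVD basis from a library followed by a least-squares fit of that basis to four band samples, can be sketched end to end. The library here is synthetic (smooth random cubics), not the USGS library, so the near-zero error simply reflects that the "truth" lies in a low-rank span.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical spectral library: 20 spectra on a 420-1020 nm grid at 1 nm.
wavelengths = np.arange(420, 1021)
library = np.array([np.polyval(rng.normal(size=4), (wavelengths - 720) / 300)
                    for _ in range(20)])

# Basis spectra from the SVD of the library (rows = spectra).
_, _, Vt = np.linalg.svd(library, full_matrices=False)
k = 4                      # keep as many basis vectors as we have bands
basis = Vt[:k]             # shape (k, 601)

# Landsat-8-like band centers (nm) used to constrain the fit.
bands = [499, 585, 670, 872]
idx = [int(b - 420) for b in bands]

# "Measured" reflectance: a mixture of two library spectra, seen at 4 bands.
truth = 0.6 * library[3] + 0.4 * library[11]
measured = truth[idx]

# Solve for basis coefficients from the 4 band samples, then reconstruct
# the full spectrum as a linear superposition of basis spectra.
coeffs, *_ = np.linalg.lstsq(basis[:, idx].T, measured, rcond=None)
reconstructed = coeffs @ basis

err = np.linalg.norm(reconstructed - truth) / np.linalg.norm(truth)
print(f"relative reconstruction error: {err:.3f}")
```

With a real library the target spectrum lies only approximately in the basis span, which is where the reported sub-12% errors come from.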

  18. [From Wolff law, Ilizarov technology to natural reconstruction theory].

    PubMed

    Zang, Jian-cheng; Qin, Si-He

    2013-04-01

    Wolff's law describes the adaptive principle of bone; the tension-stress principle is equivalent to distraction osteogenesis, or distraction tissue regeneration. The Natural Reconstruction theory is a new orthopaedic perspective proposed by Prof. QIN after deformity correction using the Ilizarov technology. The thought about their relationship originated from a social phenomenon: the crowds and the confusion over exit choice in Beijing's subway. The Ilizarov technology and Wolff's law are both concepts rooted in mechanics, and the former is completely in line with the latter. In other words, the Ilizarov technology is an extension of Wolff's law: a repeated process of micro-trauma and continuous repair of trabecular bone initiated by modern engineering, with trabeculae formed along the direction of tension-stress. By adjusting the mechanical force, doctors can control the processes of fracture healing and bone remodelling to a certain extent. The Natural Reconstruction theory obviously enlarges the defined range of Wolff's law: it not only guides clinical and basic orthopaedic research, but also relates to the dialectical thinking of the doctor-patient relationship in sociology. There is an inevitable connection among Wolff's law, the Ilizarov technology, and the Natural Reconstruction theory; the history of their discovery and understanding is a continuous process of thinking, practice, and integration.

  19. Yeast 5 – an expanded reconstruction of the Saccharomyces cerevisiae metabolic network

    PubMed Central

    2012-01-01

    Background Efforts to improve the computational reconstruction of the Saccharomyces cerevisiae biochemical reaction network and to refine the stoichiometrically constrained metabolic models that can be derived from such a reconstruction have continued since the first stoichiometrically constrained yeast genome scale metabolic model was published in 2003. Continuing this ongoing process, we have constructed an update to the Yeast Consensus Reconstruction, Yeast 5. The Yeast Consensus Reconstruction is a product of efforts to forge a community-based reconstruction emphasizing standards compliance and biochemical accuracy via evidence-based selection of reactions. It draws upon models published by a variety of independent research groups as well as information obtained from biochemical databases and primary literature. Results Yeast 5 refines the biochemical reactions included in the reconstruction, particularly reactions involved in sphingolipid metabolism; updates gene-reaction annotations; and emphasizes the distinction between reconstruction and stoichiometrically constrained model. Although it was not a primary goal, this update also improves the accuracy of model prediction of viability and auxotrophy phenotypes and increases the number of epistatic interactions. This update maintains an emphasis on standards compliance, unambiguous metabolite naming, and computer-readable annotations available through a structured document format. Additionally, we have developed MATLAB scripts to evaluate the model’s predictive accuracy and to demonstrate basic model applications such as simulating aerobic and anaerobic growth. These scripts, which provide an independent tool for evaluating the performance of various stoichiometrically constrained yeast metabolic models using flux balance analysis, are included as Additional files 1, 2 and 3. Conclusions Yeast 5 expands and refines the computational reconstruction of yeast metabolism and improves the predictive accuracy of a
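    A stoichiometrically constrained model of this kind is typically simulated with flux balance analysis: maximize a biomass flux subject to steady state S·v = 0 and flux bounds. A toy two-metabolite chain, assuming scipy's `linprog` (the yeast reconstruction itself is far larger):

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis. Metabolites: A, B.
# Reactions: v1 uptake -> A, v2 A -> B, v3 B -> biomass.
S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 units
c = np.array([0, 0, -1])                  # linprog minimizes, so -biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # optimal steady-state flux distribution
```

Here the optimum simply pushes the whole chain at the uptake limit; viability and auxotrophy predictions in the real model come from the same optimization over thousands of reactions.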

  20. Facile solution-processed aqueous MoOx for feasible application in organic light-emitting diode

    NASA Astrophysics Data System (ADS)

    Zheng, Qinghong; Qu, Disui; Zhang, Yan; Li, Wanshu; Xiong, Jian; Cai, Ping; Xue, Xiaogang; Liu, Liming; Wang, Honghang; Zhang, Xiaowen

    2018-05-01

    Solution-processed techniques attract increasing attention in organic electronics for their low-cost and scalable manufacturing. We demonstrate a favorable hole-injection material, solution-processed aqueous MoOx (s-MoOx), with a facile fabrication process and apply it successfully to constructing efficient organic light-emitting diodes (OLEDs). Atomic force microscopy and X-ray photoelectron spectroscopy analyses show that s-MoOx exhibits superior film morphology and non-stoichiometry with slight oxygen deficiency. With tris(8-hydroxyquinolinato)aluminium as the emitting layer, the s-MoOx-based OLED shows a maximum luminous efficiency of 7.9 cd/A and a power efficiency of 5.9 lm/W, enhancements of 43.6% and 73.5%, respectively, over the counterpart using conventional vacuum-thermal-evaporation MoOx. Current-voltage, impedance-voltage, phase-voltage, and capacitance-voltage characteristics of hole-only devices indicate that s-MoOx prepared with two spin-coating/annealing cycles shows the most enhanced hole-injection capacity, thus promoting device performance. Our experiments provide an alternative approach for constructing efficient OLEDs with solution processing.