Sample records for code scanning system

  1. QR codes: next level of social media.

    PubMed

    Gottesman, Wesley; Baum, Neil

    2013-01-01

    The QR code (short for quick response code) system was invented in Japan for the auto industry. Its purpose was to track vehicles during manufacture; it was designed to allow high-speed component scanning. Now the scanning can be easily accomplished via cell phone, making the technology useful and within reach of your patients. There are numerous applications for QR codes in the contemporary medical practice. This article describes QR codes and how they might be applied for marketing and practice management.

  2. Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software

    NASA Astrophysics Data System (ADS)

    Hellekson, Ron; Campbell, Scott

    1988-06-01

    Many optical systems have demanding requirements to package the system in a small three-dimensional space. The use of computer graphics tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra Physics grocery store bar code scanner employs an especially complex three-dimensional scan pattern to read bar code labels. By using a specially written program that interfaces with a computer aided design system, we have simulated many of the functions of this complex optical system. In this paper we will illustrate how a recent version of the scanner has been designed. We will discuss the use of computer graphics in the design process, including interactive tweaking of the scan pattern, analysis of collected light, analysis of the scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.

  3. Structured Light Based 3d Scanning for Specular Surface by the Combination of Gray Code and Phase Shifting

    NASA Astrophysics Data System (ADS)

    Zhang, Yujia; Yilmaz, Alper

    2016-06-01

    Surface reconstruction using coded structured light is considered one of the most reliable techniques for high-quality 3D scanning. With a calibrated projector-camera stereo system, a light pattern is projected onto the scene and imaged by the camera. Correspondences between projected and recovered patterns are computed in the decoding process, which is used to generate a 3D point cloud of the surface. However, indirect illumination effects on the surface, such as subsurface scattering and interreflections, raise difficulties in reconstruction. In this paper, we apply the maximum min-SW gray code to reduce the indirect illumination effects of the specular surface. We also analyze the errors of the maximum min-SW gray code compared with the conventional gray code, which shows that the maximum min-SW gray code is significantly better at reducing indirect illumination effects. To achieve sub-pixel accuracy, we also project high-frequency sinusoidal patterns onto the scene. For specular surfaces, however, the high-frequency patterns are susceptible to decoding errors, and incorrect decoding results in a loss of depth resolution. Our method resolves this problem by combining the low-frequency maximum min-SW gray code with the high-frequency phase shifting code, which achieves dense 3D reconstruction of the specular surface. Our contributions include: (i) a complete setup of the structured light based 3D scanning system; (ii) a novel combination technique of the maximum min-SW gray code and phase shifting code, in which phase shifting decoding first provides sub-pixel accuracy and the maximum min-SW gray code is then used to resolve the phase ambiguity. According to the experimental results and data analysis, our structured light based 3D scanning system enables high-quality dense reconstruction of scenes with a small number of images. Qualitative and quantitative comparisons are performed to demonstrate the advantages of our new combined coding method.
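
    A minimal Python sketch (mine, not from the paper; variable names are illustrative) of how a four-step phase-shifting decode can be disambiguated by a pre-decoded gray-code band index, which is the combination the abstract describes:

      import numpy as np

      def wrapped_phase(i0, i1, i2, i3):
          # Four-step phase shifting: patterns shifted by 0, 90, 180, 270 degrees.
          # Returns the wrapped phase in [-pi, pi) for every pixel.
          return np.arctan2(i3 - i1, i0 - i2)

      def unwrap_with_gray_code(phi, band_index, periods_per_band=1):
          # band_index: integer stripe label recovered from the gray-code patterns;
          # it removes the 2*pi ambiguity of the high-frequency sinusoid.
          return phi + 2.0 * np.pi * band_index * periods_per_band

      # Toy example: one scan line of 8 pixels with synthetic intensities.
      i0, i1, i2, i3 = (np.random.rand(8) for _ in range(4))
      band = np.arange(8) // 4        # pretend the gray code assigns pixels to bands 0 and 1
      phase = unwrap_with_gray_code(wrapped_phase(i0, i1, i2, i3), band)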

  4. InterProScan 5: genome-scale protein function classification

    PubMed Central

    Jones, Philip; Binns, David; Chang, Hsin-Yu; Fraser, Matthew; Li, Weizhong; McAnulla, Craig; McWilliam, Hamish; Maslen, John; Mitchell, Alex; Nuka, Gift; Pesseat, Sebastien; Quinn, Antony F.; Sangrador-Vegas, Amaia; Scheremetjew, Maxim; Yong, Siew-Yit; Lopez, Rodrigo; Hunter, Sarah

    2014-01-01

    Motivation: Robust large-scale sequence analysis is a major challenge in modern genomic science, where biologists are frequently trying to characterize many millions of sequences. Here, we describe a new Java-based architecture for the widely used protein function prediction software package InterProScan. Developments include improvements and additions to the outputs of the software and a complete reimplementation of the software framework, resulting in a flexible and stable system that is able to use both multiprocessor machines and conventional clusters to achieve scalable distributed data analysis. InterProScan is freely available for download from the EMBL-EBI FTP site and the open source code is hosted at Google Code. Availability and implementation: InterProScan is distributed via FTP at ftp://ftp.ebi.ac.uk/pub/software/unix/iprscan/5/ and the source code is available from http://code.google.com/p/interproscan/. Contact: http://www.ebi.ac.uk/support or interhelp@ebi.ac.uk or mitchell@ebi.ac.uk PMID:24451626

  5. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of the spiral CT scan (scan range, initial angle, rotational direction, pitch, slice thickness, etc.). Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to accurately simulate spiral CT scanning in a single simulation run. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.

  6. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation

    NASA Astrophysics Data System (ADS)

    Kim, Sangroh; Yoshizumi, Terry T.; Yin, Fang-Fang; Chetty, Indrin J.

    2013-04-01

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan—scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the ‘ISource = 8: Phase-Space Source Incident from Multiple Directions’ in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.

  7. Eye-Movements During Search for Coded and Uncoded Targets

    DTIC Science & Technology

    1974-06-28

    effective a coding system as color. Subjects did not exhibit a reliable, characteristic scan-path, except for two subjects in the uncoded condition...Psychophysics 8, 171-172, 1970. 31. Clarke, S. E. Retrieval of color information from the pre-perceptual storage system. J Exp Psychol 82, 263

  8. Scan-Line Methods in Spatial Data Systems

    DTIC Science & Technology

    1990-09-04

    algorithms in detail to show some of the implementation issues. Data Compression Storage and transmission times can be reduced by using compression ...goes through the data. Luckily, there are good one-directional compression algorithms, such as run-length coding, in which each scan line can be...independently compressed. These are the algorithms to use in a parallel scan-line system. Data compression is usually only used for long-term storage of
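
    As a concrete illustration of the run-length idea in this excerpt, a small Python sketch (not from the report) that compresses one binary scan line independently of its neighbours, which is what makes the scheme usable in a parallel scan-line system:

      def run_length_encode(scan_line):
          # Encode a scan line as (value, run) pairs.
          runs = []
          for pixel in scan_line:
              if runs and runs[-1][0] == pixel:
                  runs[-1][1] += 1
              else:
                  runs.append([pixel, 1])
          return [(value, count) for value, count in runs]

      def run_length_decode(runs):
          out = []
          for value, count in runs:
              out.extend([value] * count)
          return out

      line = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1]
      assert run_length_decode(run_length_encode(line)) == line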

  9. A new security solution to JPEG using hyper-chaotic system and modified zigzag scan coding

    NASA Astrophysics Data System (ADS)

    Ji, Xiao-yong; Bai, Sen; Guo, Yu; Guo, Hui

    2015-05-01

    Though JPEG is an excellent compression standard for images, it does not provide any security. Thus, a security solution for JPEG was proposed in Zhang et al. (2014). However, there are some flaws in Zhang's scheme, and in this paper we propose a new scheme based on a discrete hyper-chaotic system and modified zigzag scan coding. By shuffling the identifiers of the zigzag scan encoded sequence with a hyper-chaotic sequence, and by selectively encrypting those coefficients in the zigzag scan encoded domain that have little influence on the correlation of the plain image, we achieve high compression performance and robust security simultaneously. Meanwhile we present and analyze the flaws in Zhang's scheme through theoretical analysis and experimental verification, and give comparisons between our scheme and Zhang's. Simulation results verify that our method has better performance in security and efficiency.
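
    For orientation only, a small Python sketch of the two building blocks named above: a zigzag scan of an 8x8 coefficient block and a key-driven permutation of the scanned sequence (a plain logistic map stands in here for the paper's hyper-chaotic system):

      import numpy as np

      def zigzag_indices(n=8):
          # JPEG-style zigzag order: walk the anti-diagonals, alternating direction.
          return sorted(((i, j) for i in range(n) for j in range(n)),
                        key=lambda p: (p[0] + p[1],
                                       p[1] if (p[0] + p[1]) % 2 == 0 else p[0]))

      def chaotic_permutation(length, x0=0.37, r=3.99):
          # Logistic-map key stream used to rank positions (hyper-chaotic stand-in).
          x, keys = x0, []
          for _ in range(length):
              x = r * x * (1.0 - x)
              keys.append(x)
          return np.argsort(keys)

      block = np.arange(64).reshape(8, 8)      # stand-in for quantized DCT coefficients
      zigzag = np.array([block[i, j] for i, j in zigzag_indices()])
      shuffled = zigzag[chaotic_permutation(len(zigzag))]   # shuffled (encrypted) sequence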

  10. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP, as well as the routines, COMMON blocks, and files used by SAP, are described. The system generation procedure for SAP is also presented.

  11. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    PubMed

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.

  12. Laser Scanning Reader For Automated Data Entry Operations

    NASA Astrophysics Data System (ADS)

    Cheng, Charles C. K.

    1980-02-01

    The use of the Universal Product Code (UPC) in conjunction with the laser-scanner-equipped electronic checkout system has made it technologically possible for supermarket stores to operate more efficiently and accurately. At present, more than 90% of the packages in grocery stores have been marked by the manufacturer with laser-scannable UPC symbols and the installation of laser scanning systems is expected to expand into all major chain stores. Areas to be discussed are: system design features, laser-scanning pattern generation, signal-processing logical considerations, UPC characteristics and encodation.
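
    As a small worked example of UPC encodation (mine, not the paper's): the twelfth digit of a UPC-A symbol is a check digit computed from the first eleven, one of the properties the scanner's signal-processing logic can verify after a read.

      def upc_a_check_digit(digits11):
          # digits11: the first 11 digits of a UPC-A code, as a string.
          # Digits in odd positions (1st, 3rd, ...) are weighted 3, even positions 1.
          odd = sum(int(d) for d in digits11[0::2])
          even = sum(int(d) for d in digits11[1::2])
          return (10 - (3 * odd + even) % 10) % 10

      assert upc_a_check_digit("03600029145") == 2   # full symbol: 036000291452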

  13. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to provide significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, previously examined for uniform scanning proton beams, needs to be evaluated for spot scanning; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm3, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm3 voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and minimization of computational time. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a single gold standard for setting computational parameters cannot be defined for all proton therapy applications, since the impact of the parameter settings depends on the proton irradiation technique. We therefore conclude that parameters must be customized with reference to the optimized parameters of the corresponding irradiation technique in order to achieve artifact-free MC simulation for use in computational experiments and clinical treatments.
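
    A rough Python sketch (my own, not from the paper) of the kind of comparison described: extract a distal range metric (here R80, the depth where the PDD falls to 80% of its maximum beyond the peak) from each code's depth-dose curve and compare it with the FLUKA reference.

      import numpy as np

      def r80(depth_mm, pdd):
          # Distal 80% range: interpolate where the normalized curve crosses 0.8
          # on the falling side of the peak.
          pdd = np.asarray(pdd, dtype=float) / np.max(pdd)
          peak = int(np.argmax(pdd))
          distal_d, distal_p = depth_mm[peak:], pdd[peak:]
          # np.interp needs increasing x, so interpolate depth against the falling dose reversed.
          return float(np.interp(0.8, distal_p[::-1], distal_d[::-1]))

      depth = np.linspace(0, 300, 601)                   # mm
      reference = np.exp(-((depth - 150) / 12.0) ** 2)   # toy Bragg-peak-like curve (FLUKA stand-in)
      candidate = np.exp(-((depth - 151) / 12.0) ** 2)   # toy curve for one GATE/PHITS parameter set
      print("range difference (mm):", r80(depth, candidate) - r80(depth, reference))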

  14. 77 FR 16158 - Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-20

    ... ``cut'' from a sheet or roll of labels--is used. Persistent problems with drug product mislabeling and... believe that development and use of advanced code scanning equipment has made many current electronic... and other advanced scanning techniques have made current electronic systems reliable to the 100...

  15. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  16. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  17. 40 CFR 86.1806-04 - On-board diagnostics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... codes shall be consistent with SAE J2012 “Diagnostic Trouble Code Definitions—Equivalent to ISO/DIS... sent to the scan tool over a J1850 data link shall use the Cyclic Redundancy Check and the three byte..., definitions and abbreviations shall be formatted according to SAE J1930 “Electrical/Electronic Systems...

  18. Relationships between Translation and Transcription Processes during fMRI Connectivity Scanning and Coded Translation and Transcription in Writing Products after Scanning in Children with and without Transcription Disabilities

    PubMed Central

    Wallis, Peter; Richards, Todd; Boord, Peter; Abbott, Robert; Berninger, Virginia

    2018-01-01

    Students with transcription disabilities (dysgraphia/impaired handwriting, n = 13, or dyslexia/impaired word spelling, n = 16) or without transcription disabilities (controls) completed transcription and translation (idea generating, planning, and creating) writing tasks during fMRI connectivity scanning and compositions after scanning, which were coded for transcription and translation variables. Compositions in both groups showed diversity in genre beyond the usual narrative-expository distinction; the groups differed in coded transcription but not translation variables. For the control group, specific transcription or translation tasks during scanning correlated with the corresponding coded transcription or translation skills in composition, but connectivity during scanning was not correlated with coded handwriting during composing in the dysgraphia group, and connectivity during translating was not correlated with any coded variable during composing in the dyslexia group. Results are discussed in reference to the trend in neuroscience to use connectivity from relevant seed points while performing tasks and trends in education to recognize the generativity (creativity) of composing at both the genre and syntax levels. PMID:29600113

  19. Real-time chirp-coded imaging with a programmable ultrasound biomicroscope.

    PubMed

    Bosisio, Mattéo R; Hasquenoph, Jean-Michel; Sandrin, Laurent; Laugier, Pascal; Bridal, S Lori; Yon, Sylvain

    2010-03-01

    Ultrasound biomicroscopy (UBM) of mice can provide a testing ground for new imaging strategies. The UBM system presented in this paper facilitates the development of imaging and measurement methods with programmable design, arbitrary waveform coding, broad bandwidth (2-80 MHz), digital filtering, programmable processing, RF data acquisition, multithread/multicore real-time display, and rapid mechanical scanning (

  20. 19 CFR 142.46 - Presentation of invoice and assignment of entry number.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...

  1. 19 CFR 142.46 - Presentation of invoice and assignment of entry number.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...

  2. 19 CFR 142.46 - Presentation of invoice and assignment of entry number.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...

  3. 19 CFR 142.46 - Presentation of invoice and assignment of entry number.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...

  4. 19 CFR 142.46 - Presentation of invoice and assignment of entry number.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... transportation, the appropriate manifest document. (b) Verification of data. If after scanning the bar code at the Line Release site, the Customs officer verifies the data on the bar code with the information on... assigned to the transaction. If there are any differences between the system data and the invoice and bar...

  5. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    PubMed

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploiting the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little computational increase, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
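
    To make the scan-order idea concrete, a toy Python sketch (not the authors' algorithm; the template entries are invented for illustration): residual coefficients of a 4x4 block are read out in a mode-dependent order so that positions most likely to be zero for that intra mode are emitted last, which suits run-based entropy coding.

      import numpy as np

      # Hypothetical per-mode scan templates: each lists the 16 positions of a 4x4
      # residual block, most-likely-nonzero first.
      SCAN_TEMPLATES = {
          "vertical":   [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15],
          "horizontal": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15],
      }

      def scan_residual(block4x4, intra_mode):
          # Flatten the block following the template chosen by the prediction mode.
          flat = np.asarray(block4x4).reshape(-1)
          return flat[SCAN_TEMPLATES[intra_mode]]

      block = np.array([[3, 0, 0, 0],
                        [2, 0, 0, 0],
                        [1, 0, 0, 0],
                        [1, 0, 0, 0]])
      print(scan_residual(block, "vertical"))   # nonzeros first, zeros grouped at the end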

  6. The development of efficient coding for an electronic mail system

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1983-01-01

    Techniques for efficiently representing scanned electronic documents were investigated. Major results include the definition and preliminary performance results of a Universal System for Efficient Electronic Mail (USEEM), offering a potential order of magnitude improvement over standard facsimile techniques for representing textual material.

  7. SCANS (Shipping Cask ANalysis System) a microcomputer-based analysis system for shipping cask design review: User's manual to Version 3a. Volume 1, Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mok, G.C.; Thomas, G.R.; Gerhard, M.A.

    SCANS (Shipping Cask ANalysis System) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent fuel shipping casks. SCANS is an easy-to-use system that calculates the global response to impact loads, pressure loads and thermal conditions, providing reviewers with an independent check on analyses submitted by licensees. SCANS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests. Analysis options are based on regulatory cases described in the Code of Federal Regulations 10 CFR 71 and Regulatory Guides published by the US Nuclear Regulatory Commission in 1977 and 1978.

  8. MO-F-CAMPUS-I-04: Characterization of Fan Beam Coded Aperture Coherent Scatter Spectral Imaging Methods for Differentiation of Normal and Neoplastic Breast Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Albanese, K; Lakshmanan, M

    Purpose: This study intends to characterize the spectral and spatial resolution limits of various fan beam geometries for differentiation of normal and neoplastic breast structures via coded aperture coherent scatter spectral imaging techniques. In previous studies, pencil beam raster scanning methods using coherent scatter computed tomography and selected volume tomography have yielded excellent results for tumor discrimination. However, these methods don't readily conform to clinical constraints, primarily prolonged scan times and excessive dose to the patient. Here, we refine a fan beam coded aperture coherent scatter imaging system to characterize the tradeoffs between dose, scan time and image quality for breast tumor discrimination. Methods: An X-ray tube (125 kVp, 400 mAs) illuminated the sample with collimated fan beams of varying widths (3 mm to 25 mm). Scatter data was collected via two linear-array energy-sensitive detectors oriented parallel and perpendicular to the beam plane. An iterative reconstruction algorithm yields images of the sample's spatial distribution and respective spectral data for each location. To model in-vivo tumor analysis, surgically resected breast tumor samples were used in conjunction with lard, which has a form factor comparable to adipose (fat). Results: Quantitative analysis with the current setup geometry indicated optimal performance for beams up to 10 mm wide, with wider beams producing poorer spatial resolution. Scan time for a fixed volume was reduced by a factor of 6 when scanned with a 10 mm fan beam compared to a 1.5 mm pencil beam. Conclusion: The study demonstrates that fan beam coherent scatter spectral imaging for differentiation of normal and neoplastic breast tissues successfully reduces dose and scan times while sufficiently preserving spectral and spatial resolution. Future work to alter the coded aperture and detector geometries could potentially allow the use of even wider fans, thereby making coded aperture coherent scatter imaging a clinically viable method for breast cancer detection. United States Department of Homeland Security; Duke University Medical Center - Department of Radiology; Carl E Ravin Advanced Imaging Laboratories; Duke University Medical Physics Graduate Program.

  9. Patient safety with blood products administration using wireless and bar-code technology.

    PubMed

    Porcella, Aleta; Walker, Kristy

    2005-01-01

    Supported by a grant from the Agency for Healthcare Research and Quality, a University of Iowa Hospitals and Clinics interdisciplinary research team created an online data-capture-response tool utilizing wireless mobile devices and bar code technology to track and improve the blood products administration process. The tool captures 1) sample collection, 2) sample arrival in the blood bank, 3) blood product dispense from the blood bank, and 4) administration. At each step, the scanned patient wristband ID bar code is automatically compared to the scanned identification bar code on the requisition, sample, and/or product, and the system presents either a confirmation or an error message to the user. Following an eight-month, 5-unit, staged pilot, a 'big bang,' hospital-wide implementation occurred on February 7, 2005. Preliminary results from pilot data indicate that the new bar code process captures errors 3 to 10 times better than the old manual process.
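
    A minimal sketch (illustrative only; the identifiers and messages are invented, not taken from the Iowa system) of the comparison step described above: the scanned wristband ID is checked against the ID scanned from the requisition, sample, or product before the step proceeds.

      def verify_scan(wristband_id: str, item_id: str, step: str) -> str:
          # Compare the patient wristband barcode with the barcode scanned from the
          # requisition, sample, or blood product at this step of the workflow.
          if wristband_id.strip() == item_id.strip():
              return f"CONFIRMED: {step} for patient {wristband_id}"
          return f"ERROR: {step} mismatch (wristband {wristband_id}, item {item_id})"

      print(verify_scan("P-004217", "P-004217", "blood product administration"))
      print(verify_scan("P-004217", "P-004218", "sample collection"))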

  10. Cotton phenotyping with lidar from a track-mounted platform

    NASA Astrophysics Data System (ADS)

    French, Andrew N.; Gore, Michael A.; Thompson, Alison

    2016-05-01

    High-Throughput Phenotyping (HTP) is a discipline for rapidly identifying plant architectural and physiological responses to environmental factors such as heat and water stress. Experiments conducted since 2010 at Maricopa, Arizona with a three-fold sensor group, including thermal infrared radiometers, active visible/near infrared reflectance sensors, and acoustic plant height sensors, have shown the validity of HTP with a tractor-based system. However, results from these experiments also show that the accuracy of plant phenotyping is limited by the system's inability to discriminate plant components and their local environmental conditions. This limitation may be overcome with plant imaging and laser scanning, which can help map details in plant architecture and sunlit/shaded leaves. To test the capability for mapping cotton plants with a laser system, a track-mounted platform consisting of a scanning LIDAR driven by Arduino-controlled stepper motors was deployed in 2015 over a full-canopy and defoliated cotton crop. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at 0.1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). These tests showed that an autonomous LIDAR platform can reduce HTP logistical problems and provide the capability to accurately map cotton plants and cotton bolls. A prototype track-mounted platform was developed to test the use of LIDAR scanning for High-Throughput Phenotyping (HTP). The platform was deployed in 2015 at Maricopa, Arizona over a senescent cotton crop. Using custom Python and Tkinter code, the platform moved autonomously along a pipe-track at <1 m/s while collecting LIDAR scans at 25 Hz (0.1667 deg. beam). Scanning data mapped the canopy heights and widths, and detected cotton bolls.

  11. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.

  12. The design of the CMOS wireless bar code scanner applying optical system based on ZigBee

    NASA Astrophysics Data System (ADS)

    Chen, Yuelin; Peng, Jian

    2008-03-01

    The range of a traditional bar code scanner is constrained by the length of its data cable, while the wireless bar code scanners currently on the market generally reach only 30 m to 100 m. By rebuilding a traditional CCD optical bar code scanner, a CMOS code scanner based on ZigBee is designed to meet market demands. The scan system consists of a CMOS image sensor and the embedded chip S3C2401X. When a two-dimensional bar code is read, inaccurate or incorrect decoding can result from image soiling, interference, poor imaging conditions, signal noise, or unstable system voltage, so we put forward a method that uses matrix evaluation and Reed-Solomon error correction to address these problems. In order to construct the whole wireless optical bar code system and to ensure its ability to transmit bar code image signals digitally over long distances, ZigBee is used to transmit data to the base station; this module is designed around the image acquisition system, and the circuit diagram of the CC2430 wireless transmitting/receiving module is established. By porting embedded Linux to the MCU, a practical wireless CMOS optical bar code scanner and multi-task system is constructed. Finally, communication performance is tested with the SmartRF evaluation software. In open space, every ZigBee node can achieve reliable transmission over 50 m, and adding more ZigBee nodes extends the transmission distance to several thousand meters.

  13. An introduction to QR Codes: linking libraries and mobile patrons.

    PubMed

    Hoy, Matthew B

    2011-01-01

    QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided.

  14. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR GLOBAL CODING FOR SCANNED FORMS (UA-D-31.1)

    EPA Science Inventory

    The purpose of this SOP is to define the strategy for the Global Coding of Scanned Forms. This procedure applies to the Arizona NHEXAS project and the "Border" study. Keywords: Coding; scannable forms.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal interag...

  15. Code-modulated interferometric imaging system using phased arrays

    NASA Astrophysics Data System (ADS)

    Chauhan, Vikas; Greene, Kevin; Floyd, Brian

    2016-05-01

    Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promises to be extremely low cost. In this work, we present techniques which can allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power-combine incoming signals prior to digitization, orthogonal code-modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
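
    A toy numerical sketch (mine, with idealized constant signals) of the code-modulation idea: each antenna signal is multiplied by an orthogonal Walsh code, the coded signals are summed into one shared path, and the correlation (visibility) of a pair is recovered by demodulating the squared sum with the product code of that pair.

      import numpy as np

      def walsh_codes(order):
          # Hadamard construction: rows are mutually orthogonal +/-1 codes, and the
          # elementwise product of any two rows is again a row of the matrix.
          h = np.array([[1]])
          for _ in range(order):
              h = np.block([[h, h], [h, -h]])
          return h

      codes = walsh_codes(3)             # 8 codes of 8 chips
      x = np.array([0.7, -1.3])          # two antenna baseband samples, constant over the code period
      c1, c2 = codes[1], codes[2]        # skip the all-ones row

      combined = c1 * x[0] + c2 * x[1]   # single shared hardware path (code-modulated sum)
      power = combined ** 2              # square-law detection after digitization

      # Demultiplex the cross term with the product code c1*c2 (orthogonal to both c1 and c2).
      visibility = np.mean(power * (c1 * c2)) / 2.0
      print(visibility, "vs direct product", x[0] * x[1])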

  16. Three-dimensional integral imaging displays using a quick-response encoded elemental image array: an overview

    NASA Astrophysics Data System (ADS)

    Markman, A.; Javidi, B.

    2016-06-01

    Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning due to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of the scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The scanned information is decompressed and decrypted and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
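
    A compact numpy sketch (my own, purely digital and idealized) of the double-random-phase encryption step mentioned above: the data are multiplied by a random phase mask in the input plane and a second mask in the Fourier plane, and decryption reverses both steps with the conjugate masks.

      import numpy as np

      rng = np.random.default_rng(seed=7)

      def drpe_encrypt(img, mask1, mask2):
          # 4f-system model: input-plane mask, FFT, Fourier-plane mask, inverse FFT.
          return np.fft.ifft2(np.fft.fft2(img * mask1) * mask2)

      def drpe_decrypt(cipher, mask1, mask2):
          # Undo the Fourier-plane mask, then the input-plane mask.
          return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(mask2)) * np.conj(mask1)

      img = rng.random((64, 64))                          # stand-in for a compressed elemental image
      m1 = np.exp(2j * np.pi * rng.random(img.shape))     # random phase mask, input plane
      m2 = np.exp(2j * np.pi * rng.random(img.shape))     # random phase mask, Fourier plane
      cipher = drpe_encrypt(img, m1, m2)
      assert np.allclose(drpe_decrypt(cipher, m1, m2).real, img)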

  17. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    PubMed

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model better describes the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  18. Electronic Fingerprinting for Industry

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Veritec's VeriSystem is a complete identification and tracking system for component traceability, improved manufacturing and processing, and automated shop floor applications. The system includes the Vericode Symbol, a more accurate and versatile alternative to the traditional bar code, that is scanned by charge coupled device (CCD) cameras. The system was developed by Veritec, Rockwell International and Marshall Space Flight Center to identify and track Space Shuttle parts.

  19. Enhancing Chemical Inventory Management in Laboratory through a Mobile-Based QR Code Tag

    NASA Astrophysics Data System (ADS)

    Shukran, M. A. M.; Ishak, M. S.; Abdullah, M. N.

    2017-08-01

    The demand for an inventory management system that can provide a wealth of useful information from a single scan has made laboratory inventory management using barcode technology increasingly inadequate. Since barcode technology cannot overcome this problem and is not capable of providing the information needed to manage the chemicals in the laboratory, employing QR code technology is the best solution. In this research, the main idea is to develop a standalone application running with its own database that is periodically synchronized with the inventory software hosted by the computer and connected to a specialized network as well. The first process required to establish this centralized system is to determine all inventory available in the chemical laboratory by referring to the documented data in order to develop the database. Several customizations and enhancements were made to the open source QR code technology to ensure the developed application is dedicated to its main purposes. At the end of the research, it was proven that the system is able to track the position of all inventory and to show real-time information about the scanned chemical labels. This paper intends to give an overview of the QR tag inventory system that was developed and its implementation at the National Defence University of Malaysia’s (NDUM) chemical laboratory.

  20. STEMsalabim: A high-performance computing cluster friendly code for scanning transmission electron microscopy image simulations of thin specimens.

    PubMed

    Oelerich, Jan Oliver; Duschek, Lennart; Belz, Jürgen; Beyer, Andreas; Baranovskii, Sergei D; Volz, Kerstin

    2017-06-01

    We present a new multislice code for the computer simulation of scanning transmission electron microscope (STEM) images based on the frozen lattice approximation. Unlike existing software packages, the code is optimized to perform well on highly parallelized computing clusters, combining distributed and shared memory architectures. This enables efficient calculation of large lateral scanning areas of the specimen within the frozen lattice approximation and fine-grained sweeps of parameter space. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi

    2014-12-01

    Based on a phase retrieval algorithm and the QR code, a new optical encryption technique that only needs to record one intensity distribution is proposed. In this encryption process, the QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technique is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
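
    A generic error-reduction phase-retrieval sketch in Python (my own simplification: a single Fourier transform stands in for the paper's 4-f double-random-phase setup, and the support mask marks where the QR code is known to lie):

      import numpy as np

      def retrieve_input(intensity, support, n_iter=200, seed=0):
          # Recover an input-plane object from one recorded output-plane intensity,
          # enforcing the known support (the QR code footprint) on every pass.
          rng = np.random.default_rng(seed)
          amplitude = np.sqrt(intensity)
          estimate = rng.random(intensity.shape)                # random starting guess
          for _ in range(n_iter):
              field = np.fft.fft2(estimate)                     # propagate to the output plane
              field = amplitude * np.exp(1j * np.angle(field))  # impose the measured magnitude
              estimate = np.fft.ifft2(field).real               # back to the input plane
              estimate *= support                               # support constraint
              estimate = np.clip(estimate, 0.0, 1.0)            # QR modules are binary-valued
          return estimate

      qr = (np.random.default_rng(1).random((16, 16)) > 0.5).astype(float)  # toy "QR code"
      obj = np.zeros((32, 32)); obj[8:24, 8:24] = qr            # code embedded in a larger input plane
      support = np.zeros((32, 32)); support[8:24, 8:24] = 1.0   # a priori footprint
      measured = np.abs(np.fft.fft2(obj)) ** 2                  # the single recorded intensity
      recovered = retrieve_input(measured, support)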

  2. Emergency medicine summary code for reporting CT scan results: implementation and survey results.

    PubMed

    Lam, Joanne; Coughlin, Ryan; Buhl, Luce; Herbst, Meghan; Herbst, Timothy; Martillotti, Jared; Coughlin, Bret

    2018-06-01

    The purpose of the study was to assess emergency department (ED) providers' interest in and satisfaction with ED CT result reporting before and after the implementation of a standardized summary code for all CT scan reporting. A summary code was provided at the end of all CTs ordered through the ED from August to October of 2016. A retrospective review was completed on all studies performed during this period. Pre- and post-implementation surveys were given to both ED and radiology providers. A total of 3980 CT scans, excluding CTAs, were ordered, with 2240 CTs dedicated to the head and neck, 1685 CTs dedicated to the torso, and 55 CTs dedicated to the extremities. Approximately 74% of the CT scans were contrast enhanced. Of the 3980 ED CT examinations ordered, 69% had a summary code assigned. Fifteen percent of the coded CTs had a critical or diagnostic positive result. The introduction of an ED CT summary code did not show a definitive improvement in communication. However, the ED providers are in consensus that radiology reports are crucial to their patients' management. There was slightly increased satisfaction with the ED CT codes among providers with less than 5 years of experience compared to more seasoned providers. The implementation of a user-friendly summary code may allow better analysis of results, practice improvement, and quality measurements in the future.

  3. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR GLOBAL CODING FOR SCANNED FORMS (UA-D-31.1)

    EPA Science Inventory

    The purpose of this SOP is to define the strategy for the global coding of scanned forms. This procedure applies to the Arizona NHEXAS project and the Border study. Keywords: Coding; scannable forms.

    The U.S.-Mexico Border Program is sponsored by the Environmental Health Workg...

  4. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, C.R.; Emery, K.A.

    1984-05-29

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphical plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.
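
    A schematic Python sketch of the measurement loop the patent describes (the hardware calls are placeholders I invented; a real system would use the motion-controller and ammeter drivers of the instrument):

      import numpy as np

      def raster_scan(n_rows, n_cols, move_to, read_current):
          # Step the translation platform through an X-Y raster and record the
          # cell's photocurrent response at every irradiated spot (pixel).
          response = np.zeros((n_rows, n_cols))
          for row in range(n_rows):
              for col in range(n_cols):
                  move_to(row, col)                    # placeholder: command the stepper motors
                  response[row, col] = read_current()  # placeholder: sample the cell current
          return response                              # 2D map ready for a color-coded plot

      # Stub hardware so the sketch runs standalone:
      current_map = raster_scan(4, 4, move_to=lambda r, c: None,
                                read_current=lambda: np.random.rand())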

  5. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, Carl R.; Emery, Keith A.

    1987-01-01

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphic plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.

  6. Accuracy assessment and characterization of x-ray coded aperture coherent scatter spectral imaging for breast cancer classification

    PubMed Central

    Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2017-01-01

    Although transmission-based x-ray imaging is the most commonly used imaging approach for breast cancer detection, it exhibits false negative rates higher than 15%. To improve cancer detection accuracy, x-ray coherent scatter computed tomography (CSCT) has been explored to potentially detect cancer with greater consistency. However, the 10-min scan duration of CSCT limits its possible clinical applications. The coded aperture coherent scatter spectral imaging (CACSSI) technique has been shown to reduce scan time through enabling single-angle imaging while providing high detection accuracy. Here, we use Monte Carlo simulations to test analytical optimization studies of the CACSSI technique, specifically for detecting cancer in ex vivo breast samples. An anthropomorphic breast tissue phantom was modeled, a CACSSI imaging system was virtually simulated to image the phantom, a diagnostic voxel classification algorithm was applied to all reconstructed voxels in the phantom, and receiver operating characteristic (ROC) analysis of the voxel classification was used to evaluate and characterize the imaging system for a range of parameters that had been optimized in a prior analytical study. The results indicate that CACSSI is able to identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) in tissue samples with a cancerous voxel identification area under the curve of 0.94 in a scan lasting less than 10 s per slice. These results show that coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue within ex vivo samples. Furthermore, the results indicate potential CACSSI imaging system configurations for implementation in subsequent imaging development studies. PMID:28331884

  7. Optical noise-free image encryption based on quick response code and high dimension chaotic system in gyrator transform domain

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Xu, Minjie; Tian, Ailing

    2017-04-01

    A novel optical image encryption scheme is proposed based on a quick response code and a high-dimension chaotic system, where only the intensity distribution of the encoded information is recorded as the ciphertext. Initially, the quick response code is generated from the plain image and placed in the input plane of the double random phase encoding architecture. Then, the code is encrypted to a ciphertext with noise-like distribution by using two cascaded gyrator transforms. In the process of encryption, parameters such as rotation angles and random phase masks are generated as interim variables and functions based on the Chen system. A new phase retrieval algorithm is designed to reconstruct the initial quick response code in the process of decryption, in which a priori information such as the three position detection patterns is used as the support constraint. The original image can be obtained without any energy loss by scanning the decrypted code with mobile devices. The ciphertext image is a real-valued function, which is more convenient for storage and transmission. Meanwhile, the security of the proposed scheme is enhanced greatly due to the high sensitivity of the initial values of the Chen system. Extensive cryptanalysis and simulation have been performed to demonstrate the feasibility and effectiveness of the proposed scheme.

  8. Coded aperture coherent scatter spectral imaging for assessment of breast cancers: an ex-vivo demonstration

    NASA Astrophysics Data System (ADS)

    Spencer, James R.; Carter, Joshua E.; Leung, Crystal K.; McCall, Shannon J.; Greenberg, Joel A.; Kapadia, Anuj J.

    2017-03-01

    A Coded Aperture Coherent Scatter Spectral Imaging (CACSSI) system was developed in our group to differentiate cancer and healthy tissue in the breast. The utility of the experimental system was previously demonstrated using anthropomorphic breast phantoms and breast biopsy specimens. Here we demonstrate CACSSI utility in identifying tumor margins in real time using breast lumpectomy specimens. Fresh lumpectomy specimens were obtained from Surgical Pathology with the suspected cancerous area designated on the specimen. The specimens were scanned using CACSSI to obtain spectral scatter signatures at multiple locations within the tumor and surrounding tissue. The spectral reconstructions were matched with literature form-factors to classify the tissue as cancerous or non-cancerous. The findings were then compared against pathology reports to confirm the presence and location of the tumor. The system was found to be capable of consistently differentiating cancerous and healthy regions in the breast with spatial resolution of 5 mm. Tissue classification results from the scanned specimens could be correlated with pathology results. We now aim to develop CACSSI as a clinical imaging tool to aid breast cancer assessment and other diagnostic purposes.

  9. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  10. Quick Response codes for surgical safety: a prospective pilot study.

    PubMed

    Dixon, Jennifer L; Smythe, William Roy; Momsen, Lara S; Jupiter, Daniel; Papaconstantinou, Harry T

    2013-09-01

    Surgical safety programs have been shown to reduce patient harm; however, there is variable compliance. The purpose of this study is to determine if innovative technology such as Quick Response (QR) codes can facilitate surgical safety initiatives. We prospectively evaluated the use of QR codes during the surgical time-out for 40 operations. Feasibility and accuracy were assessed. Perceptions of the current time-out process and the QR code application were evaluated through surveys using a 5-point Likert scale and binomial yes or no questions. At baseline (n = 53), survey results from the surgical team agreed or strongly agreed that the current time-out process was efficient (64%), easy to use (77%), and provided clear information (89%). However, 65% of surgeons felt that process improvements were needed. Thirty-seven of 40 (92.5%) QR codes scanned successfully, of which 100% were accurate. Three scan failures resulted from excessive curvature or wrinkling of the QR code label on the body. Follow-up survey results (n = 33) showed that the surgical team agreed or strongly agreed that the QR program was clearer (70%), easier to use (57%), and more accurate (84%). Seventy-four percent preferred the QR system to the current time-out process. QR codes accurately transmit patient information during the time-out procedure and are preferred to the current process by surgical team members. The novel application of this technology may improve compliance, accuracy, and outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software-defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs for evolving global space systems.

  12. Potential digitization/compression techniques for Shuttle video

    NASA Technical Reports Server (NTRS)

    Habibi, A.; Batson, B. H.

    1978-01-01

    The Space Shuttle initially will be using a field-sequential color television system but it is possible that an NTSC color TV system may be used for future missions. In addition to downlink color TV transmission via analog FM links, the Shuttle will use a high resolution slow-scan monochrome system for uplink transmission of text and graphics information. This paper discusses the characteristics of the Shuttle video systems, and evaluates digitization and/or bandwidth compression techniques for the various links. The more attractive techniques for the downlink video are based on a two-dimensional DPCM encoder that utilizes temporal and spectral as well as the spatial correlation of the color TV imagery. An appropriate technique for distortion-free coding of the uplink system utilizes two-dimensional HCK codes.
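
    The prediction idea behind the DPCM-based downlink coder mentioned above can be shown with a toy example: each sample is predicted from the previously coded sample and only the residual is transmitted. This is a minimal lossless 1-D sketch of the general DPCM principle, not the Shuttle encoder itself.

```python
# Toy previous-sample DPCM: transmit prediction residuals, reconstruct by accumulation.
import numpy as np

def dpcm_encode(row):
    residuals = np.empty_like(row)
    prev = 0
    for i, x in enumerate(row):
        residuals[i] = x - prev   # transmit the prediction error
        prev = x                  # lossless case: predictor sees the true sample
    return residuals

def dpcm_decode(residuals):
    return np.cumsum(residuals)   # invert the previous-sample predictor

row = np.array([100, 102, 101, 105, 110, 110, 108])
assert np.array_equal(dpcm_decode(dpcm_encode(row)), row)
```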

  13. Three-dimensional dynamic deformation monitoring using a laser-scanning system

    NASA Astrophysics Data System (ADS)

    Al-Hanbali, Nedal N.; Teskey, William F.

    1994-10-01

Non-contact dynamic deformation monitoring (e.g. with a laser scanning system) is very useful in monitoring changes in alignment and changes in size and shape of coupled operating machines. If relative movements between coupled operating machines are large, excessive wear in the machines or unplanned shutdowns due to machinery failure will occur. The purpose of non-contact dynamic deformation monitoring is to identify the causes of large movements and point to remedial action that can be taken to prevent them. The laser scanning system is a laser-based 3D vision system. The technique is based on an auto-synchronized triangulation scanning scheme. The system provides accurate, fast, and reliable 3D measurements and can measure objects from 0.5 m to 100 m away with a field of view of 40° × 50°. The system is flexible in terms of providing control over the scanned area and depth. The system also provides the user with the intensity image in addition to the depth-coded image. This paper reports on the preliminary testing of this system to monitor surface movements and target (point) movements. The monitoring resolution achieved for an operating motorized alignment test rig in the lab was 1 mm for surface movements and 0.50 m for target movements. Raw data manipulation, local calibration, and the method of relating measurements to control points will be discussed. Possibilities for improving the resolution and recommendations for future development will also be presented.

  14. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.
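
    The core ACM decision described above can be sketched as a simple threshold lookup: given a measured link quality, choose the highest-throughput modulation and coding pair whose required signal-to-noise ratio is still cleared. The MODCOD names, thresholds, and efficiencies below are hypothetical placeholders, not the values used in the SCaN Testbed experiment or the DVB-S2 specification.

```python
# Illustrative adaptive coding and modulation (ACM) selection.
# Threshold and efficiency numbers are placeholders for demonstration only.
MODCODS = [
    # (name, required Es/N0 in dB, spectral efficiency in bits/s/Hz)
    ("QPSK 1/2",   1.0, 1.0),
    ("QPSK 3/4",   4.0, 1.5),
    ("8PSK 2/3",   6.5, 2.0),
    ("8PSK 5/6",   9.5, 2.5),
    ("16APSK 3/4", 11.0, 3.0),
]

def select_modcod(measured_esn0_db, margin_db=1.0):
    """Return the highest-throughput MODCOD that still closes the link with margin."""
    usable = [m for m in MODCODS if measured_esn0_db - margin_db >= m[1]]
    return max(usable, key=lambda m: m[2]) if usable else MODCODS[0]

print(select_modcod(7.2))   # -> ('QPSK 3/4', 4.0, 1.5)
```

    In the real system, fades during station passes make the measured link quality a moving target, which is why the paper also considers predictive and learning approaches rather than a purely reactive lookup.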

  16. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    NASA Astrophysics Data System (ADS)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the RIES source code showed a significant lack of security awareness among the programmers which, among other things, appears to have left RIES vulnerable to near-trivial attacks. If it had not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and that these aspects need to be examined much sooner than right before an election.

  17. Seeing the Invisible: Embedding Tests in Code That Cannot be Modified

    NASA Technical Reports Server (NTRS)

    O'Malley, Owen; Mansouri-Samani, Masoud; Mehlitz, Peter; Penix, John

    2005-01-01

Characterizing and observing valid software behavior during testing can be very difficult in flight systems. To address this issue, we evaluated several approaches to increasing test observability on the Shuttle Abort Flight Management (SAFM) system. To increase test observability, we added probes into the running system to evaluate the internal state and analyze test data. To minimize the impact of the instrumentation and reduce manual effort, we used Aspect-Oriented Programming (AOP) tools to instrument the source code. We developed and elicited a spectrum of properties, from generic to application-specific properties, to be monitored via the instrumentation. To evaluate additional approaches, SAFM was ported to Linux, enabling the use of gcov for measuring test coverage, Valgrind for looking for memory usage errors, and libraries for finding non-normal floating point values. An in-house C++ source code scanning tool was also used to identify violations of SAFM coding standards, and other potentially problematic C++ constructs. Using these approaches with the existing test data sets, we were able to verify several important properties, confirm several problems and identify some previously unidentified issues.

  18. A usability evaluation of an interactive application for halal products using optical character recognition and augmented reality technologies

    NASA Astrophysics Data System (ADS)

    Lam, Meng Chun; Nizam, Siti Soleha Muhammad; Arshad, Haslina; A'isyah Ahmad Shukri, Saidatul; Hashim, Nurhazarifah Che; Putra, Haekal Mozzia; Abidin, Rimaniza Zainal

    2017-10-01

This article discusses the usability of an interactive application for halal products using Optical Character Recognition (OCR) and Augmented Reality (AR) technologies. Among the problems identified in this study is that consumers have little knowledge about the E-Code; therefore, users often have doubts about the halal status of a product. Nowadays, the integrity of halal status can be doubtful due to the actions of some irresponsible people spreading false information about a product. The application developed in this study, which uses OCR and AR technologies, helps users identify the information content of a product by scanning the E-Code label and determine the halal status of the product by scanning the product's brand. In this application, the E-Code on the label of a product is scanned using OCR technology to display information about the E-Code, and the product's brand is scanned using augmented reality technology to display the halal status of the product. The findings reveal that users are satisfied with this application and find it useful and easy to use.

  19. Ultrasonic Array for Obstacle Detection Based on CDMA with Kasami Codes

    PubMed Central

    Diego, Cristina; Hernández, Álvaro; Jiménez, Ana; Álvarez, Fernando J.; Sanz, Rebeca; Aparicio, Joaquín

    2011-01-01

This paper presents the design of an ultrasonic array for obstacle detection based on Phased Array (PA) techniques, which steers the acoustic beam through the environment by electronic rather than mechanical means. The transmission of every element in the array has been encoded according to Code Division for Multiple Access (CDMA), which allows multiple beams to be transmitted simultaneously. All these features together enable a parallel scanning system which not only improves the image rate but also achieves longer inspection distances in comparison with conventional PA techniques. PMID:22247675
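
    The code-division idea above can be sketched by constructing a small-set Kasami family and checking that its periodic cross-correlations stay well below the in-phase autocorrelation, which is what lets several encoded emissions share the channel and be separated by correlation at the receiver. The code length (n = 4, 15 chips) and seed are illustrative choices, not the parameters of the cited array.

```python
# Small-set Kasami codes (n = 4) and their worst-case periodic correlation.
import numpy as np

def m_sequence(seed=(1, 0, 0, 0)):
    """m-sequence of length 15 from x^4 + x + 1 (recurrence s_t = s_{t-3} XOR s_{t-4})."""
    s = list(seed)
    for _ in range(15 - 4):
        s.append(s[-3] ^ s[-4])
    return np.array(s)

def small_kasami_set():
    """Valid for n = 4 as written: decimate the m-sequence by 2^(n/2)+1 = 5."""
    N = 15
    u = m_sequence()
    w = u[(np.arange(N) * 5) % N]                      # short sequence of period 3
    codes = [u] + [u ^ np.roll(w, j) for j in range(3)]
    return [1 - 2 * c for c in codes]                  # map {0,1} -> {+1,-1}

codes = small_kasami_set()
N = len(codes[0])
worst = max(abs(np.dot(codes[i], np.roll(codes[j], s)))
            for i in range(len(codes)) for j in range(len(codes))
            for s in range(N) if not (i == j and s == 0))
print(f"peak autocorrelation {N}, worst off-peak/cross-correlation {worst}")
```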

  20. Progressive fracture of fiber composites

    NASA Technical Reports Server (NTRS)

    Irvin, T. B.; Ginty, C. A.

    1983-01-01

    Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.

  1. Medical conditions associated with the use of CT in children and young adults, Great Britain, 1995–2008

    PubMed Central

    McHugh, Kieran; Harbron, Richard W; Pearce, Mark S; Berrington De Gonzalez, Amy

    2016-01-01

Objective: To describe the medical conditions associated with the use of CT in children or young adults with no previous cancer diagnosis. Methods: Radiologist reports for scans performed in 1995–2008 in non-cancer patients less than 22 years of age were collected from the radiology information system in 44 hospitals of Great Britain. By semantic search, an automated procedure identified 185 medical conditions within the radiologist reports. Manual validation of a subsample by a paediatric radiologist showed a satisfactory performance of the automatic coding procedure. Results: Medical information was extracted for 37,807 scans; 19.5% of scans were performed in children less than 5 years old; 52.0% of scans were performed in 2000 or after. Trauma, diseases of the nervous system (mainly hydrocephalus) and diseases of the circulatory system were each mentioned in 25–30% of scans. Hydrocephalus was mentioned in 19% of all scans and 59% of scans repeated ≥5 times in a year, and was the most frequent condition in children less than 5 years of age. Congenital diseases/malformations, disorders of the musculoskeletal system/connective tissues and infectious or respiratory diseases were each mentioned in 5–10% of scans. Suspicion or diagnosis of a benign or malignant tumour was identified in 5% of scans. Conclusion: This study describes the medical conditions that likely underlie the use of CT in children in Great Britain. It shows that patients with hydrocephalus may receive high cumulative radiation exposures from CT in early life, i.e. at ages when they are most sensitive to radiation. Advances in knowledge: The majority of scans were unrelated to cancer suspicion. Repeated scans over time were mainly associated with the management of hydrocephalus. PMID:27767331

  2. Medical conditions associated with the use of CT in children and young adults, Great Britain, 1995-2008.

    PubMed

    Journy, Neige M; McHugh, Kieran; Harbron, Richard W; Pearce, Mark S; Berrington De Gonzalez, Amy

    2016-12-01

To describe the medical conditions associated with the use of CT in children or young adults with no previous cancer diagnosis. Radiologist reports for scans performed in 1995-2008 in non-cancer patients less than 22 years of age were collected from the radiology information system in 44 hospitals of Great Britain. By semantic search, an automated procedure identified 185 medical conditions within the radiologist reports. Manual validation of a subsample by a paediatric radiologist showed a satisfactory performance of the automatic coding procedure. Medical information was extracted for 37,807 scans; 19.5% of scans were performed in children less than 5 years old; 52.0% of scans were performed in 2000 or after. Trauma, diseases of the nervous system (mainly hydrocephalus) and diseases of the circulatory system were each mentioned in 25-30% of scans. Hydrocephalus was mentioned in 19% of all scans and 59% of scans repeated ≥5 times in a year, and was the most frequent condition in children less than 5 years of age. Congenital diseases/malformations, disorders of the musculoskeletal system/connective tissues and infectious or respiratory diseases were each mentioned in 5-10% of scans. Suspicion or diagnosis of a benign or malignant tumour was identified in 5% of scans. This study describes the medical conditions that likely underlie the use of CT in children in Great Britain. It shows that patients with hydrocephalus may receive high cumulative radiation exposures from CT in early life, i.e. at ages when they are most sensitive to radiation. Advances in knowledge: The majority of scans were unrelated to cancer suspicion. Repeated scans over time were mainly associated with the management of hydrocephalus.

  3. Common data buffer

    NASA Technical Reports Server (NTRS)

    Byrne, F.

    1981-01-01

    Time-shared interface speeds data processing in distributed computer network. Two-level high-speed scanning approach routes information to buffer, portion of which is reserved for series of "first-in, first-out" memory stacks. Buffer address structure and memory are protected from noise or failed components by error correcting code. System is applicable to any computer or processing language.

  4. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.

    1991-01-01

The following subject areas are covered: the General Reflector Antenna Systems Program version 7 (GRASP7); the Multiple Reflector Analysis Program for Cylindrical Antennas (MRAPCA); the Tri-Reflector 2D Synthesis Code (TRTDS); geometrical optics and physical optics synthesis techniques; the beam scanning reflector, the type 2 and 6 reflectors, the spherical reflector, and multiple reflector imaging systems; and radiometric array design.

  5. A 94/183 GHz aircraft radiometer system for Project Storm Fury

    NASA Technical Reports Server (NTRS)

    Gagliano, J. A.; Stratigos, J. A.; Forsythe, R. E.; Schuchardt, J. M.; Welch, J. M.; Gallentine, D. O.

    1980-01-01

A radiometer design suitable for use in NASA's WB-57F aircraft to collect data from severe storm regions was developed. The recommended design was a 94/183 GHz scanning radiometer with 3 IF channels on either side of the 183.3 GHz water vapor line and a single IF channel for a low-loss atmospheric window channel at 94 GHz. The development and construction of the 94/183 GHz scanning radiometer, known as the Advanced Microwave Moisture Sounder (AMMS), is presented. The radiometer scans the scene below the aircraft over an angle of ±45 degrees, with a beamwidth of approximately 2 degrees at 94 GHz and 1 degree at 183 GHz. The AMMS data collection system consists of a microcomputer used to store the radiometer data on the flight cartridge recorder, operate the stepper-motor-driven scanner, and collect housekeeping data such as thermistor temperature readings and aircraft time code.

  6. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  7. Design of monitoring system for mail-sorting based on the Profibus S7 series PLC

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Jia, S. H.; Wang, Y. H.; Liu, H.; Tang, G. C.

    2017-01-01

With the rapid development of postal express services, the workload of mail sorting is increasing, but automatic mail-sorting technology is not yet mature. In view of this, the system uses a Siemens S7-300 PLC as the master station controller and Siemens S7-200/400 PLCs as slave station controllers; through the MCGS human-machine interface configuration software, PROFIBUS-DP communication, RFID technology and a mechanical sorting hand, it monitors mail classification and sorting. Mail items are distinguished for sorting by scanning the RFID electronic bar code (fixed code) attached to the mail; the corresponding controller processes the acquired information, and the processed information is transmitted to the sorting manipulator via PROFIBUS-DP. The system can realize accurate and efficient mail sorting, which will promote the development of mail-sorting technology.

  8. Benchmark studies of the gyro-Landau-fluid code and gyro-kinetic codes on kinetic ballooning modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, T. F.; Lawrence Livermore National Laboratory, Livermore, California 94550; Xu, X. Q.

    2016-03-15

A Gyro-Landau-Fluid (GLF) 3 + 1 model has been recently implemented in the BOUT++ framework, which contains full Finite-Larmor-Radius effects, Landau damping, and toroidal resonance [Ma et al., Phys. Plasmas 22, 055903 (2015)]. A linear global beta scan has been conducted using the JET-like circular equilibria (cbm18 series), showing that the unstable modes are kinetic ballooning modes (KBMs). In this work, we use the GYRO code, which is a gyrokinetic continuum code widely used for simulation of plasma microturbulence, to benchmark with the GLF 3 + 1 code on KBMs. To verify our code on the KBM case, we first perform the beta scan based on the “Cyclone base case parameter set.” We find that the growth rate is almost the same for the two codes, and the KBM mode is further destabilized as beta increases. For JET-like global circular equilibria, as the modes localize in the peak pressure gradient region, a linear local beta scan using the same set of equilibria has been performed at this position for comparison. With the drift-kinetic electron module in the GYRO code and a small electron-electron collisionality included to damp electron modes, the GYRO-generated mode structures and parity suggest that they are kinetic ballooning modes, and the growth rate is comparable to the GLF results. However, a radial scan of the pedestal for a particular set of cbm18 equilibria, using the GYRO code, shows different trends for the low-n and high-n modes. The low-n modes show that the linear growth rate peaks at the peak pressure gradient position, as in the GLF results. However, for high-n modes, the growth rate of the most unstable mode shifts outward to the bottom of the pedestal, and the real frequency of what were originally KBMs in the ion diamagnetic drift direction steadily approaches and crosses over to the electron diamagnetic drift direction.

  9. Location Based Service in Indoor Environment Using Quick Response Code Technology

    NASA Astrophysics Data System (ADS)

    Hakimpour, F.; Zare Zardiny, A.

    2014-10-01

Today, with the extensive use of intelligent mobile phones, increased screen sizes and the enrichment of mobile phones with Global Positioning System (GPS) technology, the use of location-based services has been considered by public users more than ever. Based on the position of users, they can receive the desired information from different LBS providers. Any LBS system generally includes five main parts: mobile devices, a communication network, a positioning system, a service provider and a data provider. By now many advances have been made in relation to each of these parts; however, user positioning, especially in indoor environments, remains an essential and critical issue in LBS. It is well known that GPS performs too poorly inside buildings to provide usable indoor positioning. On the other hand, current indoor positioning technologies such as RFID or WiFi networks need dedicated hardware and software infrastructures. In this paper, we propose a new method to overcome these challenges. This method uses Quick Response (QR) Code technology. A QR Code is a 2D barcode with a matrix structure which consists of black modules arranged in a square grid. Scanning and retrieving data from a QR Code is possible with different camera-enabled mobile phones simply by installing barcode reader software. This paper reviews the capabilities of QR Code technology and then discusses the advantages of using QR Codes in an Indoor LBS (ILBS) system in comparison to other technologies. Finally, some prospects of using QR Codes are illustrated through the implementation of a scenario. The most important advantages of using this new technology in ILBS are easy implementation, lower cost, quick data retrieval, the possibility of printing the QR Code on different products and no need for complicated hardware and software infrastructures.

  10. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    PubMed

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

    We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. A 20-nm half-pitch URA fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory is presented that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
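
    The scan-and-convolve sensing principle described above can be shown in one dimension: a binary mask built from quadratic residues (a Legendre sequence of prime length p with p mod 4 = 3) is stepped across the image plane, the photodiode records one number per step, and the image is recovered by circular correlation with the ±1 decoding array. This is a toy illustration of the principle only, not the 2-D URA fabricated by CXRO.

```python
# 1-D coded-aperture scan and reconstruction with a quadratic-residue mask.
import numpy as np

p = 31                                             # prime with p % 4 == 3
qr = {(i * i) % p for i in range(1, p)}            # nonzero quadratic residues mod p
mask = np.array([1] + [1 if i in qr else 0 for i in range(1, p)])   # open/closed cells
decode = 2 * mask - 1                              # +1 where open, -1 where closed

image = np.zeros(p)
image[[4, 5, 6, 20]] = [1.0, 2.0, 1.0, 3.0]        # toy intensity profile

# One photodiode measurement per lateral mask position (circular scan for simplicity).
meas = np.array([np.dot(image, np.roll(mask, -t)) for t in range(p)])

# Reconstruction: circular correlation of the measurements with the decoding array.
recon = np.array([np.dot(meas, np.roll(decode, -j)) for j in range(p)])
recon /= (p + 1) / 2                               # known scale factor for this mask family
print(np.allclose(recon, image))                   # True: the profile is recovered exactly
```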

  11. DARKDROID: Exposing the Dark Side of Android Marketplaces

    DTIC Science & Technology

    2016-06-01

Moreover, our approaches can detect apps containing both intentional and unintentional vulnerabilities, such as unsafe code loading mechanisms and... Security, Static Analysis, Dynamic Analysis, Malware Detection, Vulnerability Scanning... applications in a DoD context... Develop sophisticated whole-system static analyses to detect malicious Android applications

  12. A Flexile and High Precision Calibration Method for Binocular Structured Light Scanning System

    PubMed Central

    Yuan, Jianying; Wang, Qiong; Li, Bailin

    2014-01-01

3D (three-dimensional) structured light scanning systems are widely used in the fields of reverse engineering, quality inspection, and so forth. Camera calibration is the key to scanning precision. Currently, a finely machined 2D (two-dimensional) or 3D calibration reference object is usually used to achieve high calibration precision, which is difficult to operate and costly. In this paper, a novel calibration method is proposed that uses a scale bar and some artificial coded targets placed randomly in the measuring volume. The principle of the proposed method is based on hierarchical self-calibration and bundle adjustment. Initial intrinsic parameters are obtained from the images. Initial extrinsic parameters in projective space are estimated with the factorization method and then upgraded to Euclidean space with the orthogonality of the rotation matrix and the rank-3 property of the absolute quadric as constraints. Last, all camera parameters are refined through bundle adjustment. Real experiments show that the proposed method is robust and has the same precision level as results obtained using a delicate artificial reference object, while the hardware cost is very low compared with current calibration methods used in 3D structured light scanning systems. PMID:25202736

  13. Tactical Utility of Tailored Systems

    DTIC Science & Technology

    2015-07-10

    create a procurement system to produce materiel at a cost low enough to make equipment disposable. Further cost savings may be realized by upgrading...replacement parts. With the advent of 3-d printing and digital manufacturing, a new part may be procured as easily as scanning a bar code and pressing...9/11 Commission Report “Imagination is not a gift usually associated with bureaucracies. It is therefore crucial to find a way of routinizing, even

  14. The complete mitogenome of the whale shark parasitic copepod Pandarus rhincodonicus norman, Newbound & Knott (Crustacea; Siphonostomatoida; Pandaridae)--a new gene order for the copepoda.

    PubMed

    Austin, Christopher M; Tan, Mun Hua; Lee, Yin Peng; Croft, Laurence J; Meekan, Mark G; Pierce, Simon J; Gan, Han Ming

    2016-01-01

    The complete mitochondrial genome of the parasitic copepod Pandarus rhincodonicus was obtained from a partial genome scan using the HiSeq sequencing system. The Pandarus rhincodonicus mitogenome has 14,480 base pairs (62% A+T content) made up of 12 protein-coding genes, 2 ribosomal subunit genes, 22 transfer RNAs, and a putative 384 bp non-coding AT-rich region. This Pandarus mitogenome sequence is the first for the family Pandaridae, the second for the order Siphonostomatoida and the sixth for the Copepoda.

  15. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurosu, K; Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka; Takashina, M

Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; here they are studied for PDD and proton range and compared to the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDD results obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health, Labor and Welfare of Japan, Grants-in-Aid for Scientific Research (No. 23791419), and the JSPS Core-to-Core program (No. 23003). The authors have no conflict of interest.

  16. Scanning for safety: an integrated approach to improved bar-code medication administration.

    PubMed

    Early, Cynde; Riha, Chris; Martin, Jennifer; Lowdon, Karen W; Harvey, Ellen M

    2011-03-01

This is a review of lessons learned in the postimplementation evaluation of a bar-code medication administration technology implemented at a major tertiary-care hospital in 2001. In 2006, with a bar-code medication administration scan compliance rate of 82%, a near-miss sentinel event prompted review of this technology as part of an institutional recommitment to a "culture of safety." Multifaceted problems with bar-code medication administration created an environment of circumventing safeguards, as demonstrated by an increase in manual overrides to ensure timely medication administration. A multiprofessional team composed of nursing, pharmacy, human resources, quality, and technical services was formed. Each step in the bar-code medication administration process was reviewed. Technology, process, and educational solutions were identified and implemented systematically. Overall compliance with bar-code medication administration rose from 82% to 97%, which resulted in a calculated cost avoidance of more than $2.8 million during this time frame of the project.

  17. A novel use of QR code stickers after orthopaedic cast application.

    PubMed

    Gough, A T; Fieraru, G; Gaffney, Pav; Butler, M; Kincaid, R J; Middleton, R G

    2017-07-01

    INTRODUCTION We present a novel solution to ensure that information and contact details are always available to patients while in cast. An information sticker containing both telephone numbers and a Quick Response (QR) code is applied to the cast. When scanned with a smartphone, the QR code loads the plaster team's webpage. This contains information and videos about cast care, complications and enhancing recovery. METHODS A sticker was designed and applied to all synthetic casts fitted in our fracture clinic. On cast removal, patients completed a questionnaire about the sticker. A total of 101 patients were surveyed between November 2015 and February 2016. The questionnaire comprised ten binary choice questions. RESULTS The vast majority (97%) of patients had the sticker still on their cast when they returned to clinic for cast removal. Eighty-four per cent of all patients felt reassured by the presence of the QR code sticker. Nine per cent used the contact details on the cast to seek advice. Over half (56%) had a smartphone and a third (33%) of these scanned the QR code. Of those who scanned the code, 95% found the information useful. CONCLUSIONS This study indicates that use of a QR code reassures patients and is an effective tool in the proactive management of potential cast problems. The QR code sticker is now applied to all casts across our trust. In line with NHS England's Five Year Forward View calling for enhanced use of smartphone technology, our trust is continuing to expand its portfolio of patient information accessible via QR codes. Other branches of medicine may benefit from incorporating QR codes as portals to access such information.

  18. Low-cost compact MEMS scanning ladar system for robotic applications

    NASA Astrophysics Data System (ADS)

    Moss, Robert; Yuan, Ping; Bai, Xiaogang; Quesada, Emilio; Sudharsanan, Rengarajan; Stann, Barry L.; Dammann, John F.; Giza, Mark M.; Lawler, William B.

    2012-06-01

Future robots and autonomous vehicles require compact low-cost Laser Detection and Ranging (LADAR) systems for autonomous navigation. The Army Research Laboratory (ARL) recently demonstrated a brass-board short-range eye-safe MEMS scanning LADAR system for robotic applications. Boeing Spectrolab is performing a technology transfer (CRADA) of this system and has built a compact MEMS scanning LADAR system with additional improvements in receiver sensitivity, the laser system, and the data processing system. Improved system sensitivity, low cost, miniaturization, and low power consumption are the main goals for the commercialization of this LADAR system. The receiver sensitivity has been improved by 2x using large-area InGaAs PIN detectors with low-noise amplifiers. The FPGA code has been updated to extend the range to 50 meters and detect up to 3 targets per pixel. Range accuracy has been improved through the implementation of an optical T-Zero input line. A compact commercially available erbium fiber laser operating at 1550 nm wavelength is used as the transmitter, thus reducing the size of the LADAR system considerably compared with the ARL brass-board system. The computer interface has been consolidated to allow image data and configuration data (configuration settings and system status) to pass through a single Ethernet port. In this presentation we will discuss the system architecture and future improvements to receiver sensitivity using avalanche photodiodes.

  19. StarScan: a web server for scanning small RNA targets from degradome sequencing data.

    PubMed

    Liu, Shun; Li, Jun-Hao; Wu, Jie; Zhou, Ke-Ren; Zhou, Hui; Yang, Jian-Hua; Qu, Liang-Hu

    2015-07-01

    Endogenous small non-coding RNAs (sRNAs), including microRNAs, PIWI-interacting RNAs and small interfering RNAs, play important gene regulatory roles in animals and plants by pairing to the protein-coding and non-coding transcripts. However, computationally assigning these various sRNAs to their regulatory target genes remains technically challenging. Recently, a high-throughput degradome sequencing method was applied to identify biologically relevant sRNA cleavage sites. In this study, an integrated web-based tool, StarScan (sRNA target Scan), was developed for scanning sRNA targets using degradome sequencing data from 20 species. Given a sRNA sequence from plants or animals, our web server performs an ultrafast and exhaustive search for potential sRNA-target interactions in annotated and unannotated genomic regions. The interactions between small RNAs and target transcripts were further evaluated using a novel tool, alignScore. A novel tool, degradomeBinomTest, was developed to quantify the abundance of degradome fragments located at the 9-11th nucleotide from the sRNA 5' end. This is the first web server for discovering potential sRNA-mediated RNA cleavage events in plants and animals, which affords mechanistic insights into the regulatory roles of sRNAs. The StarScan web server is available at http://mirlab.sysu.edu.cn/starscan/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    NASA Astrophysics Data System (ADS)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

The dose distributions from proton pencil beam scanning were calculated by FLUKA, GEANT4, MCNP, and PHITS, in order to investigate their applicability in proton radiotherapy. The first case studied was the integrated depth dose curves (IDDCs) from a 100-MeV and a 226-MeV proton pencil beam impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered a condition similar to the first case but with proton energies in a Gaussian distribution. The comparison to the measurement indicates that the inter-code differences might be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time was also discussed. In the third case, the applicability of each code for pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan, and the results showed general agreement among the codes, the treatment plan, and the measurement, except that some deviations were found in the penumbra region. This study has demonstrated that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.

  1. Bar Code Medication Administration Technology: Characterization of High-Alert Medication Triggers and Clinician Workarounds.

    PubMed

    Miller, Daniel F; Fortier, Christopher R; Garrison, Kelli L

    2011-02-01

    Bar code medication administration (BCMA) technology is gaining acceptance for its ability to prevent medication administration errors. However, studies suggest that improper use of BCMA technology can yield unsatisfactory error prevention and introduction of new potential medication errors. To evaluate the incidence of high-alert medication BCMA triggers and alert types and discuss the type of nursing and pharmacy workarounds occurring with the use of BCMA technology and the electronic medication administration record (eMAR). Medication scanning and override reports from January 1, 2008, through November 30, 2008, for all adult medical/surgical units were retrospectively evaluated for high-alert medication system triggers, alert types, and override reason documentation. An observational study of nursing workarounds on an adult medicine step-down unit was performed and an analysis of potential pharmacy workarounds affecting BCMA and the eMAR was also conducted. Seventeen percent of scanned medications triggered an error alert of which 55% were for high-alert medications. Insulin aspart, NPH insulin, hydromorphone, potassium chloride, and morphine were the top 5 high-alert medications that generated alert messages. Clinician override reasons for alerts were documented in only 23% of administrations. Observational studies assessing for nursing workarounds revealed a median of 3 clinician workarounds per administration. Specific nursing workarounds included a failure to scan medications/patient armband and scanning the bar code once the dosage has been removed from the unit-dose packaging. Analysis of pharmacy order entry process workarounds revealed the potential for missed doses, duplicate doses, and doses being scheduled at the wrong time. BCMA has the potential to prevent high-alert medication errors by alerting clinicians through alert messages. Nursing and pharmacy workarounds can limit the recognition of optimal safety outcomes and therefore workflow processes must be continually analyzed and restructured to yield the intended full benefits of BCMA technology. © 2011 SAGE Publications.

  2. Usability of a barcode scanning system as a means of data entry on a PDA for self-report health outcome questionnaires: a pilot study in individuals over 60 years of age

    PubMed Central

    Boissy, Patrick; Jacobs, Karen; Roy, Serge H

    2006-01-01

    Background Throughout the medical and paramedical professions, self-report health status questionnaires are used to gather patient-reported outcome measures. The objective of this pilot study was to evaluate in individuals over 60 years of age the usability of a PDA-based barcode scanning system with a text-to-speech synthesizer to collect data electronically from self-report health outcome questionnaires. Methods Usability of the system was tested on a sample of 24 community-living older adults (7 men, 17 women) ranging in age from 63 to 93 years. After receiving a brief demonstration on the use of the barcode scanner, participants were randomly assigned to complete two sets of 16 questions using the bar code wand scanner for one set and a pen for the other. Usability was assessed using directed interviews with a usability questionnaire and performance-based metrics (task times, errors, sources of errors). Results Overall, participants found barcode scanning easy to learn, easy to use, and pleasant. Participants were marginally faster in completing the 16 survey questions when using pen entry (20/24 participants). The mean response time with the barcode scanner was 31 seconds longer than traditional pen entry for a subset of 16 questions (p = 0.001). The responsiveness of the scanning system, expressed as first scan success rate, was less than perfect, with approximately one-third of first scans requiring a rescan to successfully capture the data entry. The responsiveness of the system can be explained by a combination of factors such as the location of the scanning errors, the type of barcode used as an answer field in the paper version, and the optical characteristics of the barcode scanner. Conclusion The results presented in this study offer insights regarding the feasibility, usability and effectiveness of using a barcode scanner with older adults as an electronic data entry method on a PDA. While participants in this study found their experience with the barcode scanning system enjoyable and learned to become proficient in its use, the responsiveness of the system constitutes a barrier to wide-scale use of such a system. Optimizing the graphical presentation of the information on paper should significantly increase the system's responsiveness. PMID:17184533

  3. Use Them ... or Lose Them? The Case for and against Using QR Codes

    ERIC Educational Resources Information Center

    Cunningham, Chuck; Dull, Cassie

    2011-01-01

    A quick-response (QR) code is a two-dimensional, black-and-white square barcode and links directly to a URL of one's choice. When the code is scanned with a smartphone, it will automatically redirect the user to the designated URL. QR codes are popping up everywhere--billboards, magazines, posters, shop windows, TVs, computer screens, and more.…

  4. Quality Traceability System of Traditional Chinese Medicine Based on Two Dimensional Barcode Using Mobile Intelligent Technology.

    PubMed

    Cai, Yong; Li, Xiwen; Wang, Runmiao; Yang, Qing; Li, Peng; Hu, Hao

    2016-01-01

Currently, chemical fingerprint comparison and analysis is mainly based on professional equipment and software, which is expensive and inconvenient. This study aims to integrate QR (Quick Response) codes carrying quality data with mobile intelligent technology to develop a convenient query terminal for tracing quality across the whole industrial chain of TCM (traditional Chinese medicine). Three herbal medicines were randomly selected and their two-dimensional (2D) chemical barcode fingerprints were constructed. A smartphone application (app) based on the Android system was developed to read the initial data of the 2D chemical barcodes and to compare multiple fingerprints from different batches of the same species or from different species. It was demonstrated that there were no significant differences between the original and scanned TCM chemical fingerprints. Meanwhile, different TCM chemical fingerprint QR codes could be rendered in the same coordinate system, showing their differences very intuitively. To distinguish variations in chemical fingerprints more directly, a linear interpolation angle cosine similarity algorithm (LIACSA) was proposed to obtain a similarity ratio. This study showed that QR codes can be used as an effective information carrier to transfer quality data, and that a smartphone application can rapidly read the quality information in QR codes and convert the data into TCM chemical fingerprints.
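
    One plausible reading of the similarity step named above (the record does not give the exact LIACSA formulation, so this is an interpretation, not the authors' code) is: linearly interpolate two fingerprints of different lengths onto a common axis, then report the cosine of the angle between them as the similarity ratio.

```python
# Hedged sketch: resample two fingerprints by linear interpolation, then cosine similarity.
import numpy as np

def fingerprint_similarity(fp_a, fp_b, n_points=256):
    grid = np.linspace(0.0, 1.0, n_points)
    a = np.interp(grid, np.linspace(0.0, 1.0, len(fp_a)), fp_a)
    b = np.interp(grid, np.linspace(0.0, 1.0, len(fp_b)), fp_b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

reference = [0.1, 0.8, 0.3, 0.9, 0.2, 0.1]        # e.g. decoded from the QR code (placeholder values)
sample    = [0.1, 0.7, 0.35, 0.85, 0.25, 0.1, 0.05]
print(round(fingerprint_similarity(reference, sample), 4))
```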

  5. LIDAR pulse coding for high resolution range imaging at improved refresh rate.

    PubMed

    Kim, Gunzung; Park, Yongwan

    2016-10-17

In this study, a light detection and ranging system (LIDAR) was designed that codes pixel location information in its laser pulses using the direct-sequence optical code division multiple access (DS-OCDMA) method in conjunction with a scanning-based microelectromechanical system (MEMS) mirror. This LIDAR can constantly measure the distance without idle listening time for the return of reflected waves because its laser pulses include pixel location information encoded by applying the DS-OCDMA. Therefore, it emits in each bearing direction without waiting for the reflected wave to return. The MEMS mirror is used to deflect and steer the coded laser pulses in the desired bearing direction. The receiver digitizes the received reflected pulses using a low-temperature-grown (LTG) indium gallium arsenide (InGaAs) based photoconductive antenna (PCA) and a time-to-digital converter (TDC) and demodulates them using the DS-OCDMA. When all of the reflected waves corresponding to the pixels forming a range image are received, the proposed LIDAR generates a point cloud based on the time-of-flight (ToF) of each reflected wave. The results of simulations performed on the proposed LIDAR are compared with simulations of existing LIDARs.
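
    The DS-OCDMA idea above can be sketched numerically: each bearing (pixel) is assigned its own ±1 spreading code, the delayed echoes overlap at the receiver, and correlating the received stream with each pixel's code recovers that pixel's round-trip delay without idle listening time. Code length, delays, and noise level below are arbitrary illustrative values, not the parameters of the proposed LIDAR.

```python
# Toy per-pixel code correlation to estimate time-of-flight from overlapping echoes.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, code_len, stream_len = 4, 127, 400
codes = rng.choice([-1.0, 1.0], size=(n_pixels, code_len))   # one spreading code per pixel
true_delays = [20, 95, 150, 230]                             # round-trip delays in samples

received = rng.normal(0.0, 0.5, stream_len)                  # receiver noise
for code, d in zip(codes, true_delays):
    received[d:d + code_len] += 0.8 * code                   # attenuated, delayed echo

for k, code in enumerate(codes):
    corr = np.correlate(received, code, mode="valid")        # matched filter for pixel k
    print(f"pixel {k}: estimated delay {int(np.argmax(corr))}, true {true_delays[k]}")
```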

  6. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    NASA Astrophysics Data System (ADS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-10-01

    Although three general-purpose Monte Carlo (MC) simulation tools: Geant4, FLUKA and PHITS have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physical model, preset value of the ionization potential or definition of the maximum step size. In order to achieve artifact free MC simulation, an optimized parameters list for each simulation system is required. Several authors have already proposed the optimized lists, but those studies were performed with a simple system such as only a water phantom. Since particle beams have a transport, interaction and electromagnetic processes during beam delivery, establishment of an optimized parameters-list for whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameters list for GATE and PHITS using proton treatment nozzle computational model. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the result of FLUKA, and then the optimal parameters were determined. The PDD profile and the proton range obtained from our optimized parameters list showed different characteristics from the results obtained with simple system. This led to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.

  7. Huffman scanning: using language models within fixed-grid keyboard emulation☆

    PubMed Central

    Roark, Brian; Beckley, Russell; Gibbons, Chris; Fried-Oken, Melanie

    2012-01-01

Individuals with severe motor impairments commonly enter text using a single binary switch and symbol scanning methods. We present a new scanning method – Huffman scanning – which uses Huffman coding to select the symbols to highlight during scanning, thus minimizing the expected bits per symbol. With our method, the user can select the intended symbol even after switch activation errors. We describe two varieties of Huffman scanning – synchronous and asynchronous – and present experimental results, demonstrating speedups over row/column and linear scanning. PMID:24244070
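
    The core of the method above is standard Huffman coding over the symbol set: given per-symbol probabilities from a language model, build the code that minimizes the expected number of yes/no switch selections (bits) per symbol. The sketch below is a minimal illustration; the probabilities are made-up placeholders, not the authors' language model, and the mapping of code bits to highlighted groups is only indicated in a comment.

```python
# Build a Huffman code from symbol probabilities and report the expected bits per symbol.
import heapq
from itertools import count

def huffman_code(probs):
    """Return {symbol: bitstring} built from a dict of symbol probabilities."""
    tiebreak = count()
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):               # internal node
            walk(node[0], prefix + "0")           # "0" = symbol falls in the highlighted group
            walk(node[1], prefix + "1")
        else:                                     # leaf symbol
            code[node] = prefix or "0"
    walk(heap[0][2], "")
    return code

probs = {" ": 0.35, "e": 0.25, "t": 0.15, "a": 0.12, "o": 0.08, "q": 0.05}  # placeholders
code = huffman_code(probs)
expected_bits = sum(probs[s] * len(code[s]) for s in probs)
print(code, round(expected_bits, 3))
```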

  8. Scanning-time evaluation of Digimarc Barcode

    NASA Astrophysics Data System (ADS)

    Gerlach, Rebecca; Pinard, Dan; Weaver, Matt; Alattar, Adnan

    2015-03-01

This paper presents a speed comparison between the use of Digimarc® Barcodes and the Universal Product Code (UPC) for customer checkout at point of sale (POS). The recently introduced Digimarc Barcode promises to increase the speed of scanning packaged goods at POS. When this increase is exploited by workforce optimization systems, the retail industry could potentially save billions of dollars. The Digimarc Barcode is based on Digimarc's watermarking technology, and it is imperceptible, very robust, and does not require any special ink, material, or printing processes. Using an image-based scanner, a checker can quickly scan consumer packaged goods (CPG) embedded with the Digimarc Barcode without the need to reorient the packages with respect to the scanner. Faster scanning of packages saves money and enhances customer satisfaction. It reduces the length of the queues at checkout, reduces the cost of cashier labor, and makes self-checkout more convenient. This paper quantifies the increase in POS scanning rates resulting from the use of the Digimarc Barcode versus the traditional UPC. It explains the testing methodology, describes the experimental setup, and analyzes the obtained results. It concludes that the Digimarc Barcode increases the number of items per minute (IPM) scanned by at least 50% over the traditional UPC.

  9. Field-programmable beam reconfiguring based on digitally-controlled coding metasurface

    NASA Astrophysics Data System (ADS)

    Wan, Xiang; Qi, Mei Qing; Chen, Tian Yi; Cui, Tie Jun

    2016-02-01

Digital phase shifters have been applied in traditional phased array antennas to realize beam steering. However, the phase shifter deals with the phase of the induced current; hence, it has to be in the path of each element of the antenna array, making phased array antennas very expensive. Metamaterials and/or metasurfaces enable the direct modulation of electromagnetic waves by designing subwavelength structures, which opens a new way to control beam scanning. Here, we present a direct digital mechanism to control the scattered electromagnetic waves using a coding metasurface, in which each unit cell is loaded with a PIN diode to produce the binary coding states “1” and “0”. Through data lines, instant communication is established between the coding metasurface and the internal memory of a field-programmable gate array (FPGA). Thus, we realize digital modulation of electromagnetic waves, and from this we present a field-programmable reflective antenna with good measured performance. The proposed mechanism and functional device have great application potential in new-concept radar and communication systems.

  10. The location and recognition of anti-counterfeiting code image with complex background

    NASA Astrophysics Data System (ADS)

    Ni, Jing; Liu, Quan; Lou, Ping; Han, Ping

    2017-07-01

The order of the cigarette market is a key issue in the tobacco business system. The anti-counterfeiting code, as an effective anti-counterfeiting technology, can identify counterfeit goods and effectively maintain the normal order of the market and consumers' rights and interests. The anti-counterfeiting code images obtained by the tobacco recognizer suffer from complex backgrounds, light interference and other problems. To solve these problems, this paper proposes a locating method based on the SUSAN operator combined with a sliding window and line scanning. In order to reduce the interference of background and noise, we extract the red component of the image and convert the color image into a gray image. For confusable characters, correction of the recognition results based on template matching has been adopted to improve the recognition rate. With this method, the anti-counterfeiting code can be located and recognized correctly in images with complex backgrounds. The experimental results show the effectiveness and feasibility of the approach.
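
    Two of the steps described above, extracting the red component as a gray image and correcting confusable characters by template matching, can be sketched with OpenCV. The file names, the crop coordinates, and the template set are illustrative placeholders, and this is a generic sketch of those two steps, not the authors' implementation.

```python
# Red-channel extraction and template-matching correction, assuming OpenCV and BGR input.
import cv2

img = cv2.imread("anticounterfeit_code.jpg")           # BGR image from the recognizer (placeholder path)
red = img[:, :, 2]                                      # red component used as the gray working image

def best_template_match(glyph, templates):
    """Return the template label with the highest normalized correlation score."""
    scores = {name: cv2.matchTemplate(glyph, tpl, cv2.TM_CCOEFF_NORMED).max()
              for name, tpl in templates.items()}
    return max(scores, key=scores.get)

templates = {"0": cv2.imread("tpl_0.png", cv2.IMREAD_GRAYSCALE),   # placeholder template files
             "O": cv2.imread("tpl_O.png", cv2.IMREAD_GRAYSCALE)}
glyph = red[100:140, 50:80]                             # a segmented character region (illustrative crop)
print(best_template_match(glyph, templates))
```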

  11. Organic Phase Change Nanoparticles for in-Product Labeling of Agrochemicals.

    PubMed

    Wang, Miao; Duong, Binh; Su, Ming

    2015-10-28

There is an urgent need to develop in-product covert barcodes for anti-counterfeiting of agrochemicals. This paper reports a new organic nanoparticle-based in-product barcode system, in which a panel of organic phase change nanoparticles is added as a barcode into a variety of chemicals (herein agrochemicals). The barcode is read out by detecting the melting peaks of the organic nanoparticles using differential scanning calorimetry. This method has a high labeling capacity due to the small sizes of the nanoparticles, sharp melting peaks, and the large scan range of thermal analysis. The in-product barcode can be effectively used to protect agrochemical products from being counterfeited due to its large coding capacity, technical readiness, covertness, and robustness.

  12. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    PubMed

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. © Hara et al.
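
    The data-capture side of the global standards discussed above comes down to reading standardized identifiers (item number, expiry, lot) out of each scanned bar code. The sketch below parses a human-readable GS1-style element string; the example values are made up, real scanner output delimits variable-length fields with FNC1 characters rather than parentheses, and production systems should rely on a full GS1 parsing library rather than this toy regex.

```python
# Hedged toy parser for a human-readable GS1 element string.
import re

AI_NAMES = {"01": "GTIN", "17": "expiry (YYMMDD)", "10": "batch/lot"}

def parse_gs1(element_string):
    fields = {}
    for ai, value in re.findall(r"\((\d{2,4})\)([^(]+)", element_string):
        fields[AI_NAMES.get(ai, f"AI {ai}")] = value
    return fields

print(parse_gs1("(01)09501101530003(17)261031(10)LOT42A"))
# -> {'GTIN': '09501101530003', 'expiry (YYMMDD)': '261031', 'batch/lot': 'LOT42A'}
```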

  13. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility

    PubMed Central

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-01-01

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. PMID:29284701

  14. TH-AB-209-10: Breast Cancer Identification Through X-Ray Coherent Scatter Spectral Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapadia, A; Morris, R; Albanese, K

    Purpose: We have previously described the development and testing of a coherent-scatter spectral imaging system for identification of cancer. Our prior evaluations were performed using either tissue surrogate phantoms or formalin-fixed tissue obtained from pathology. Here we present the first results from a scatter imaging study using fresh breast tumor tissues obtained through surgical excision. Methods: A coherent-scatter imaging system was built using a clinical X-ray tube, photon counting detectors, and custom-designed coded-apertures. System performance was characterized using calibration phantoms of biological materials. Fresh breast tumors were obtained from patients undergoing mastectomy and lumpectomy surgeries for breast cancer. Each specimen was vacuum-sealed, scanned using the scatter imaging system, and then sent to pathology for histological workup. Scatter images were generated separately for each tissue specimen and analyzed to identify voxels containing malignant tissue. The images were compared against histological analysis (H&E + pathologist identification of tumors) to assess the match between scatter-based and histological diagnosis. Results: In all specimens scanned, the scatter images showed the location of cancerous regions within the specimen. The detection and classification was performed through automated spectral matching without the need for manual intervention. The scatter spectra corresponding to cancer tissue were found to be in agreement with those reported in literature. Inter-patient variability was found to be within limits reported in literature. The scatter images showed agreement with pathologist-identified regions of cancer. Spatial resolution for this configuration of the scanner was determined to be 2–3 mm, and the total scan time for each specimen was under 15 minutes. Conclusion: This work demonstrates the utility of coherent scatter imaging in identifying cancer based on the scatter properties of the tissue. It presents the first results from coherent scatter imaging of fresh (unfixed) breast tissue using our coded-aperture scatter imaging approach for cancer identification.

  15. A comprehensive spectrometry study of a stray neutron radiation field in scanning proton therapy.

    PubMed

    Mares, Vladimir; Romero-Expósito, Maite; Farah, Jad; Trinkl, Sebastian; Domingo, Carles; Dommert, Martin; Stolarczyk, Liliana; Van Ryckeghem, Laurent; Wielunski, Marek; Olko, Pawel; Harrison, Roger M

    2016-06-07

    The purpose of this study is to characterize the stray neutron radiation field in scanning proton therapy considering a pediatric anthropomorphic phantom and a clinically relevant beam condition. Using two extended-range Bonner sphere spectrometry (ERBSS) systems, Working Group 9 of the European Radiation Dosimetry Group measured neutron spectra at ten different positions around a pediatric anthropomorphic phantom irradiated for a brain tumor with a scanning proton beam. This study compares the different systems and unfolding codes as well as neutron spectra measured under similar conditions around a water-tank phantom. The ten spectra measured with the two ERBSS systems show a generally similar thermal component regardless of the position around the phantom, while high-energy neutrons (above 20 MeV) were only registered at positions near the beam axis (at 0°, 329° and 355°). Neutron spectra, fluence, and ambient dose equivalent, H*(10), values of both systems were in good agreement (<15%), while the unfolding code proved to have a limited effect. The highest H*(10) value of 2.7 μSv Gy(-1) was measured at 329° to the beam axis and 1.63 m from the isocenter, where high-energy neutrons (E ⩾ 20 MeV) contribute about 53%. The neutron mapping within the gantry room showed that H*(10) values decreased significantly with distance and angular position with respect to the beam axis, dropping to 0.52 μSv Gy(-1) at 90° and 3.35 m. Spectra at angles of 45° and 135° with respect to the beam axis, measured here with an anthropomorphic phantom, showed a peak structure in the thermal, fast, and high-energy ranges similar to that of the previous water-tank experiments. Meanwhile, at 90°, small differences in the high-energy range were observed. Using the ERBSS systems, neutron spectra mapping was performed to characterize the exposure of scanning proton therapy patients. The ten measured spectra provide precise information about the exposure of healthy organs to thermal, epithermal, evaporation, and intra-nuclear cascade neutrons. This comprehensive spectrometry analysis can also help in interpreting the large body of literature data based on rem-counter measurements, while also being of great value for general neutron shielding and radiation safety studies.

  16. Biosurveillance applying scan statistics with multiple, disparate data sources.

    PubMed

    Burkom, Howard S

    2003-06-01

    Researchers working on the Department of Defense Global Emerging Infections System (DoD-GEIS) pilot system, the Electronic Surveillance System for the Early Notification of Community-Based Epidemics (ESSENCE), have applied scan statistics for early outbreak detection using both traditional and nontraditional data sources. These sources include medical data indexed by International Classification of Disease, 9th Revision (ICD-9) diagnosis codes, as well as less-specific, but potentially timelier, indicators such as records of over-the-counter remedy sales and of school absenteeism. Early efforts employed the Kulldorff scan statistic as implemented in the SaTScan software of the National Cancer Institute. A key obstacle to this application is that the input data streams are typically based on time-varying factors, such as consumer behavior, rather than simply on the populations of the component subregions. We have used both modeling and recent historical data distributions to obtain background spatial distributions. Data analyses have provided guidance on how to condition and model input data to avoid excessive clustering. We have used this methodology in combining data sources for both retrospective studies of known outbreaks and surveillance of high-profile events of concern to local public health authorities. We have integrated the scan statistic capability into a Microsoft Access-based system in which we may include or exclude data sources, vary time windows separately for different data sources, censor data from subsets of individual providers or subregions, adjust the background computation method, and run retrospective or simulated studies.
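
    For illustration, a minimal sketch of a Kulldorff-style Poisson log-likelihood ratio for a single candidate zone is shown below; in the application described above, the expected counts would come from the modeled background distributions rather than raw subregion populations, and significance would be assessed by Monte Carlo replication.

        # Minimal sketch of a Poisson scan-statistic log-likelihood ratio for one candidate zone;
        # inputs are illustrative and would come from the modeled background described above.
        import numpy as np

        def zone_llr(observed_in_zone, expected_in_zone, total_observed, total_expected):
            c, e, C, E = observed_in_zone, expected_in_zone, total_observed, total_expected
            if c <= e:                      # only elevated zones matter for outbreak detection
                return 0.0
            inside = c * np.log(c / e)
            outside = (C - c) * np.log((C - c) / (E - e)) if C > c else 0.0
            return inside + outside

        # Scan all candidate zones (e.g., circles over subregions), keep the maximum LLR,
        # and assess significance by Monte Carlo replication of the background model.
        zones = [(12, 6.0), (30, 25.0), (9, 8.5)]          # (observed, expected) per zone
        best = max(zone_llr(c, e, 120, 120.0) for c, e in zones)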

  17. Handheld laser scanner automatic registration based on random coding

    NASA Astrophysics Data System (ADS)

    He, Lei; Yu, Chun-ping; Wang, Li

    2011-06-01

    Current research on laser scanners focuses mainly on static measurement. Little use has been made of dynamic measurement, which is appropriate for a wider range of problems and situations. In particular, a traditional laser scanner must be kept stable while scanning, and coordinate transformation parameters must be measured between different stations. To make scanning measurement intelligent and rapid, this paper develops a new registration algorithm for a handheld laser scanner based on the positions of targets, which realizes dynamic measurement with a handheld laser scanner without additional complex work. The double camera on the laser scanner photographs artificial target points, designed by random coding, to obtain their three-dimensional coordinates. A set of matched points is then found among the control points to realize the orientation of the scanner by least-squares common-point transformation. After that, the double camera can directly measure the laser point cloud on the surface of the object and obtain the point cloud data in a unified coordinate system. There are three major contributions in this paper. First, a laser scanner based on binocular vision is designed with a double camera and one laser head, by which real-time orientation of the laser scanner is realized and efficiency is improved. Second, coded markers are introduced to solve the data matching problem, and a random coding method is proposed; compared with other coding methods, these markers are simple to match and avoid shading the object. Finally, a recognition method for the coded markers based on distance recognition is proposed, which is more efficient. The method presented here can be used widely for measurements of objects from small to huge, such as vehicles and airplanes, strengthening intelligence and efficiency. Theoretical analysis and experiments demonstrate that the proposed method realizes dynamic measurement with a handheld laser scanner and is reasonable and efficient.

  18. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition.

    PubMed

    Rhee, Ye-Kyu; Huh, Yoon-Hyuk; Cho, Lee-Ra; Park, Chan-Jin

    2015-12-01

    The aim of this study was to determine the appropriate impression technique by analyzing the superimposition of 3D digital models, thereby evaluating the accuracy of conventional and digital impression techniques. Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, conventional dual-arch and full-arch impression techniques using addition-type polyvinylsiloxane for cast fabrication were applied. A 3D laser scanner was used to scan the casts. Three pairs from each of the 25 STL datasets were imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (the buccal and lingual cusps of the second premolar and second molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and second molar were acquired along the tooth axis. In the color-coded map, the biggest difference was seen between intraoral scanning and the dual-arch impression (P<.05). In the three-dimensional analysis, the biggest difference was seen between intraoral scanning and the dual-arch impression, and the smallest difference was seen between the dual-arch and full-arch impressions. The two- and three-dimensional deviations between the intraoral scanner and the dual-arch impression were bigger than those between the full-arch and dual-arch impressions (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05).

  19. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition

    PubMed Central

    Rhee, Ye-Kyu

    2015-01-01

    PURPOSE The aim of this study was to determine the appropriate impression technique by analyzing the superimposition of 3D digital models, thereby evaluating the accuracy of conventional and digital impression techniques. MATERIALS AND METHODS Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, conventional dual-arch and full-arch impression techniques using addition-type polyvinylsiloxane for cast fabrication were applied. A 3D laser scanner was used to scan the casts. Three pairs from each of the 25 STL datasets were imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (the buccal and lingual cusps of the second premolar and second molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and second molar were acquired along the tooth axis. RESULTS In the color-coded map, the biggest difference was seen between intraoral scanning and the dual-arch impression (P<.05). In the three-dimensional analysis, the biggest difference was seen between intraoral scanning and the dual-arch impression, and the smallest difference was seen between the dual-arch and full-arch impressions. CONCLUSION The two- and three-dimensional deviations between the intraoral scanner and the dual-arch impression were bigger than those between the full-arch and dual-arch impressions (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05). PMID:26816576

  20. Performance Enhancement of the RatCAP Awake Rat Brain PET System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaska, P.; Woody, C.

    The first full prototype of the RatCAP PET system, designed to image the brain of a rat while conscious, has been completed. Initial results demonstrated excellent spatial resolution, 1.8 mm FWHM with filtered backprojection and <1.5 mm FWHM with a Monte Carlo based MLEM method. However, noise equivalent countrate studies indicated the need for better timing to mitigate the effect of randoms. Thus, the front-end ASIC has been redesigned to minimize time walk, an accurate coincidence time alignment method has been implemented, and a variance reduction technique for the randoms is being developed. To maximize the quantitative capabilities required for neuroscience, corrections are being implemented and validated for positron range and photon noncollinearity, scatter (including outside the field of view), attenuation, randoms, and detector efficiency (deadtime is negligible). In addition, a more robust and compact PCI-based optical data acquisition system has been built to replace the original VME-based system while retaining the linux-based data processing and image reconstruction codes. Finally, a number of new animal imaging experiments have been carried out to demonstrate the performance of the RatCAP in real imaging situations, including an F-18 fluoride bone scan, a C-11 raclopride scan, and a dynamic C-11 methamphetamine scan.

  1. Color visualization for fluid flow prediction

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Speray, D. E.

    1982-01-01

    High-resolution raster scan color graphics allow variables to be presented as a continuum, in a color-coded picture that is referenced to a geometry such as a flow field grid or a boundary surface. Software is used to map a scalar variable such as pressure or temperature, defined on a two-dimensional slice of a flow field. The geometric shape is preserved in the resulting picture, and the relative magnitude of the variable is color-coded onto the geometric shape. The primary numerical process for color coding is an efficient search along a raster scan line to locate the quadrilateral block in the grid that bounds each pixel on the line. Tension spline interpolation is performed relative to the grid for specific values of the scalar variable, which is then color coded. When all pixels for the field of view are color-defined, a picture is played back from a memory device onto a television screen.
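
    A minimal modern equivalent of this mapping (not the original 1982 implementation, and using an illustrative analytic grid and scalar) color-codes a scalar on a curvilinear grid while preserving the geometry:

        # Minimal sketch, using an illustrative analytic grid and scalar field.
        import numpy as np
        import matplotlib.pyplot as plt

        xi, eta = np.meshgrid(np.linspace(0.0, 1.0, 120), np.linspace(0.0, 1.0, 60))
        x = xi * (1.0 + 0.3 * eta)                 # curvilinear flow-field grid coordinates
        y = eta + 0.1 * np.sin(3.0 * xi)
        pressure = np.exp(-((x - 0.6) ** 2 + (y - 0.5) ** 2) / 0.05)   # scalar to display

        plt.pcolormesh(x, y, pressure, shading="gouraud", cmap="viridis")
        plt.colorbar(label="pressure (arbitrary units)")
        plt.gca().set_aspect("equal")
        plt.show()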

  2. Capturing the spectrum of household food and beverage purchasing behavior: a review.

    PubMed

    French, Simone A; Shimotsu, Scott T; Wall, Melanie; Gerlach, Anne Faricy

    2008-12-01

    The household setting may be the most important level at which to understand the food choices of individuals and how healthful food choices can be promoted. However, there are few available measures of the food purchase behaviors of households and little consensus on the best way to measure it. This review explores the currently available measures of household food purchasing behavior. Three main measures are described, evaluated, and compared: home food inventories, food and beverage purchase records and receipts, and Universal Product Code bar code scanning. The development of coding, aggregation, and analytical methods for these measures of household food purchasing behavior is described. Currently, annotated receipts and records are the most comprehensive, detailed measure of household food purchasing behavior, and are feasible for population-based samples. Universal Product Code scanning is not recommended due to its cost and complexity. Research directions to improve household food purchasing behavior measures are discussed.

  3. An abstract model of rogue code insertion into radio frequency wireless networks. The effects of computer viruses on the Program Management Office

    NASA Astrophysics Data System (ADS)

    Feudo, Christopher V.

    1994-04-01

    This dissertation demonstrates that inadequately protected wireless LANs are more vulnerable to rogue program attack than traditional LANs. Wireless LANs not only run the same risks as traditional LANs, but they also run additional risks associated with an open transmission medium. Intruders can scan radio waves and, given enough time and resources, intercept, analyze, decipher, and reinsert data into the transmission medium. This dissertation describes the development and instantiation of an abstract model of the rogue code insertion process into a DOS-based wireless communications system using radio frequency (RF) atmospheric signal transmission. The model is general enough to be applied to widely used target environments such as UNIX, Macintosh, and DOS operating systems. The methodology and three modules, the prober, activator, and trigger modules, to generate rogue code and insert it into a wireless LAN were developed to illustrate the efficacy of the model. Also incorporated into the model are defense measures against remotely introduced rogue programs and a cost-benefit analysis that determined that such defenses for a specific environment were cost-justified.

  4. Superwide-angle coverage code-multiplexed optical scanner.

    PubMed

    Riza, Nabeel A; Arain, Muzammil A

    2004-05-01

    A superwide-angle coverage code-multiplexed optical scanner is presented that has the potential to provide 4π sr coverage. As a proof-of-concept experiment, an angular scan range of 288 degrees for six randomly distributed beams is demonstrated. The proposed scanner achieves its superwide coverage by exploiting a combination of phase-encoded transmission and reflection holography within an in-line hologram recording-retrieval geometry. The basic scanner unit consists of one phase-only digital mode spatial light modulator for code entry (i.e., beam scan control) and a holographic material from which we obtained what we believe is the first-of-a-kind extremely wide coverage, low component count, high speed (e.g., microsecond domain), and large aperture (e.g., > 1-cm diameter) scanner.

  5. Automated JPSS VIIRS GEO code change testing by using Chain Run Scripts

    NASA Astrophysics Data System (ADS)

    Chen, W.; Wang, W.; Zhao, Q.; Das, B.; Mikles, V. J.; Sprietzer, K.; Tsidulko, M.; Zhao, Y.; Dharmawardane, V.; Wolf, W.

    2015-12-01

    The Joint Polar Satellite System (JPSS) is the next-generation polar-orbiting operational environmental satellite system. The first satellite in the JPSS series, J-1, is scheduled to launch in early 2017. J1 will carry similar versions of the instruments that are on board the Suomi National Polar-Orbiting Partnership (S-NPP) satellite, which was launched on October 28, 2011. The Center for Satellite Applications and Research Algorithm Integration Team (STAR AIT) uses the Algorithm Development Library (ADL) to run S-NPP and pre-J1 algorithms in a development and test mode. The ADL is an offline test system developed by Raytheon to mimic the operational system while enabling a development environment for plug-and-play algorithms. Perl Chain Run Scripts have been developed by STAR AIT to automate the staging and processing of multiple JPSS Sensor Data Record (SDR) and Environmental Data Record (EDR) products. Based on prelaunch testing, the JPSS J1 VIIRS Day-Night Band (DNB) has an anomalous non-linear response at high scan angles. The flight project has proposed multiple mitigation options through onboard aggregation, and Option 21 has been suggested by the VIIRS SDR team as the baseline aggregation mode. VIIRS geolocation (GEO) code analysis shows that the J1 DNB GEO product cannot be generated correctly without a software update. The modified code will support both Op21 and Op21/26 and is backward compatible with S-NPP. The J1 GEO code change version 0 delivery package is under development for the current change request. In this presentation, we will discuss how to use the Chain Run Scripts to verify the code change and Lookup Table (LUT) updates in ADL Block2.

  6. Quality Traceability System of Traditional Chinese Medicine Based on Two Dimensional Barcode Using Mobile Intelligent Technology

    PubMed Central

    Cai, Yong; Li, Xiwen; Wang, Runmiao; Yang, Qing; Li, Peng; Hu, Hao

    2016-01-01

    Currently, chemical fingerprint comparison and analysis rely mainly on professional equipment and software, which are expensive and inconvenient. This study aims to integrate QR (Quick Response) codes with quality data and mobile intelligent technology to develop a convenient query terminal for tracing quality across the whole industrial chain of TCM (traditional Chinese medicine). Three herbal medicines were randomly selected and their two-dimensional (2D) chemical barcode fingerprints were constructed. A smartphone application (APP) based on the Android system was developed to read the initial data from the 2D chemical barcodes and to compare multiple fingerprints from different batches of the same species or from different species. It was demonstrated that there were no significant differences between the original and scanned TCM chemical fingerprints. Meanwhile, chemical fingerprint QR codes of different TCMs could be rendered in the same coordinate system, showing their differences very intuitively. To distinguish variations of the chemical fingerprints more directly, a linear interpolation angle cosine similarity algorithm (LIACSA) was proposed to obtain a similarity ratio. This study showed that QR codes can be used as an effective information carrier to transfer quality data, and that a smartphone application can rapidly read quality information from QR codes and convert the data into TCM chemical fingerprints. PMID:27780256
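
    A minimal sketch of the interpolation-plus-cosine idea (the published LIACSA details are not reproduced here) resamples two fingerprints onto a shared axis by linear interpolation and reports the angle cosine as the similarity ratio:

        # Minimal sketch; inputs are 1-D arrays of retention times and signal intensities.
        import numpy as np

        def fingerprint_similarity(t_ref, y_ref, t_test, y_test, n_points=500):
            # Resample both fingerprints onto a common grid by linear interpolation.
            grid = np.linspace(max(t_ref[0], t_test[0]), min(t_ref[-1], t_test[-1]), n_points)
            a = np.interp(grid, t_ref, y_ref)
            b = np.interp(grid, t_test, y_test)
            # Angle cosine between the two resampled fingerprints.
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))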

  7. Automatic identification of IASLC-defined mediastinal lymph node stations on CT scans using multi-atlas organ segmentation

    NASA Astrophysics Data System (ADS)

    Hoffman, Joanne; Liu, Jiamin; Turkbey, Evrim; Kim, Lauren; Summers, Ronald M.

    2015-03-01

    Station-labeling of mediastinal lymph nodes is typically performed to identify the location of enlarged nodes for cancer staging. Stations are usually assigned manually in clinical radiology practice by qualitative visual assessment on CT scans, which is time consuming and highly variable. In this paper, we developed a method that automatically recognizes the lymph node stations in thoracic CT scans based on the anatomical organs in the mediastinum. First, the trachea, lungs, and spine are automatically segmented to locate the mediastinum region. Then, eight more anatomical organs are simultaneously identified by multi-atlas segmentation. Finally, with the segmentation of those anatomical organs, we convert the text definitions of the International Association for the Study of Lung Cancer (IASLC) lymph node map into patient-specific color-coded CT image maps. Thus, a lymph node station is automatically assigned to each lymph node. We applied this system to CT scans of 86 patients with 336 mediastinal lymph nodes measuring 10 mm or greater. 84.8% of the mediastinal lymph nodes were correctly mapped to their stations.

  8. MO-FG-CAMPUS-TeP3-05: Limitations of the Dose Weighted LET Concept for Intensity Modulated Proton Therapy in the Distal Falloff Region and Beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Farr, J

    2016-06-15

    Purpose: Dose-weighted linear energy transfer (dLET) has been shown to be useful for the analysis of late effects in proton therapy. This study presents the results of testing the dLET concept for intensity modulated proton therapy (IMPT) with a discrete spot scanning beam system without use of an aperture or compensator (AC). Methods: IMPT (no AC) and broad beams (BB) with AC were simulated in the TOPAS and FLUKA code systems. Information from the independently tested Monte Carlo Damage Simulation (MCDS) was integrated into the FLUKA code system to account for spatial variations in the RBE for protons and other light ions using an endpoint of DNA double strand break (DSB) induction. Results: The proton spectra for IMPT beams at depths beyond the distal edge contain a tail of high-energy protons of up to 100 MeV. The integral of the tail is comparable to the number of 5–8 MeV protons at the tip of the Bragg peak (BP). The dose-averaged energy (dEav) decreases to 7 MeV at the tip of the BP and then increases to about 15 MeV beyond the distal edge. Neutrons produced in the nozzle are two orders of magnitude higher for BB with AC than for IMPT in the low-energy part of the spectra. The dLET values beyond the distal edge of the BP are 5 times larger for IMPT than for BB with the AC. Contrarily, negligible differences are seen in the RBE estimates for IMPT and BB with AC beyond the distal edge of the BP. Conclusion: The analysis of late effects in IMPT with spot scanning, and in double scattering or scanning techniques with AC, may require both dLET and RBE as quantitative parameters to characterize effects beyond the distal edge of the BP.

  9. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for the patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6), and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shape/sizes at different distances from the isocenter, indicate good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDDs and spot shapes/sizes differences are within statistical error of simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.

  10. Decoding the encoding of functional brain networks: An fMRI classification comparison of non-negative matrix factorization (NMF), independent component analysis (ICA), and sparse coding algorithms.

    PubMed

    Xie, Jianwen; Douglas, Pamela K; Wu, Ying Nian; Brody, Arthur L; Anderson, Ariana E

    2017-04-15

    Brain networks in fMRI are typically identified using spatial independent component analysis (ICA), yet other mathematical constraints provide alternate biologically-plausible frameworks for generating brain networks. Non-negative matrix factorization (NMF) would suppress negative BOLD signal by enforcing positivity. Spatial sparse coding algorithms (L1 Regularized Learning and K-SVD) would impose local specialization and a discouragement of multitasking, where the total observed activity in a single voxel originates from a restricted number of possible brain networks. The assumptions of independence, positivity, and sparsity to encode task-related brain networks are compared; the resulting brain networks within scan for different constraints are used as basis functions to encode observed functional activity. These encodings are then decoded using machine learning, by using the time series weights to predict within scan whether a subject is viewing a video, listening to an audio cue, or at rest, in 304 fMRI scans from 51 subjects. The sparse coding algorithm of L1 Regularized Learning outperformed 4 variations of ICA (p<0.001) for predicting the task being performed within each scan using artifact-cleaned components. The NMF algorithms, which suppressed negative BOLD signal, had the poorest accuracy compared to the ICA and sparse coding algorithms. Holding constant the effect of the extraction algorithm, encodings using sparser spatial networks (containing more zero-valued voxels) had higher classification accuracy (p<0.001). Lower classification accuracy occurred when the extracted spatial maps contained more CSF regions (p<0.001). The success of sparse coding algorithms suggests that algorithms which enforce sparsity, discourage multitasking, and promote local specialization may capture better the underlying source processes than those which allow inexhaustible local processes such as ICA. Negative BOLD signal may capture task-related activations. Copyright © 2017 Elsevier B.V. All rights reserved.
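
    A minimal sketch of this encode-then-decode comparison, using scikit-learn on a surrogate time-by-voxel matrix rather than the study's fMRI data, is shown below; component counts and labels are illustrative.

        # Minimal sketch, assuming a surrogate time-by-voxel matrix in place of real fMRI data.
        import numpy as np
        from sklearn.decomposition import NMF, FastICA, DictionaryLearning
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = np.abs(rng.normal(size=(200, 300)))   # surrogate time x voxel matrix (non-negative for NMF)
        y = rng.integers(0, 3, size=200)          # surrogate task label per timepoint

        models = {
            "nmf": NMF(n_components=10, init="nndsvda", max_iter=500),
            "ica": FastICA(n_components=10, max_iter=1000),
            "sparse": DictionaryLearning(n_components=10, alpha=1.0, max_iter=100),
        }
        for name, model in models.items():
            weights = model.fit_transform(X)      # per-timepoint encoding weights
            acc = cross_val_score(LogisticRegression(max_iter=1000), weights, y, cv=5).mean()
            print(name, round(float(acc), 3))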

  11. One-dimensional MHD simulations of MTF systems with compact toroid targets and spherical liners

    NASA Astrophysics Data System (ADS)

    Khalzov, Ivan; Zindler, Ryan; Barsky, Sandra; Delage, Michael; Laberge, Michel

    2017-10-01

    A one-dimensional (1D) MHD code has been developed at General Fusion (GF) for coupled plasma-liner simulations of magnetized target fusion (MTF) systems. The main goal of these simulations is to search for optimal parameters of an MTF reactor, in which a spherical liquid metal liner compresses a compact toroid plasma. The code uses a Lagrangian description for both the liner and the plasma. The liner is represented as a set of spherical shells with fixed masses, while the plasma is discretized as a set of nested tori with circular cross sections and a fixed number of particles between them. All physical fields are 1D functions of either the spherical (liner) or the small toroidal (plasma) radius. The motion of the liner and plasma shells is calculated self-consistently based on the applied forces and equations of state. The magnetic field is determined by 1D profiles of the poloidal and toroidal fluxes; these are advected with the shells and diffuse according to the local resistivity, which also accounts for flux leakage into the liner. Different plasma transport models are implemented, allowing comparison with ongoing GF experiments. A fusion power calculation is included in the code. We performed a series of parameter scans in order to establish the underlying dependencies of the MTF system and find the optimal reactor design point.

  12. Telepharmacy and bar-code technology in an i.v. chemotherapy admixture area.

    PubMed

    O'Neal, Brian C; Worden, John C; Couldry, Rick J

    2009-07-01

    A program using telepharmacy and bar-code technology to increase the presence of the pharmacist at a critical risk point during chemotherapy preparation is described. Telepharmacy hardware and software were acquired, and an inspection camera was placed in a biological safety cabinet to allow the pharmacy technician to take digital photographs at various stages of the chemotherapy preparation process. Once the pharmacist checks the medication vials' agreement with the work label, the technician takes the product into the biological safety cabinet, where the appropriate patient is selected from the pending work list, a queue of patient orders sent from the pharmacy information system. The technician then scans the bar code on the vial. Assuming the bar code matches, the technician photographs the work label, vials, diluents and fluids to be used, and the syringe (before injecting the contents into the bag) along with the vial. The pharmacist views all images as a part of the final product-checking process. This process allows the pharmacist to verify that the correct quantity of medication was transferred from the primary source to a secondary container without being physically present at the time of transfer. Telepharmacy and bar coding provide a means to improve the accuracy of chemotherapy preparation by decreasing the likelihood of using the incorrect product or quantity of drug. The system facilitates the reading of small product labels and removes the need for a pharmacist to handle contaminated syringes and vials when checking the final product.

  13. Optimized and secure technique for multiplexing QR code images of single characters: application to noiseless messages retrieval

    NASA Astrophysics Data System (ADS)

    Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto

    2015-08-01

    We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, the letters used to compose eventual messages are individually converted into QR codes, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images; this represents the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the processing involved. Recovered QR codes can be successfully scanned thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes yields a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack can be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying it by the complex conjugate of the diffuser. As this is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need for a sequence to retrieve the outcome.
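
    The digital-diffuser step lends itself to a short numerical sketch (illustrative array sizes; not the experimental pipeline): multiplying the multiplexed pack by a unit-modulus random phase encrypts it, and multiplying by the complex conjugate recovers it exactly, with no added noise.

        # Minimal sketch; 'pack' stands in for the multiplexed complex-valued data.
        import numpy as np

        rng = np.random.default_rng(1)
        pack = rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))
        diffuser = np.exp(1j * 2.0 * np.pi * rng.random((256, 256)))   # unit-modulus random phase

        encrypted = pack * diffuser
        decrypted = encrypted * np.conj(diffuser)   # exact recovery because |diffuser| == 1
        assert np.allclose(decrypted, pack)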

  14. A simple program to measure and analyse tree rings using Excel, R and SigmaScan

    PubMed Central

    Hietz, Peter

    2011-01-01

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications in tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel, and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand, making use of already available code. PMID:26109835

  15. A simple program to measure and analyse tree rings using Excel, R and SigmaScan.

    PubMed

    Hietz, Peter

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications in tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel, and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand, making use of already available code.
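
    A minimal sketch of the second macro's idea, in Python rather than the published Visual Basic code (path endpoints and threshold are illustrative), samples darkness along a user-defined path and flags the earlywood-to-latewood transition:

        # Minimal sketch; gray_image is a 2-D array, path endpoints are (x, y) tuples.
        import numpy as np
        from scipy.ndimage import map_coordinates

        def darkness_profile(gray_image, start_xy, end_xy, n_samples=500):
            x = np.linspace(start_xy[0], end_xy[0], n_samples)
            y = np.linspace(start_xy[1], end_xy[1], n_samples)
            # map_coordinates expects (row, col) ordering.
            return map_coordinates(gray_image.astype(float), [y, x], order=1)

        def transition_index(profile, threshold=None):
            threshold = profile.mean() if threshold is None else threshold
            dark = profile < threshold                 # darker pixels = denser latewood
            crossings = np.flatnonzero(~dark[:-1] & dark[1:])
            return int(crossings[0]) if crossings.size else None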

  16. Fabrication of long linear arrays of plastic optical fibers with squared ends for the use of code mark printing lithography

    NASA Astrophysics Data System (ADS)

    Horiuchi, Toshiyuki; Watanabe, Jun; Suzuki, Yuta; Iwasaki, Jun-ya

    2017-05-01

    Two-dimensional code marks are often used for production management. In particular, in the production lines of liquid-crystal-display panels and other devices, data on fabrication processes such as production number and process conditions are written on each substrate or device in detail and used for quality management. For this reason, lithography systems specialized in code mark printing have been developed. However, conventional systems using lamp projection exposure or laser scan exposure are very expensive. Therefore, the development of a low-cost exposure system using light emitting diodes (LEDs) and optical fibers with squared ends arrayed in a matrix is strongly desired. In past research, the feasibility of such a new exposure system was demonstrated using a handmade system equipped with 100 LEDs with a central wavelength of 405 nm, a 10×10 matrix of optical fibers with 1 mm square ends, and a 10X projection lens. Building on this progress, a new method for fabricating large-scale arrays of finer fibers with squared ends is developed in this paper. Up to 40 plastic optical fibers were arranged in a linear gap of an arraying instrument and simultaneously squared by heating them on a hotplate at 120°C for 7 min. Fiber sizes were homogeneous at 496 ± 4 μm. In addition, the average light leakage was improved from 34.4 to 21.3% by adopting the new method in place of the conventional one-by-one squaring method. Square matrix arrays necessary for printing code marks will be obtained by stacking the newly fabricated linear arrays.

  17. Multimodal biometric digital watermarking on immigrant visas for homeland security

    NASA Astrophysics Data System (ADS)

    Sasi, Sreela; Tamhane, Kirti C.; Rajappa, Mahesh B.

    2004-08-01

    Passengers with immigrant visas are a major concern at international airports due to the various fraud operations identified. To curb tampering with genuine visas, the visa should contain human identification information. A biometric characteristic is a common and reliable way to authenticate the identity of an individual [1]. A Multimodal Biometric Human Identification System (MBHIS) that integrates the iris code, DNA fingerprint, and passport number into the visa photograph using a digital watermarking scheme is presented. The digital watermarking technique is well suited for any system requiring high security [2]. Ophthalmologists [3], [4], [5] suggested that the iris scan is an accurate and nonintrusive optical fingerprint. A DNA sequence can be used as a genetic barcode [6], [7]. When a visa is issued at a US consulate, the DNA sequence isolated from saliva, the iris code, and the passport number shall be digitally watermarked into the visa photograph. This information is also recorded in the 'immigrant database'. A 'forward watermarking phase' combines a 2-D DWT transformed digital photograph with the personal identification information. A 'detection phase' extracts the watermarked information from the visa photograph at the port of entry, from which the iris code can be used for identification and the DNA biometric for authentication if an anomaly arises.
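
    A minimal sketch of embedding identification bits into the 2-D DWT coefficients of a photograph (not the paper's exact scheme; it assumes the PyWavelets package and a non-blind extraction that compares against the unmarked image) is shown below.

        # Minimal sketch; embedding strength and wavelet choice are illustrative.
        import numpy as np
        import pywt

        def embed_bits(gray_image, bits, strength=4.0):
            # Shift the first len(bits) diagonal-detail coefficients up or down per bit.
            cA, (cH, cV, cD) = pywt.dwt2(gray_image.astype(float), "haar")
            flat = cD.flatten()
            flat[: len(bits)] += strength * (2 * np.asarray(bits, dtype=float) - 1.0)
            cD_marked = flat.reshape(cD.shape)
            return pywt.idwt2((cA, (cH, cV, cD_marked)), "haar")

        def extract_bits(marked_image, original_image, n_bits):
            # Non-blind extraction: compare detail coefficients against the unmarked image.
            _, (_, _, cD_marked) = pywt.dwt2(marked_image.astype(float), "haar")
            _, (_, _, cD_orig) = pywt.dwt2(original_image.astype(float), "haar")
            diff = (cD_marked - cD_orig).flatten()[:n_bits]
            return (diff > 0).astype(int).tolist()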

  18. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  19. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 6, November/December 2013

    DTIC Science & Technology

    2013-12-01

    Automated scanning, which includes automated code-review tools, allows the expert to monitor the system and assess requirements during sprint planning. This enables the validator to leverage the test results for formal validation and verification, and to perform a shortened "hybrid" style of IV&V within sprint cycles of 1-4 weeks, delivering a product to the user with its security posture assessed and accredited to field and operate.

  20. The pros and cons of code validation

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1988-01-01

    Computational and wind tunnel error sources are examined and quantified using specific calculations of experimental data, and a substantial comparison of theoretical and experimental results, or code validation, is discussed. Wind tunnel error sources considered include wall interference, sting effects, Reynolds number effects, flow quality and transition, and instrumentation such as strain gage balances, electronically scanned pressure systems, hot film gages, hot wire anemometers, and laser velocimeters. Computational error sources include the math model equation set, the solution algorithm, artificial viscosity/dissipation, boundary conditions, the uniqueness of solutions, grid resolution, turbulence modeling, and Reynolds number effects. It is concluded that, although improvements in theory are being made more quickly than in experiments, wind tunnel research has the advantage of the more realistic transition process of a free-transition test over reliance on choosing the right turbulence model.

  1. GPU accelerated manifold correction method for spinning compact binaries

    NASA Astrophysics Data System (ADS)

    Ran, Chong-xi; Liu, Song; Zhong, Shuang-ying

    2018-04-01

    The graphics processing unit (GPU) acceleration of the manifold correction algorithm based on the compute unified device architecture (CUDA) technology is designed to simulate the dynamic evolution of the Post-Newtonian (PN) Hamiltonian formulation of spinning compact binaries. The feasibility and the efficiency of parallel computation on the GPU have been confirmed by various numerical experiments. The numerical comparisons show that the accuracy of the manifold correction method executed on the GPU agrees well with that of the same codes executed on the central processing unit (CPU) alone. The acceleration achievable when the codes are implemented on the GPU can be increased enormously through the use of shared memory and register optimization techniques without additional hardware costs; the speedup is nearly 13 times that of the codes executed on the CPU for a phase space scan (including 314 × 314 orbits). In addition, the GPU-accelerated manifold correction method is used to study numerically how the dynamics are affected by the spin-induced quadrupole-monopole interaction for black hole binary systems.

  2. Why hard-nosed executives should care about management theory.

    PubMed

    Christensen, Clayton M; Raynor, Michael E

    2003-09-01

    Theory often gets a bum rap among managers because it's associated with the word "theoretical," which connotes "impractical." But it shouldn't. Because experience is solely about the past, solid theories are the only way managers can plan future actions with any degree of confidence. The key word here is "solid." Gravity is a solid theory. As such, it lets us predict that if we step off a cliff we will fall, without actually having to do so. But business literature is replete with theories that don't seem to work in practice or actually contradict each other. How can a manager tell a good business theory from a bad one? The first step is understanding how good theories are built. They develop in three stages: gathering data, organizing it into categories highlighting significant differences, then making generalizations explaining what causes what, under which circumstances. For instance, professor Ananth Raman and his colleagues collected data showing that bar code-scanning systems generated notoriously inaccurate inventory records. These observations led them to classify the types of errors the scanning systems produced and the types of shops in which those errors most often occurred. Recently, some of Raman's doctoral students have worked as clerks to see exactly what kinds of behavior cause the errors. From this foundation, a solid theory predicting under which circumstances bar code systems work, and don't work, is beginning to emerge. Once we forgo one-size-fits-all explanations and insist that a theory describes the circumstances under which it does and doesn't work, we can bring predictable success to the world of management.

  3. Exploring Cognition Using Software Defined Radios for NASA Missions

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Reinhart, Richard C.

    2016-01-01

    NASA missions typically operate using a communication infrastructure that requires significant schedule planning and offers limited flexibility when the needs of the mission change. Parameters such as modulation, coding scheme, frequency, and data rate are fixed for the life of the mission. This is due to antiquated hardware and software for both the space and ground assets and a very complex set of mission profiles. Automated techniques already in place at commercial telecommunication companies are being explored by NASA to determine whether they can be used to reduce cost and increase science return. Adding cognition, the ability to learn from past decisions and adjust behavior, is also being investigated. Software Defined Radios are an ideal way to implement cognitive concepts. Cognition can be considered in many different aspects of the communication system. Radio functions, such as frequency, modulation, data rate, coding, and filters, can be adjusted based on measurements of signal degradation. Data delivery mechanisms and route changes based on past successes and failures can be made to deliver the data to the end user more efficiently. Automated antenna pointing can be added to improve gain or coverage, or to adjust the target. Scheduling improvements and automation to reduce the dependence on humans provide more flexible capabilities. The Cognitive Communications project, funded by the Space Communication and Navigation Program, is exploring these concepts and using the SCaN Testbed on board the International Space Station to implement them as they evolve. The SCaN Testbed contains three Software Defined Radios and a flight computer. These four computing platforms, along with a tracking antenna system and the supporting ground infrastructure, will be used to implement various concepts in a system similar to those used by missions. Multiple universities and SBIR companies are supporting this investigation. This paper will describe the cognitive system ideas under consideration and the plan for implementing them on platforms, including the SCaN Testbed. Discussions in the paper will include how these concepts might be used to reduce cost and improve the science return for NASA missions.

  4. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: FIELD FORMS (UA-D-37.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for selected field forms. Forms addressed here will be scanned into databases; databases are created because the forms contain critical values needed to calculate pollutant concentrations. Other forms not addressed by thi...

  5. Beam distribution reconstruction simulation for electron beam probe

    NASA Astrophysics Data System (ADS)

    Feng, Yong-Chun; Mao, Rui-Shi; Li, Peng; Kang, Xin-Cai; Yin, Yan; Liu, Tong; You, Yao-Yao; Chen, Yu-Cong; Zhao, Tie-Cheng; Xu, Zhi-Guo; Wang, Yan-Yu; Yuan, You-Jin

    2017-07-01

    An electron beam probe (EBP) is a detector which makes use of a low-intensity, low-energy electron beam to measure the transverse profile, bunch shape, beam neutralization and beam wake field of an intense beam with small dimensions. While it can be applied to many aspects, we limit our analysis to beam distribution reconstruction. This kind of detector is almost non-interceptive for the whole beam and does not disturb the machine environment. In this paper, we present the theoretical aspects behind this technique for beam distribution measurement and some simulation results for the detector involved. First, a method to obtain a parallel electron beam is introduced and a simulation code is developed. An EBP as a profile monitor for dense beams is then simulated using the fast scan method for various target beam profiles, including the KV, waterbag, parabolic, Gaussian and halo distributions. Profile reconstruction from the deflected electron beam trajectory is implemented and compared with the actual profile, and the expected agreement is achieved. Furthermore, in addition to the fast scan, a slow (step-by-step) scan is considered, which lowers the requirements on the hardware, i.e., the radio-frequency deflector. We calculate the three-dimensional electric field of a Gaussian distribution and simulate the electron motion in this field. In addition, a fast scan along the target beam direction combined with a slow scan across the beam is also presented, which can provide a measurement of the longitudinal distribution as well as the transverse profile simultaneously. As an example, simulation results for the China Accelerator Driven Sub-critical System (CADS) and the High Intensity Heavy Ion Accelerator Facility (HIAF) are given. Finally, a potential system design for an EBP is described.

  6. Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.

    PubMed

    Haramija, Marko

    2018-03-01

    Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
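
    A minimal sketch of two such SIS functions, independent of instrument vendor and assuming spectra are available as simple precursor/fragment m/z lists, is shown below; the m/z values and tolerance are illustrative.

        # Minimal sketch of software ion scan functions over vendor-independent spectra,
        # each represented as a dict with a precursor m/z and a list of fragment m/z values.
        def software_precursor_ion_scan(spectra, diagnostic_mz, tol=0.02):
            # Keep spectra whose fragments contain a diagnostic ion (e.g., an oxonium ion).
            return [s for s in spectra
                    if any(abs(f - diagnostic_mz) <= tol for f in s["fragments"])]

        def software_neutral_loss_scan(spectra, loss_mass, tol=0.02):
            # Keep spectra showing a fixed precursor-to-fragment neutral loss.
            return [s for s in spectra
                    if any(abs((s["precursor_mz"] - f) - loss_mass) <= tol
                           for f in s["fragments"])]

        # Illustrative data: select spectra showing a 162.05 Da loss (hexose).
        spectra = [{"precursor_mz": 933.40, "fragments": [771.35, 204.09]},
                   {"precursor_mz": 820.30, "fragments": [512.20, 366.14]}]
        print(software_neutral_loss_scan(spectra, 162.05))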

  7. Development and Implementation of the Waste Management Information System to Support Hanford's River Corridor Cleanup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolan, L. M.

    2006-07-01

    This paper describes the development of a Waste Information Management System (WMIS) to support the waste designation, transportation, and disposal processes used by Washington Closure Hanford, LLC to support cleanup of the Columbia River Corridor. This waste, primarily consisting of remediated burial sites and building demolition debris, is disposed at the Environmental Restoration Disposal Facility (ERDF), which is located in the center of the Hanford Site (an approximately 1460 square kilometers site). WMIS uses a combination of bar-code scanning, hand-held computers, and strategic employment of a radio frequency identification (RFID) tag system to track each waste shipment from waste generation to disposal. (authors)

  8. Polar synthetic imaging

    NASA Astrophysics Data System (ADS)

    George, Jonathan K.

    2013-05-01

    In the search for low-cost wide spectrum imagers it may become necessary to sacrifice the expense of the focal plane array and revert to a scanning methodology. In many cases the sensor may be too unwieldy to physically scan and mirrors may have adverse effects on particular frequency bands. In these cases, photonic masks can be devised to modulate the incoming light field with a code over time. This is in essence code-division multiplexing of the light field into a lower dimension channel. In this paper a simple method for modulating the light field with masks of the Archimedes' spiral is presented and a mathematical model of the two-dimensional mask set is developed.
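
    As a rough illustration of the masking idea (not the author's mathematical model), the snippet below rasterizes a binary Archimedean-spiral mask r = a·θ on a grid and applies it to a stand-in light field; stepping the rotation over time produces the sequence of coded measurements. All parameters are arbitrary.

```python
import numpy as np

def spiral_mask(n=256, a=2.0, width=3.0, turns=6, rotation=0.0):
    """Binary Archimedean-spiral mask on an n x n grid. Pixels within 'width'
    pixels of the curve r = a * theta are open; 'rotation' (radians) rotates
    the spiral, e.g. to step the code between measurements."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2].astype(float)
    r = np.hypot(x, y)
    phi = np.arctan2(y, x) - rotation
    mask = np.zeros((n, n), dtype=bool)
    for k in range(turns):                 # each full turn adds 2*pi to theta
        theta = phi + 2 * np.pi * k
        mask |= (np.abs(r - a * theta) < width) & (theta > 0)
    return mask.astype(np.uint8)

# One coded measurement: total flux of the scene through the rotated mask.
scene = np.random.rand(256, 256)           # stand-in for the incoming light field
measurement = float((scene * spiral_mask(rotation=0.3)).sum())
```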

  9. Towards automation of palynology 1: analysis of pollen shape and ornamentation using simple geometric measures, derived from scanning electron microscope images

    NASA Astrophysics Data System (ADS)

    Treloar, W. J.; Taylor, G. E.; Flenley, J. R.

    2004-12-01

    This is the first of a series of papers on the theme of automated pollen analysis. The automation of pollen analysis could bring numerous advantages for the reconstruction of past environments, making larger data sets practical and providing objectivity and fine-resolution sampling. There are also applications in apiculture and medicine. Previous work on the classification of pollen using texture measures has been successful with small numbers of pollen taxa. However, as the number of pollen taxa to be identified increases, more features may be required to achieve a successful classification. This paper describes the use of simple geometric measures to augment the texture measures. The feasibility of this new approach is tested using scanning electron microscope (SEM) images of 12 taxa of fresh pollen taken from reference material collected on Henderson Island, Polynesia. Pollen images were captured directly from an SEM connected to a PC. A threshold grey-level was set and binary images were then generated. Pollen edges were then located and the boundaries were traced using a chain coding system. A number of simple geometric variables were calculated directly from the chain code of the pollen and a variable selection procedure was used to choose the optimal subset to be used for classification. The efficiency of these variables was tested using a leave-one-out classification procedure. The system successfully split the original 12 taxa sample into five sub-samples containing no more than six pollen taxa each. The further subdivision of echinate pollen types was then attempted with a subset of four pollen taxa. A set of difference codes was constructed for a range of displacements along the chain code. From these difference codes probability variables were calculated. A variable selection procedure was again used to choose the optimal subset of probabilities that may be used for classification. The efficiency of these variables was again tested using a leave-one-out classification procedure. The proportion of correctly classified pollen ranged from 81% to 100% depending on the subset of variables used. The best set of variables had an overall classification rate averaging about 95%. This is comparable with the classification rates from the earlier texture analysis work for other types of pollen.
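
    The sketch below shows the chain-code idea the abstract relies on: an ordered, 8-connected boundary is converted to Freeman direction codes, and difference codes for a chosen displacement along the chain are derived from them. It assumes the boundary points have already been ordered by a contour tracer; it is an illustration of the encoding, not the paper's feature-selection pipeline.

```python
# Freeman 8-direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
STEP_TO_CODE = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(boundary):
    """Chain code of an ordered, 8-connected closed boundary of (x, y) points."""
    codes = []
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:] + boundary[:1]):
        codes.append(STEP_TO_CODE[(x1 - x0, y1 - y0)])
    return codes

def difference_code(codes, displacement=1):
    """Difference code (mod 8) for a given displacement along the chain,
    the kind of rotation-tolerant descriptor used for the echinate taxa."""
    n = len(codes)
    return [(codes[(i + displacement) % n] - codes[i]) % 8 for i in range(n)]

# Toy boundary: a 2 x 2 pixel square traced counter-clockwise.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
cc = chain_code(square)
print(cc)                      # [0, 0, 2, 2, 4, 4, 6, 6]
print(difference_code(cc, 1))  # [0, 2, 0, 2, 0, 2, 0, 2]
```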

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanehira, T; Sutherland, K; Matsuura, T

    Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system, using treatment planning system VQA (Hitachi Ltd., Tokyo) spot position data, was developed based on Geant4. Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files, specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in Tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: a two-beam horizontal opposed plan (90 and 270 degrees) and a three-beam plan (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the effect of dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot scanned proton therapy.
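
    The conversion step described above (plan text file in, per-spot magnet settings out) can be pictured with the short sketch below. The file format and the linear position-to-field calibration are illustrative assumptions only; the actual VQA export format and the Hokkaido nozzle calibration are not public in this abstract.

```python
def read_spot_plan(path, tesla_per_mm=0.0005):
    """Read a spot-scanning plan from a text file (hypothetical format:
    one spot per line, 'energy_MeV  x_mm  y_mm  weight') and convert the
    lateral spot positions to steering-magnet field strengths with a simple
    linear calibration. Both the format and the factor are placeholders."""
    spots = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            energy, x_mm, y_mm, weight = map(float, line.split())
            spots.append({
                'energy_MeV': energy,
                'bx_T': tesla_per_mm * y_mm,   # magnet deflecting vertically
                'by_T': tesla_per_mm * x_mm,   # magnet deflecting horizontally
                'weight': weight,
            })
    return spots
```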

  11. Structured physician order entry for trauma CT: value in improving clinical information transfer and billing efficiency.

    PubMed

    Wortman, Jeremy R; Goud, Asha; Raja, Ali S; Marchello, Dana; Sodickson, Aaron

    2014-12-01

    The purpose of this study was to measure the effects of use of a structured physician order entry system for trauma CT on the communication of clinical information and on coding practices and reimbursement efficiency. This study was conducted between April 1, 2011, and January 14, 2013, at a level I trauma center with 59,000 annual emergency department visits. On March 29, 2012, a structured order entry system was implemented for head through pelvis trauma CT, so-called pan-scan CT. This study compared the following factors before and after implementation: communication of clinical signs and symptoms and mechanism of injury, primary International Classification of Diseases, 9th revision, Clinical Modification (ICD-9-CM) code category, success of reimbursement, and time required for successful reimbursement for the examination. Chi-square statistics were used to compare all categoric variables before and after the intervention, and the Wilcoxon rank sum test was used to compare billing cycle times. A total of 457 patients underwent pan-scan CT in 2734 distinct examinations. After the intervention, there was a 62% absolute increase in requisitions containing clinical signs or symptoms (from 0.4% to 63%, p<0.0001) and a 99% absolute increase in requisitions providing mechanism of injury (from 0.4% to 99%, p<0.0001). There was a 19% absolute increase in primary ICD-9-CM codes representing clinical signs or symptoms (from 2.9% to 21.8%, p<0.0001), and a 7% absolute increase in reimbursement success for examinations submitted to insurance carriers (from 83.0% to 89.7%, p<0.0001). For reimbursed studies, there was a 14.7-day reduction in mean billing cycle time (from 68.4 days to 53.7 days, p=0.008). Implementation of structured physician order entry for trauma CT was associated with significant improvement in the communication of clinical history to radiologists. The improvement was also associated with changes in coding practices, greater billing efficiency, and an increase in reimbursement success.

  12. Positron Scanner for Locating Brain Tumors

    DOE R&D Accomplishments Database

    Rankowitz, S.; Robertson, J. S.; Higinbotham, W. A.; Rosenblum, M. J.

    1962-03-01

    A system is described that makes use of positron emitting isotopes for locating brain tumors. This system inherently provides more information about the distribution of radioactivity in the head in less time than existing scanners which use one or two detectors. A stationary circular array of 32 scintillation detectors scans a horizontal layer of the head from many directions simultaneously. The data, consisting of the number of counts in all possible coincidence pairs, are coded and stored in the memory of a Two-Dimensional Pulse-Height Analyzer. A unique method of displaying and interpreting the data is described that enables rapid approximate analysis of complex source distribution patterns. (auth)

  13. Computerized bar code-based blood identification systems and near-miss transfusion episodes and transfusion errors.

    PubMed

    Nuttall, Gregory A; Abenstein, John P; Stubbs, James R; Santrach, Paula; Ereth, Mark H; Johnson, Pamela M; Douglas, Emily; Oliver, William C

    2013-04-01

    To determine whether the use of a computerized bar code-based blood identification system resulted in a reduction in transfusion errors or near-miss transfusion episodes. Our institution instituted a computerized bar code-based blood identification system in October 2006. After institutional review board approval, we performed a retrospective study of transfusion errors from January 1, 2002, through December 31, 2005, and from January 1, 2007, through December 31, 2010. A total of 388,837 U were transfused during the 2002-2005 period. There were 6 misidentification episodes of a blood product being transfused to the wrong patient during that period (incidence of 1 in 64,806 U or 1.5 per 100,000 transfusions; 95% CI, 0.6-3.3 per 100,000 transfusions). There was 1 reported near-miss transfusion episode (incidence of 0.3 per 100,000 transfusions; 95% CI, <0.1-1.4 per 100,000 transfusions). A total of 304,136 U were transfused during the 2007-2010 period. There was 1 misidentification episode of a blood product transfused to the wrong patient during that period when the blood bag and patient's armband were scanned after starting to transfuse the unit (incidence of 1 in 304,136 U or 0.3 per 100,000 transfusions; 95% CI, <0.1-1.8 per 100,000 transfusions; P=.14). There were 34 reported near-miss transfusion errors (incidence of 11.2 per 100,000 transfusions; 95% CI, 7.7-15.6 per 100,000 transfusions; P<.001). Institution of a computerized bar code-based blood identification system was associated with a large increase in discovered near-miss events. Copyright © 2013 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  14. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: FIELD FORMS (UA-D-37.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for selected field forms. Forms addressed here will be scanned into databases. Databases are created because the forms contain critical values needed to calculate pollutant concentrations. Other forms not addressed by th...

  15. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smart-phone. The scanned information is first decoded to obtain the location of a web-server which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidics channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in presence of immobilized gold nanorods. In this paper we demonstrate the proof-of-concept detection using prototypes of QR encoded FEC biosensors.
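
    The forward error correction the sensor leans on is the Reed-Solomon redundancy built into the QR standard: at the highest level (H), roughly 30% of the modules can be missing or corrupted and the code still decodes, which is what tolerates imperfect self-assembly of individual spots. The snippet below only illustrates choosing that redundancy level with the third-party Python `qrcode` package; it does not model the paper's microfluidic or nanorod self-assembly process, and the encoded URL is a hypothetical placeholder.

```python
# Requires the third-party 'qrcode' package (pip install "qrcode[pil]").
import qrcode

qr = qrcode.QRCode(
    version=None,                                        # let the library pick the size
    error_correction=qrcode.constants.ERROR_CORRECT_H,   # ~30% module damage tolerance
    box_size=10,
    border=4,
)
qr.add_data("https://example.org/assay?id=DEMO-001")     # hypothetical web-server URL
qr.make(fit=True)
qr.make_image(fill_color="black", back_color="white").save("fec_biosensor_tag.png")
```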

  16. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients’ true scores followed by a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793

  17. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    PubMed

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients' true scores followed by a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access.
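
    The efficiency gain of CAT comes from always administering the item that is most informative at the current ability estimate and stopping early. The sketch below shows that loop for a dichotomous Rasch model with a one-step Newton update of theta; it is a simplified stand-in for the authors' Excel implementation, which used the partial credit model and different stopping rules, and all item difficulties are simulated.

```python
import numpy as np

def rasch_prob(theta, b):
    """P(endorse) for a dichotomous Rasch item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def item_information(theta, b):
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def cat_session(true_theta, difficulties, max_items=10, rng=None):
    """Minimal CAT loop: pick the most informative unused item, simulate a
    response, then take one Newton-Raphson step on the theta log-likelihood."""
    rng = rng or np.random.default_rng(0)
    theta, asked, responses = 0.0, [], []
    for _ in range(max_items):
        remaining = [i for i in range(len(difficulties)) if i not in asked]
        nxt = max(remaining, key=lambda i: item_information(theta, difficulties[i]))
        asked.append(nxt)
        responses.append(rng.random() < rasch_prob(true_theta, difficulties[nxt]))
        b = np.array([difficulties[i] for i in asked])
        p = rasch_prob(theta, b)
        score = np.sum(np.array(responses) - p)   # observed minus expected
        info = np.sum(p * (1.0 - p))
        theta += score / max(info, 1e-6)
    return theta, asked

est, used = cat_session(true_theta=0.8, difficulties=np.linspace(-2, 2, 70))
```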

  18. Improving radiopharmaceutical supply chain safety by implementing bar code technology.

    PubMed

    Matanza, David; Hallouard, François; Rioufol, Catherine; Fessi, Hatem; Fraysse, Marc

    2014-11-01

    The aim of this study was to describe and evaluate an approach for improving radiopharmaceutical supply chain safety by implementing bar code technology. We first evaluated the current situation of our radiopharmaceutical supply chain and, by means of the ALARM protocol, analysed two dispensing errors that occurred in our department. Thereafter, we implemented a bar code system to secure selected key stages of the radiopharmaceutical supply chain. Finally, we evaluated the cost of this implementation, including overtime, overheads, and additional radiation exposure to workers. An analysis of the events that occurred revealed a lack of identification of prepared or dispensed drugs. Moreover, the evaluation of the current radiopharmaceutical supply chain showed that the dispensation and injection steps needed to be further secured. The bar code system was used to reinforce product identification at three selected key stages: at usable stock entry; at preparation-dispensation; and during administration, allowing conformity between the labelling of the delivered product (identity and activity) and the prescription to be checked. The extra time needed for all these steps had no impact on the number and successful conduct of examinations. The investment cost was modest (2600 euros for new material and 30 euros a year for additional supplies) because computing equipment already existed. With regard to radiation exposure to workers, the labelling and scanning of radiolabelled preparation vials caused only an insignificant additional hand exposure under the new organization. Implementation of bar code technology is now an essential part of a global securing approach towards optimum patient management.

  19. Determination of Flaw Type and Location Using an Expert Module in Ultrasonic Nondestructive Testing for Weld Inspection

    NASA Astrophysics Data System (ADS)

    Shahriari, D.; Zolfaghari, A.; Masoumi, F.

    2011-01-01

    Nondestructive evaluation encompasses nondestructive testing, nondestructive inspection, and nondestructive examination. Its aim is to determine some characteristic of an object or to determine whether the object contains irregularities, discontinuities, or flaws. Ultrasound-based inspection techniques are used extensively throughout industry for the detection of flaws in engineering materials. The range and variety of imperfections encountered is large, and critical assessment of location, size, orientation and type is often difficult. In addition, the increasing quality requirements of new standards and codes of practice relating to fitness for purpose are placing higher demands on operators. Applying expert knowledge-based analysis to ultrasonic examination is a powerful tool that can help assure safety, quality, and reliability; increase productivity; decrease liability; and save money. In this research, an expert module is coupled with ultrasonic examination (A-scan procedure) to determine and evaluate the type and location of flaws embedded during welding. The processing module of this expert system is implemented on the basis of the EN standard to classify welding defects and acceptance conditions and to measure flaw locations via the echo static pattern and image processing. The designed module introduces a new system that automates the evaluation of A-scan results according to the EN standard. It can simultaneously recognize the number and type of defects and determine flaw positions during each scan.

  20. Modeling Of A Monocular, Full-Color, Laser-Scanning, Helmet-Mounted Display for Aviator Situational Awareness

    DTIC Science & Technology

    2017-03-27

    USAARL Report No. 2017-10: Modeling of a Monocular, Full-Color, Laser-Scanning, Helmet-Mounted Display for Aviator Situational Awareness. By Thomas... Final report covering 2002-2003, dated 27 March 2017. Central to the work was the idea of modeling HMDs by producing computer imagery for an observer to evaluate the quality of symbology. Keywords: HMD, ANVIS, HGU-56P, Virtual

  1. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field-of-view (FoV) scanning, which often relies on mechanical translation; this not only slows down the measurement, but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. In order to achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning, controlled through on/off state coding of the DMD mirrors. Measurements on biological samples as well as a USAF resolution target demonstrate high-resolution quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, the DMD-based PIE technique is believed to provide a potential solution for medical observation and measurement.
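
    The reconstruction behind any PIE variant is an iterative object update from the measured diffraction amplitudes at each scan position. The snippet below shows one textbook ePIE-style update step (Maiden-Rodenburg form); it is a generic sketch, not the DMD-specific coding scheme of this paper, and array sizes and the step parameter alpha are assumptions.

```python
import numpy as np

def epie_update(obj, probe, diffraction_amp, shift, alpha=1.0):
    """One ePIE-style object update at a single scan position. 'obj' and
    'probe' are complex arrays; 'diffraction_amp' is the measured Fourier
    amplitude; 'shift' is the top-left corner of the probe on the object grid.
    The probe-sized view of 'obj' is updated in place."""
    sy, sx = shift
    py, px = probe.shape
    view = obj[sy:sy + py, sx:sx + px]
    exit_wave = view * probe
    fw = np.fft.fft2(exit_wave)
    # Enforce the measured modulus, keep the computed phase.
    fw = diffraction_amp * np.exp(1j * np.angle(fw))
    corrected = np.fft.ifft2(fw)
    diff = corrected - exit_wave
    view += alpha * np.conj(probe) * diff / (np.abs(probe).max() ** 2 + 1e-12)
    return obj
```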

  2. Spatial Angular Compounding Technique for H-Scan Ultrasound Imaging.

    PubMed

    Khairalseed, Mawia; Xiong, Fangyuan; Kim, Jung-Whan; Mattrey, Robert F; Parker, Kevin J; Hoyt, Kenneth

    2018-01-01

    H-Scan is a new ultrasound imaging technique that relies on matching a model of pulse-echo formation to the mathematics of a class of Gaussian-weighted Hermite polynomials. This technique may be beneficial in the measurement of relative scatterer sizes and in cancer therapy, particularly for early response to drug treatment. Because current H-scan techniques use focused ultrasound data acquisitions, spatial resolution degrades away from the focal region and inherently affects relative scatterer size estimation. Although the resolution of ultrasound plane wave imaging can be inferior to that of traditional focused ultrasound approaches, the former exhibits a homogeneous spatial resolution throughout the image plane. The purpose of this study was to implement H-scan using plane wave imaging and investigate the impact of spatial angular compounding on H-scan image quality. Parallel convolution filters using two different Gaussian-weighted Hermite polynomials that describe ultrasound scattering events are applied to the radiofrequency data. The H-scan processing is done on each radiofrequency image plane before averaging to get the angular compounded image. The relative strength from each convolution is color-coded to represent relative scatterer size. Given results from a series of phantom materials, H-scan imaging with spatial angular compounding more accurately reflects the true scatterer size caused by reductions in the system point spread function and improved signal-to-noise ratio. Preliminary in vivo H-scan imaging of tumor-bearing animals suggests this modality may be useful for monitoring early response to chemotherapeutic treatment. Overall, H-scan imaging using ultrasound plane waves and spatial angular compounding is a promising approach for visualizing the relative size and distribution of acoustic scattering sources. Copyright © 2018 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
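
    The core operation described above is a pair of convolutions of the beamformed RF data with low- and high-order Gaussian-weighted Hermite kernels, whose relative output strength is then colour-coded. The sketch below illustrates that matched-filter step only; the kernel orders, normalisation, envelope detection and the angular-compounding details follow the paper, not this snippet, and the RF array is random stand-in data.

```python
import numpy as np
from scipy.special import hermite
from scipy.ndimage import convolve1d

def gh_kernel(order, n_pts=65, scale=4.0):
    """Gaussian-weighted Hermite kernel GH_n(t) = H_n(t) * exp(-t^2),
    sampled symmetrically and normalised to unit energy."""
    t = np.linspace(-scale, scale, n_pts)
    k = hermite(order)(t) * np.exp(-t ** 2)
    return k / np.linalg.norm(k)

def hscan_channels(rf, low_order=2, high_order=8):
    """Convolve each RF line (row) with a low- and a high-order GH kernel;
    the relative magnitudes are what get colour-coded in H-scan images."""
    low = convolve1d(rf, gh_kernel(low_order), axis=1, mode='nearest')
    high = convolve1d(rf, gh_kernel(high_order), axis=1, mode='nearest')
    return np.abs(low), np.abs(high)

rf = np.random.randn(128, 2048)            # stand-in for beamformed RF data
red, blue = hscan_channels(rf)
relative_size = red / (red + blue + 1e-9)  # 0..1 map; larger = bigger scatterers
```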

  3. Incorporating 3-dimensional models in online articles.

    PubMed

    Cevidanes, Lucia H S; Ruellas, Antonio C O; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-05-01

    The aims of this article are to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article's online version for viewing and downloading using the reader's software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader's software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. When submitting manuscripts, authors can now upload 3D models that will allow readers to interact with or download them. Such interaction with 3D models in online articles now will give readers and authors better understanding and visualization of the results. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  4. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying

    2015-06-15

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron–Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes estimated were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. 13-cm in diameter, 10-cm long hemi-ellipsoidal homogeneous phantoms were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with an area of 2.5 × 2.5 cm² field of view at the isocenter plane and to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical measurements were 0.97 ± 0.03 and 1.10 ± 0.13, respectively, indicating that the accuracy of the Monte Carlo simulation was adequate. The normalized AGD with VOI field scans was substantially reduced by a factor of about 2 over the VOI region and by a factor of 18 over the entire breast for both 25% and 50% VGF simulated breasts compared with the normalized AGD with full field scans. The normalized AGD for the VOI breast CT technique can be kept the same as or lower than that for a full field scan with the exposure level for the VOI field scan increased by a factor of as much as 12. Conclusions: The authors’ Monte Carlo estimates of normalized AGDs for the VOI breast CT technique show that this technique can be used to markedly increase the dose to the breast and thus the visibility of the VOI region without increasing the dose to the breast. The results of this investigation should be helpful for those interested in using VOI breast CT technique to image small calcifications with dose concern.

  5. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    PubMed Central

    Lai, Chao-Jen; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C.

    2015-01-01

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron–Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes estimated were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. 13-cm in diameter, 10-cm long hemi-ellipsoidal homogeneous phantoms were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with an area of 2.5 × 2.5 cm2 field of view at the isocenter plane and to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical measurements were 0.97 ± 0.03 and 1.10 ± 0.13, respectively, indicating that the accuracy of the Monte Carlo simulation was adequate. The normalized AGD with VOI field scans was substantially reduced by a factor of about 2 over the VOI region and by a factor of 18 over the entire breast for both 25% and 50% VGF simulated breasts compared with the normalized AGD with full field scans. The normalized AGD for the VOI breast CT technique can be kept the same as or lower than that for a full field scan with the exposure level for the VOI field scan increased by a factor of as much as 12. Conclusions: The authors’ Monte Carlo estimates of normalized AGDs for the VOI breast CT technique show that this technique can be used to markedly increase the dose to the breast and thus the visibility of the VOI region without increasing the dose to the breast. The results of this investigation should be helpful for those interested in using VOI breast CT technique to image small calcifications with dose concern. PMID:26127058

  6. G-scan--mobile multiview 3-D measuring system for the analysis of the face.

    PubMed

    Kopp, S; Kühmstedt, P; Notni, G; Geller, R

    2003-10-01

    The development of optical 3-D measuring techniques and their use in industrial quality assurance, in design, and for rapid prototyping has experienced strong growth. A large number of optical 3-D measuring methods and systems are on the market in dentistry. CAD/CAM production has become firmly established in dental medicine, not least due to the systematic introduction of the Cerec technique and the digiDent method. The scanners on which these technologies are based are designed for a relatively small measuring area. To be able to measure and three-dimensionally assess the face--and the numerous changes in the face/forehead/neck region--it was necessary to design and develop a self-calibrating measuring system with Gray code for clinical use: the G-Scan measuring system. Objects up to a size of 500 x 500 x 400 mm can be acquired three-dimensionally with it, with a measuring inaccuracy of 10 to 70 µm in a typical measuring time of 15 s. The present article describes the measuring principle, the system parameters, and the features of the new measuring system, and illustrates the measuring results on 3-D displays of the face in static occlusion and in functional occlusion positions.
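
    Gray-code structured light encodes each projector column as a sequence of binary stripe patterns in which adjacent columns differ by exactly one bit, so a single thresholding error at a stripe edge costs at most one code step. The sketch below generates such patterns and decodes a column index; it omits the camera thresholding and calibration steps of the actual G-Scan system and is a generic illustration only.

```python
import numpy as np

def gray_code_patterns(width, n_bits=None):
    """Vertical stripe patterns encoding each projector column with a
    reflected binary (Gray) code. Returns an array of shape (n_bits, width)."""
    if n_bits is None:
        n_bits = int(np.ceil(np.log2(width)))
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                     # binary -> Gray conversion
    return np.array([(gray >> (n_bits - 1 - b)) & 1 for b in range(n_bits)],
                    dtype=np.uint8)               # most significant bit first

def decode_column(bits):
    """Recover a column index from the per-pixel decoded bit sequence."""
    gray = 0
    for b in bits:
        gray = (gray << 1) | int(b)
    col, shift = gray, 1
    while (gray >> shift) > 0:                    # Gray -> binary conversion
        col ^= gray >> shift
        shift += 1
    return col

patterns = gray_code_patterns(1024)               # 10 patterns for 1024 columns
assert decode_column(patterns[:, 257]) == 257
```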

  7. Exe-Guard Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Rhett; Marshall, Tim; Chavez, Adrian

    The exe-Guard Project is an alliance between Dominion Virginia Power (DVP), Sandia National Laboratories (SNL), Dartmouth University, and Schweitzer Engineering Laboratories (SEL). SEL is the primary recipient on this project. The exe-Guard project was selected for award under DE-FOA-0000359 with CFDA number 81.122 to address Topic Area of Interest 4: Hardened Platforms and Systems. The exe-Guard project developed an antivirus solution for control system embedded devices to prevent the execution of unauthorized code and maintain settings and configuration integrity. This project created a white list antivirus solution for control systems capable of running on embedded Linux® operating systems. White list antivirus methods allow only credible programs to run through the use of digital signatures and hash functions. Once a system’s secure state is baselined, white list antivirus software denies deviations from that state caused by the installation of malicious code, as this changes the hash results. Black list antivirus software has been effective in traditional IT environments but has negative implications for control systems. Black list antivirus uses pattern matching and behavioral analysis to identify system threats while relying on regular updates to the signature file and recurrent system scanning. Black list antivirus is vulnerable to zero-day exploits which have not yet been incorporated into a signature file update. System scans hamper the performance of high availability applications, as revealed in NIST special publication 1058, which summarizes the impact of blacklist antivirus on control systems: manual or “on-demand” scanning has a major effect on control processes in that it takes CPU time needed by the control process (sometimes close to 100% of CPU time); minimizing the antivirus software throttle setting will reduce but not eliminate this effect; signature updates can also take up to 100% of CPU time, but for a much shorter period than a typical manual scanning process. Control systems are vulnerable to performance losses if off-the-shelf blacklist antivirus solutions aren’t implemented with care. This investment in configuration, in addition to constant decommissioning to perform manual signature file updates, is unprecedented and impractical. Additionally, control systems are often disconnected or islanded from the network, making the delivery of signature updates difficult. The exe-Guard project developed a white list antivirus solution that mitigates the above drawbacks and allows control systems to cost-effectively apply malware protection while maintaining high reliability. The application of security patches can also be minimized, since white listing maintains constant defense against unauthorized code execution. Security patches can instead be applied at less frequent intervals, when system decommissioning can be scheduled and planned for. Since control systems are less dynamic than IT environments, maintaining a secure baselined state is more practical. Because upgrades are performed at infrequent, calculated intervals, a new security baseline can be established before the system is returned to service. Exe-Guard built on the efforts of SNL under the Code Seal project, in which SNL demonstrated prototype Trust Anchors: independent monitoring and control devices that can be integrated into untrustworthy components. The exe-Guard team started with the lessons learned under this project and then designed a commercial solution for white list malware protection. Malware is a real threat, even on islanded or un-networked installations, since operators can unintentionally install infected files, plug in infected mass storage devices, or infect a piece of equipment on the islanded local area network that can then spread to other connected equipment. Protection at the device level is one of the last layers of defense in a security-in-depth defense model before an asset becomes compromised. This project provided a non-destructive intrusion, isolation, and automated response solution, achieving a goal of the Department of Energy (DOE) Roadmap to Secure Control Systems. It also addressed CIP-007-R4, which requires asset owners to employ malicious software prevention tools on assets within the electronic security perimeter. In addition, the CIP-007-R3 requirement for security patch management is minimized because white listing narrows the impact of vulnerabilities and patch releases. The exe-Guard Project completed all tasks identified in the statement of project objectives and identified additional tasks within scope that were performed and completed within the original budget. The cost share was met and all deliverables were successfully completed and submitted on time. Most importantly, the technology developed and commercialized under this project has been adopted by the energy sector, and thousands of devices with exe-Guard technology integrated in them have now been deployed and are protecting our power systems today.
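
    The whitelist principle described above reduces to: baseline the hashes (or signatures) of every authorized executable, then allow execution only when a file's current hash matches its baseline. The sketch below shows that check in ordinary user-space Python with hypothetical paths; it is an illustration of the concept only, not the exe-Guard product, whose enforcement lives in the embedded Linux platform and whose baselines are digitally signed.

```python
import hashlib
import os
import sys

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(chunk), b''):
            h.update(block)
    return h.hexdigest()

def build_whitelist(root):
    """Baseline a directory tree: map each file path to its hash. A real
    product would sign this baseline; here it is just an in-memory dict."""
    return {os.path.join(d, name): sha256_of(os.path.join(d, name))
            for d, _, files in os.walk(root) for name in files}

def is_authorized(path, whitelist):
    """Allow execution only if the file is known and its hash is unchanged."""
    return path in whitelist and sha256_of(path) == whitelist[path]

if __name__ == '__main__':
    baseline = build_whitelist('/usr/local/bin')            # hypothetical root
    target = sys.argv[1] if len(sys.argv) > 1 else '/usr/local/bin/example'
    print('authorized' if is_authorized(target, baseline) else 'denied')
```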

  8. An automatic gore panel mapping system

    NASA Technical Reports Server (NTRS)

    Shiver, John D.; Phelps, Norman N.

    1990-01-01

    The Automatic Gore Mapping System is being developed to reduce the time and labor costs associated with manufacturing the External Tank. The present chem-milling processes and procedures are discussed. Downloading of the system simulation has to be performed to verify that the simulation package will translate the simulation code into robot code. A simulation of this system also has to be programmed for a gantry robot instead of the articulating robot that is presently in the system. It was discovered using the simulation package that the articulating robot cannot reach all the points on some of the panels; therefore, when the system is ready for production, a gantry robot will be used. A hydrosensor system is also being developed to replace the point-to-point contact probe. The hydrosensor will allow the robot to perform a non-contact continuous scan of the panel. It will also provide a faster scan of the panel because it will eliminate the in-and-out movement required for the present end effector. The system software is currently being modified so that the hydrosensor will work with the system. The hydrosensor consists of a Krautkramer-Branson transducer encased in a plexiglass nozzle. The water stream pumped through the nozzle is the couplant for the probe. Also, software is being written so that the robot will have the ability to draw the contour lines on the panel displaying the out-of-tolerance regions. Presently the contour lines can only be displayed on the computer screens. Research is also being performed on improving and automating the method of scribing the panels. Presently the panels are manually scribed with a sharp knife. The use of a low-power laser or water jet is being studied as a method of scribing the panels. The contour drawing pen will be replaced with a scribing tool and the robot will then move along the contour lines. With these developments the Automatic Gore Mapping System will provide a reduction in the time and labor costs associated with manufacturing the External Tank. The system also has the potential of inspecting other manufactured parts.

  9. Laser beam propagation through bulk nonlinear media: Numerical simulation and experiment

    NASA Astrophysics Data System (ADS)

    Kovsh, Dmitriy I.

    This dissertation describes our efforts in modeling the propagation of high intensity laser pulses through optical systems consisting of one or multiple nonlinear elements. These nonlinear elements can be up to 10³ times thicker than the depth of focus of the laser beam, so that the beam size changes drastically within the medium. The set of computer codes developed is organized into a software package (NLO_BPM). The ultrafast nonlinearities of the bound-electronic n2 and two-photon absorption as well as time dependent excited-state, free-carrier and thermal nonlinearities are included in the codes for modeling propagation of picosecond to nanosecond pulses and pulse trains. Various cylindrically symmetric spatial distributions of the input beam are modeled. We use the cylindrical symmetry typical of laser outputs to reduce the CPU and memory requirements, making modeling a real-time task on PCs. The hydrodynamic equations describing the rarefaction of the medium due to heating and electrostriction are solved in the transient regime to determine refractive index changes on a nanosecond time scale. This effect can be simplified in some cases by an approximation that assumes an instantaneous expansion. We also find that the index change obtained from the photo-acoustic equation overshoots its steady-state value once the ratio between the pulse width and the acoustic transit time is greater than unity. We numerically study the sensitivity of the closed-aperture Z-scan experiment to nonlinear refraction for various input beam profiles. If the beam has a ring structure with a minimum (or zero) on axis in the far field, the sensitivity of Z-scan measurements can be increased by up to one order of magnitude. The linear propagation module integrated with the nonlinear beam propagation codes allows the simulation of typical experiments such as Z-scan and optical limiting experiments. We have used these codes to model the performance of optical limiters. We study two of the most promising limiter designs: the monolithic self-protective semiconductor limiter (MONOPOL) and a multi-cell tandem limiter based on a liquid solution of reverse saturable absorbing organic dye.
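
    The workhorse of this kind of modeling is split-step Fourier beam propagation: diffraction is applied in the spatial-frequency domain and the nonlinear phase in the spatial domain, alternating over small steps dz. The sketch below shows the scheme for a single instantaneous Kerr nonlinearity only; it is a generic BPM illustration, not the NLO_BPM package (which also handles two-photon absorption, excited-state, free-carrier and thermal effects), and the parameter values are arbitrary but self-consistent placeholders.

```python
import numpy as np

def split_step_kerr(field, dz, steps, wavelength, dx, n0, n2, intensity_scale):
    """Scalar 2-D split-step Fourier propagation with n = n0 + n2*I.
    'field' is the complex beam profile sampled with spacing dx (m)."""
    k0 = 2 * np.pi / wavelength
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fx)
    diffraction = np.exp(-1j * (kx ** 2 + ky ** 2) * dz / (2 * k0 * n0))
    for _ in range(steps):
        field = np.fft.ifft2(np.fft.fft2(field) * diffraction)  # linear half
        intensity = intensity_scale * np.abs(field) ** 2         # W/m^2 proxy
        field *= np.exp(1j * k0 * n2 * intensity * dz)           # Kerr phase
    return field

# Gaussian input beam on a 256 x 256 grid (placeholder values).
x = np.linspace(-2e-3, 2e-3, 256)
xx, yy = np.meshgrid(x, x)
beam = np.exp(-(xx ** 2 + yy ** 2) / (0.5e-3) ** 2).astype(complex)
out = split_step_kerr(beam, dz=1e-4, steps=50, wavelength=532e-9,
                      dx=x[1] - x[0], n0=1.5, n2=3e-20, intensity_scale=1e12)
```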

  10. Projector-Based Augmented Reality for Quality Inspection of Scanned Objects

    NASA Astrophysics Data System (ADS)

    Kern, J.; Weinmann, M.; Wursthorn, S.

    2017-09-01

    After scanning or reconstructing the geometry of objects, we need to inspect the result of our work. Are there any parts missing? Is every detail covered in the desired quality? We typically do this by looking at the resulting point clouds or meshes of our objects on-screen. What, if we could see the information directly visualized on the object itself? Augmented reality is the generic term for bringing virtual information into our real environment. In our paper, we show how we can project any 3D information like thematic visualizations or specific monitoring information with reference to our object onto the object's surface itself, thus augmenting it with additional information. For small objects that could for instance be scanned in a laboratory, we propose a low-cost method involving a projector-camera system to solve this task. The user only needs a calibration board with coded fiducial markers to calibrate the system and to estimate the projector's pose later on for projecting textures with information onto the object's surface. Changes within the projected 3D information or of the projector's pose will be applied in real-time. Our results clearly reveal that such a simple setup will deliver a good quality of the augmented information.

  11. IET. Periscope shielding and installation details. Shows range of scanning ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Periscope shielding and installation details. Shows range of scanning head, removable concrete cap, concrete shielding. Ralph M. Parsons 902-4-ANP-620-A 324. Date: February 1954. Approved by INEEL Classification Office for public release. INEEL Index code no. 035-0620-00-693-106909 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  12. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, K; Weber, U; Simeonov, Y

    Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility consisting of the beam tube, two quadrupole magnets and a beam monitor system was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and a field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte-Carlo code FLUKA and the transport of 80 MeV/u C12-ions through this ion-optic system was calculated by using a user-routine to implement magnetic fields. The fluence along the beam-axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte-Carlo code FLUKA. The implementation via a user-routine was successful. Analyzing the fluence pattern along the beam axis, the characteristic focusing and de-focusing effects of the quadrupole magnets could be reproduced. Furthermore, the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte-Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system.
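
    The matrices mentioned in the Methods are the standard linear transfer matrices of drifts and quadrupoles. The sketch below composes them for a hypothetical doublet in NumPy (the study itself used Matlab); all lengths, strengths and input coordinates are placeholders, not the optimized values of the paper.

```python
import numpy as np

def drift(length):
    """Transfer matrix of a field-free drift of given length (m)."""
    return np.array([[1.0, length], [0.0, 1.0]])

def quad(k, length, focusing=True):
    """Thick-lens transfer matrix of a quadrupole of strength k (1/m^2)
    in its focusing or defocusing plane."""
    sk = np.sqrt(k)
    phi = sk * length
    if focusing:
        return np.array([[np.cos(phi), np.sin(phi) / sk],
                         [-sk * np.sin(phi), np.cos(phi)]])
    return np.array([[np.cosh(phi), np.sinh(phi) / sk],
                     [sk * np.sinh(phi), np.cosh(phi)]])

# Hypothetical layout traversed as: drift 0.5 m, Q1 (focusing in x), drift
# 0.4 m, Q2 (defocusing in x), drift 1.0 m to the iso-center. Matrices compose
# right to left, so the first element traversed sits rightmost in the product.
m_x = drift(1.0) @ quad(2.5, 0.3, focusing=False) @ drift(0.4) \
      @ quad(3.0, 0.3, focusing=True) @ drift(0.5)
x_in = np.array([1e-3, 0.5e-3])   # initial offset (m) and angle (rad)
x_iso = m_x @ x_in                # position and angle at the iso-center
```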

  13. Dynamic electronic collimation method for 3-D catheter tracking on a scanning-beam digital x-ray system

    PubMed Central

    Dunkerley, David A. P.; Slagowski, Jordan M.; Funk, Tobias; Speidel, Michael A.

    2017-01-01

    Abstract. Scanning-beam digital x-ray (SBDX) is an inverse geometry x-ray fluoroscopy system capable of tomosynthesis-based 3-D catheter tracking. This work proposes a method of dose-reduced 3-D catheter tracking using dynamic electronic collimation (DEC) of the SBDX scanning x-ray tube. This is achieved through the selective deactivation of focal spot positions not needed for the catheter tracking task. The technique was retrospectively evaluated with SBDX detector data recorded during a phantom study. DEC imaging of a catheter tip at isocenter required 340 active focal spots per frame versus 4473 spots in full field-of-view (FOV) mode. The dose-area product (DAP) and peak skin dose (PSD) for DEC versus full FOV scanning were calculated using an SBDX Monte Carlo simulation code. The average DAP was reduced to 7.8% of the full FOV value, consistent with the relative number of active focal spots (7.6%). For image sequences with a moving catheter, PSD was 33.6% to 34.8% of the full FOV value. The root-mean-squared-deviation between DEC-based 3-D tracking coordinates and full FOV 3-D tracking coordinates was less than 0.1 mm. The 3-D distance between the tracked tip and the sheath centerline averaged 0.75 mm. DEC is a feasible method for dose reduction during SBDX 3-D catheter tracking. PMID:28439521
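
    The dose saving comes from activating only the focal-spot positions whose rays still cover the tracked catheter tip. The toy sketch below selects active spots within a margin of the last tracked position on a uniform source lattice; the geometry, grid dimensions and margin are illustrative placeholders, not the SBDX collimation rules of the paper.

```python
import numpy as np

def select_active_spots(spot_grid_xy, catheter_xy, margin_mm=15.0):
    """Dynamic electronic collimation sketch: mark as active only the source
    focal-spot positions within 'margin_mm' of the tracked catheter tip."""
    d = np.linalg.norm(spot_grid_xy - np.asarray(catheter_xy), axis=1)
    return d <= margin_mm

# Scanning-source grid approximated by a uniform 2-D lattice (dimensions arbitrary).
xs, ys = np.meshgrid(np.linspace(-100, 100, 71), np.linspace(-90, 90, 63))
grid = np.column_stack([xs.ravel(), ys.ravel()])
active = select_active_spots(grid, catheter_xy=(5.0, -12.0))
dose_area_fraction = active.sum() / active.size   # rough proxy for DAP scaling
```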

  14. Characterizing a proton beam scanning system for Monte Carlo dose calculation in patients

    NASA Astrophysics Data System (ADS)

    Grassberger, C.; Lomax, Anthony; Paganetti, H.

    2015-01-01

    The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of active scanning patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (<0.6 MeV for 230 MeV protons), whose contribution to skin dose is negligible. However, there is significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck and lung) to give an example how this approach can be employed to investigate site-specific discrepancies between treatment planning system and Monte Carlo simulations.
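
    A parameterized phase space of the kind described above can be sampled directly when full treatment-head simulation is not needed. The sketch below draws protons with Gaussian energy spread, spot size and divergence; the distributions are assumed uncorrelated for brevity (real beam models include position-angle correlations), and every number is a placeholder rather than MGH or PSI beam-line data.

```python
import numpy as np

def sample_phase_space(n, nominal_energy, energy_sigma, spot_sigma_mm,
                       divergence_mrad, rng=None):
    """Sample n protons at treatment-head exit from a simple parameterized
    phase space: columns are (energy MeV, x mm, y mm, x' rad, y' rad)."""
    rng = rng or np.random.default_rng(1)
    energy = rng.normal(nominal_energy, energy_sigma, n)
    x = rng.normal(0.0, spot_sigma_mm, n)
    y = rng.normal(0.0, spot_sigma_mm, n)
    xp = rng.normal(0.0, divergence_mrad * 1e-3, n)
    yp = rng.normal(0.0, divergence_mrad * 1e-3, n)
    return np.column_stack([energy, x, y, xp, yp])

protons = sample_phase_space(100_000, nominal_energy=150.0, energy_sigma=0.9,
                             spot_sigma_mm=4.0, divergence_mrad=3.0)
```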

  15. Characterizing a Proton Beam Scanning System for Monte Carlo Dose Calculation in Patients

    PubMed Central

    Grassberger, C; Lomax, Tony; Paganetti, H

    2015-01-01

    The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of active scanning patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low–energy electrons (<0.6MeV for 230MeV protons), whose contribution to skin dose is negligible. However, there is significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck and lung) to give an example how this approach can be employed to investigate site-specific discrepancies between treatment planning system and Monte Carlo simulations. PMID:25549079

  16. Cognitive, Social, and Literacy Competencies: The Chelsea Bank Simulation Project. Year One: Final Report. [Volume 2]: Appendices.

    ERIC Educational Resources Information Center

    Duffy, Thomas; And Others

    This supplementary volume presents appendixes A-E associated with a 1-year study which determined what secondary school students were doing as they engaged in the Chelsea Bank computer software simulation activities. Appendixes present the SCANS Analysis Coding Sheet; coding problem analysis of 50 video segments; student and teacher interview…

  17. Use of the QR Reader to Provide Real-Time Evaluation of Residents' Skills Following Surgical Procedures.

    PubMed

    Reynolds, Kellin; Barnhill, Danny; Sias, Jamie; Young, Amy; Polite, Florencia Greer

    2014-12-01

    A portable electronic method of providing instructional feedback and recording an evaluation of resident competency immediately following surgical procedures has not previously been documented in obstetrics and gynecology. This report presents a unique electronic format that documents resident competency and encourages verbal communication between faculty and residents immediately following operative procedures. The Microsoft Tag system and SurveyMonkey platform were linked by a 2-D QR code using Microsoft QR code generator. Each resident was given a unique code (TAG) embedded onto an ID card. An evaluation form was attached to each resident's file in SurveyMonkey. Postoperatively, supervising faculty scanned the resident's TAG with a smartphone and completed the brief evaluation using the phone's screen. The evaluation was reviewed with the resident and automatically submitted to the resident's educational file. The evaluation system was quickly accepted by residents and faculty. Of 43 residents and faculty in the study, 38 (88%) responded to a survey 8 weeks after institution of the electronic evaluation system. Thirty (79%) of the 38 indicated it was superior to the previously used handwritten format. The electronic system demonstrated improved utilization compared with paper evaluations, with a mean of 23 electronic evaluations submitted per resident during a 6-month period versus 14 paper assessments per resident during an earlier period of 6 months. This streamlined portable electronic evaluation is an effective tool for direct, formative feedback for residents, and it creates a longitudinal record of resident progress. Satisfaction with, and use of, this evaluation system was high.

  18. Use of the QR Reader to Provide Real-Time Evaluation of Residents' Skills Following Surgical Procedures

    PubMed Central

    Reynolds, Kellin; Barnhill, Danny; Sias, Jamie; Young, Amy; Polite, Florencia Greer

    2014-01-01

    Background A portable electronic method of providing instructional feedback and recording an evaluation of resident competency immediately following surgical procedures has not previously been documented in obstetrics and gynecology. Objective This report presents a unique electronic format that documents resident competency and encourages verbal communication between faculty and residents immediately following operative procedures. Methods The Microsoft Tag system and SurveyMonkey platform were linked by a 2-D QR code using Microsoft QR code generator. Each resident was given a unique code (TAG) embedded onto an ID card. An evaluation form was attached to each resident's file in SurveyMonkey. Postoperatively, supervising faculty scanned the resident's TAG with a smartphone and completed the brief evaluation using the phone's screen. The evaluation was reviewed with the resident and automatically submitted to the resident's educational file. Results The evaluation system was quickly accepted by residents and faculty. Of 43 residents and faculty in the study, 38 (88%) responded to a survey 8 weeks after institution of the electronic evaluation system. Thirty (79%) of the 38 indicated it was superior to the previously used handwritten format. The electronic system demonstrated improved utilization compared with paper evaluations, with a mean of 23 electronic evaluations submitted per resident during a 6-month period versus 14 paper assessments per resident during an earlier period of 6 months. Conclusions This streamlined portable electronic evaluation is an effective tool for direct, formative feedback for residents, and it creates a longitudinal record of resident progress. Satisfaction with, and use of, this evaluation system was high. PMID:26140128
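
    The report builds its TAGs with the Microsoft Tag system and SurveyMonkey; as a rough illustration of the general idea, the sketch below encodes a per-resident evaluation URL into a QR image using the open-source Python `qrcode` package. The package choice, URL scheme, and file naming are assumptions, not the tooling used in the study.

```python
# A minimal sketch using the open-source "qrcode" package (pip install qrcode[pil]);
# the package, the URL scheme, and the file names are assumptions, not the
# Microsoft Tag / SurveyMonkey pipeline described in the report.
import qrcode

def make_resident_tag(resident_id: str, base_url: str) -> None:
    """Encode a per-resident evaluation URL as a QR image saved to disk."""
    url = f"{base_url}?resident={resident_id}"   # hypothetical survey link
    img = qrcode.make(url)
    img.save(f"tag_{resident_id}.png")

make_resident_tag("R042", "https://example.com/eval")
```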

  19. LabVIEW control software for scanning micro-beam X-ray fluorescence spectrometer.

    PubMed

    Wrobel, Pawel; Czyzycki, Mateusz; Furman, Leszek; Kolasinski, Krzysztof; Lankosz, Marek; Mrenca, Alina; Samek, Lucyna; Wegrzynek, Dariusz

    2012-05-15

    A confocal micro-beam X-ray fluorescence microscope was constructed. The system was assembled from commercially available components - a low power X-ray tube source, polycapillary X-ray optics and silicon drift detector - controlled by in-house developed LabVIEW software. A video camera coupled to an optical microscope was utilized to display the area excited by the X-ray beam. The camera image calibration and scan area definition software were also based entirely on LabVIEW code. Presently, the main area of application of the newly constructed spectrometer is 2-dimensional mapping of element distribution in environmental, biological and geological samples with micrometer spatial resolution. The hardware and the developed software can already handle volumetric 3-D confocal scans. In this work, a front panel graphical user interface as well as communication protocols between hardware components were described. Two applications of the spectrometer, to homogeneity testing of titanium layers and to imaging of various types of grains in air particulate matter collected on membrane filters, were presented. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ming; Yu, Hengyong, E-mail: hengyong-yu@ieee.org

    2015-10-15

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++, which are linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using their method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  1. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation.

    PubMed

    Chen, Ming; Yu, Hengyong

    2015-10-01

    This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++, which are linked via a MEX interface. A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using their method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.
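
    For readers unfamiliar with the filtered backprojection (FBP) building block that the triple-source algorithm extends, the sketch below runs a single-source, parallel-beam FBP on a test phantom with scikit-image. It illustrates only the generic FBP step, not the truncated triple-source fan-beam or Feldkamp-type algorithms derived in the paper.

```python
# Illustrative only: a single-source, parallel-beam filtered backprojection
# with scikit-image, showing the FBP building block that the paper extends
# to truncated triple-source fan-beam and cone-beam geometries.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()                      # 400x400 test object
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)              # forward projection
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")

error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
print(f"RMS reconstruction error: {error:.4f}")
```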

  2. MO-F-16A-01: Implementation of MPPG TPS Verification Tests On Various Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smilowitz, J; Bredfeldt, J; Geurts, M

    2014-06-15

    Purpose: To demonstrate the implementation of the Medical Physics Practice Guideline (MPPG) for dose calculation and beam parameter verification of treatment planning systems (TPS). Methods: We implemented the draft TPS MPPG for three linacs: Varian Trilogy, TomoHDA and Elekta Infinity. Static and modulated test plans were created. The static fields are different from those used in commissioning. Data was collected using ion chambers and diodes in a scanning water tank, a Delta4 phantom and a custom phantom. MATLAB and Microsoft Excel were used to create analysis tools to compare reference DICOM dose with scan data. This custom code allowed for the interpolation, registration and gamma analysis of arbitrary dose profiles. It will be provided as open source code. IMRT fields were validated with Delta4 registration and comparison tools. The time for each task was recorded. Results: The tests confirmed the strengths, and revealed some limitations, of our TPS. The agreement between calculated and measured dose was reported for all beams. For static fields, percent depth dose and profiles were analyzed with criteria in the draft MPPG. The results reveal areas of slight mismatch with the model (MLC leaf penumbra, buildup region). For TomoTherapy, the IMRT plan 2%/2 mm gamma analysis revealed poorest agreement in the low dose regions. For one static test plan for all 10 MV Trilogy photon beams, the plan generation, scan queue creation, data collection, data analysis and report took 2 hours, excluding tank setup. Conclusions: We have demonstrated the implementation feasibility of the TPS MPPG. This exercise generated an open source tool for dose comparisons between scan data and DICOM dose data. An easily reproducible and efficient infrastructure with streamlined data collection was created for repeatable robust testing of the TPS. The tests revealed minor discrepancies in our models and areas for improvement that are being investigated.
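
    The custom analysis described above compares measured and calculated dose profiles via gamma analysis. The sketch below is a textbook-style 1D global gamma index (e.g. 2%/2 mm) on equally sampled profiles; it is illustrative only and is not the group's open-source MATLAB/Excel tool.

```python
import numpy as np

def gamma_1d(positions_mm, dose_ref, dose_eval, dose_tol=0.02, dist_tol_mm=2.0):
    """Global 1D gamma index (e.g. 2%/2 mm) between a reference and an
    evaluated dose profile sampled at the same positions.

    A textbook-style sketch, not the open-source tool described in the abstract."""
    dose_ref = np.asarray(dose_ref, float)
    dose_eval = np.asarray(dose_eval, float)
    norm = dose_ref.max()                      # global normalization
    gamma = np.empty_like(dose_ref)
    for i, (x_r, d_r) in enumerate(zip(positions_mm, dose_ref)):
        dist2 = ((positions_mm - x_r) / dist_tol_mm) ** 2
        dose2 = ((dose_eval - d_r) / (dose_tol * norm)) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

x = np.linspace(-50, 50, 201)                  # mm
ref = np.exp(-(x / 20.0) ** 2)                 # toy reference profile
meas = np.exp(-((x - 0.5) / 20.5) ** 2)        # slightly shifted/broadened
g = gamma_1d(x, ref, meas)
print(f"gamma pass rate (<=1): {np.mean(g <= 1.0):.1%}")
```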

  3. Method for dose-reduced 3D catheter tracking on a scanning-beam digital x-ray system using dynamic electronic collimation

    NASA Astrophysics Data System (ADS)

    Dunkerley, David A. P.; Funk, Tobias; Speidel, Michael A.

    2016-03-01

    Scanning-beam digital x-ray (SBDX) is an inverse geometry x-ray fluoroscopy system capable of tomosynthesis-based 3D catheter tracking. This work proposes a method of dose-reduced 3D tracking using dynamic electronic collimation (DEC) of the SBDX scanning x-ray tube. Positions in the 2D focal spot array are selectively activated to create a region-of-interest (ROI) x-ray field around the tracked catheter. The ROI position is updated for each frame based on a motion vector calculated from the two most recent 3D tracking results. The technique was evaluated with SBDX data acquired as a catheter tip inside a chest phantom was pulled along a 3D trajectory. DEC scans were retrospectively generated from the detector images stored for each focal spot position. DEC imaging of a catheter tip in a volume measuring 11.4 cm across at isocenter required 340 active focal spots per frame, versus 4473 spots in full-FOV mode. The dose-area-product (DAP) and peak skin dose (PSD) for DEC versus full field-of-view (FOV) scanning were calculated using an SBDX Monte Carlo simulation code. DAP was reduced to 7.4% to 8.4% of the full-FOV value, consistent with the relative number of active focal spots (7.6%). For image sequences with a moving catheter, PSD was 33.6% to 34.8% of the full-FOV value. The root-mean-squared-deviation between DEC-based 3D tracking coordinates and full-FOV 3D tracking coordinates was less than 0.1 mm. The 3D distance between the tracked tip and the sheath centerline averaged 0.75 mm. Dynamic electronic collimation can reduce dose with minimal change in tracking performance.
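
    The ROI update rule described above (extrapolating from the two most recent tracking results) can be written in a few lines. The sketch below is a simplified illustration; variable names and units are assumptions.

```python
import numpy as np

def next_roi_center(track_prev, track_curr):
    """Predict the next ROI center from the two most recent 3D tracking
    results by linear extrapolation of the per-frame motion vector.

    A simplified sketch of the dynamic-electronic-collimation update rule
    described in the abstract; names and units are illustrative."""
    track_prev = np.asarray(track_prev, float)
    track_curr = np.asarray(track_curr, float)
    motion = track_curr - track_prev          # per-frame motion vector (mm)
    return track_curr + motion                # extrapolated center for next frame

# Example: catheter tip moved 1 mm in x and 0.5 mm in z between frames.
print(next_roi_center([10.0, 5.0, 20.0], [11.0, 5.0, 20.5]))  # -> [12.  5. 21.]
```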

  4. Real-time photoacoustic and ultrasound dual-modality imaging system facilitated with graphics processing unit and code parallel optimization.

    PubMed

    Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun

    2013-08-01

    Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state of the art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously, at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo were achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
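
    The reconstruction step being accelerated is a delay-and-sum back-projection. The sketch below is a plain-CPU NumPy reference of that algorithm on a toy ring-array geometry; array shapes, the speed of sound, and the synthetic data are assumptions, and no GPU parallelization is shown.

```python
import numpy as np

def backproject_pa(signals, fs, sensor_xy, grid_xy, c=1500.0):
    """Naive delay-and-sum back-projection of photoacoustic channel data.

    signals:   (n_sensors, n_samples) RF data
    sensor_xy: (n_sensors, 2) sensor positions in meters
    grid_xy:   (n_pixels, 2) reconstruction grid positions in meters
    A CPU reference sketch of the BP step the paper accelerates on a GPU."""
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for s in range(n_sensors):
        # Time of flight from every pixel to this sensor.
        dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)
        idx = np.clip(np.round(dist / c * fs).astype(int), 0, n_samples - 1)
        image += signals[s, idx]
    return image / n_sensors

# Tiny synthetic example: 32 sensors on a 20 mm ring, point absorber at origin.
fs, c = 40e6, 1500.0
angles = np.linspace(0, 2 * np.pi, 32, endpoint=False)
sensors = 0.02 * np.column_stack([np.cos(angles), np.sin(angles)])
t = np.arange(2048) / fs
signals = np.array([np.exp(-((t - 0.02 / c) * fs / 5) ** 2) for _ in sensors])
grid = np.array([[0.0, 0.0], [0.005, 0.0]])
print(backproject_pa(signals, fs, sensors, grid))
```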

  5. Application of a color scanner for 60Co high dose rate brachytherapy dosimetry with EBT radiochromic film

    PubMed Central

    Ghorbani, Mahdi; Toossi, Mohammad Taghi Bahreyni; Mowlavi, Ali Asghar; Roodi, Shahram Bayani; Meigooni, Ali Soleimani

    2012-01-01

    Background. The aim of this study is to evaluate the performance of a color scanner as a radiochromic film reader in two dimensional dosimetry around a high dose rate brachytherapy source. Materials and methods. A Microtek ScanMaker 1000XL film scanner was utilized for the measurement of dose distribution around a high dose rate GZP6 60Co brachytherapy source with GafChromic® EBT radiochromic films. In these investigations, the non-uniformity of the film and scanner response, combined, as well as the film’s sensitivity to the scanner’s light source, was evaluated using multiple samples of films, prior to the source dosimetry. The results of these measurements were compared with the Monte Carlo simulated data using the MCNPX code. In addition, isodose curves acquired by radiochromic films and Monte Carlo simulation were compared with those provided by the GZP6 treatment planning system. Results. Scanning of samples of uniformly irradiated films demonstrated approximately 2.85% and 4.97% nonuniformity of the response, respectively in the longitudinal and transverse directions of the film. Our findings have also indicated that the film response is not affected by the exposure to the scanner’s light source, particularly in multiple scanning of film. The results of radiochromic film measurements are in good agreement with the Monte Carlo calculations (4%) and the corresponding dose values presented by the GZP6 treatment planning system (5%). Conclusions. The results of these investigations indicate that the Microtek ScanMaker 1000XL color scanner in conjunction with GafChromic EBT film is a reliable system for dosimetric evaluation of a high dose rate brachytherapy source. PMID:23411947

  6. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    NASA Astrophysics Data System (ADS)

    Deng, Xiaopeng

    2015-08-01

    A novel optical image encryption method based on real-valued coding and subtracting is proposed with the help of a quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, and then the QR code is encoded into two phase-only masks (POMs) by using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone-collected results show that the method is feasible and has strong tolerance to noise, phase difference and the ratio between the intensities of the two decryption light beams.

  7. Study of neoclassical effects on the pedestal structure in ELMy H-mode plasmas

    NASA Astrophysics Data System (ADS)

    Pankin, A. Y.; Bateman, G.; Kritz, A. H.; Rafiq, T.; Park, G. Y.; Ku, S.; Chang, C. S.; Snyder, P. B.

    2009-11-01

    The neoclassical effects on the H-mode pedestal structure are investigated in this study. First principles' kinetic simulations of the neoclassical pedestal dynamics are combined with the MHD stability conditions for triggering ELM crashes that limit the pedestal width and height in H-mode plasmas. The neoclassical kinetic XGC0 code [1] is used to produce systematic scans over plasma parameters including plasma current, elongation, and triangularity. As plasma profiles evolve, the MHD stability limits of these profiles are analyzed with the ideal MHD stability ELITE code [2]. The scalings of the pedestal width and height are presented as a function of the scanned plasma parameters. Simulations with the XGC0 code, which include coupled ion-electron dynamics, yield predictions for both ion and electron pedestal profiles. Differences in the electron and ion pedestal scalings are investigated. [1] C.S. Chang et al, Phys. Plasmas 11 (2004) 2649. [2] P.B. Snyder et al, Phys. Plasmas, 9 (2002) 2037.

  8. Automated Detection and Analysis of Interplanetary Shocks with Real-Time Application

    NASA Astrophysics Data System (ADS)

    Vorotnikov, V.; Smith, C. W.; Hu, Q.; Szabo, A.; Skoug, R. M.; Cohen, C. M.

    2006-12-01

    The ACE real-time data stream provides web-based nowcasting capabilities for solar wind conditions upstream of Earth. Our goal is to provide an automated code that finds and analyzes interplanetary shocks as they occur for possible real-time application to space weather nowcasting. Shock analysis algorithms based on the Rankine-Hugoniot jump conditions exist and are in widespread use today for the interactive analysis of interplanetary shocks, yielding parameters such as shock speed, propagation direction and shock strength in the form of compression ratios. Although these codes can be automated in a reasonable manner to yield solutions not far from those obtained by user-directed interactive analysis, event detection presents an added obstacle and the first step in a fully automated analysis. We present a fully automated Rankine-Hugoniot analysis code that can scan the ACE science data, find shock candidates, analyze the events, obtain solutions in good agreement with those derived from interactive applications, and dismiss false positive shock candidates on the basis of the conservation equations. The intent is to make this code available to NOAA for use in real-time space weather applications. The code has the added advantage of being able to scan spacecraft data sets to provide shock solutions for use outside real-time applications and can easily be applied to science-quality data sets from other missions. Use of the code for this purpose will also be explored.
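
    A minimal sketch of the candidate-detection step (not the full Rankine-Hugoniot fit) is shown below: it flags samples where windowed averages of density and flow speed both jump. Thresholds, window length, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def find_shock_candidates(time, density, speed, window=10,
                          min_compression=1.2, min_speed_jump=20.0):
    """Flag candidate interplanetary shocks as indices where the averaged
    plasma density and flow speed both jump across adjacent windows.

    A deliberately simplified screening step; the full analysis in the
    abstract fits the Rankine-Hugoniot jump conditions to reject false
    positives. Thresholds here are illustrative."""
    candidates = []
    for i in range(window, len(time) - window):
        up_n = np.mean(density[i - window:i])      # upstream averages
        dn_n = np.mean(density[i:i + window])      # downstream averages
        up_v = np.mean(speed[i - window:i])
        dn_v = np.mean(speed[i:i + window])
        if dn_n / up_n >= min_compression and dn_v - up_v >= min_speed_jump:
            candidates.append(i)
    return candidates

# Synthetic solar-wind series with a shock-like step at sample 200.
t = np.arange(400.0)
n = np.where(t < 200, 5.0, 9.0) + np.random.default_rng(0).normal(0, 0.2, 400)
v = np.where(t < 200, 400.0, 470.0) + np.random.default_rng(1).normal(0, 5, 400)
print(find_shock_candidates(t, n, v)[:3])
```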

  9. Fast Exact Search in Hamming Space With Multi-Index Hashing.

    PubMed

    Norouzi, Mohammad; Punjani, Ali; Fleet, David J

    2014-06-01

    There is growing interest in representing image data and feature descriptors using compact binary codes for fast near neighbor search. Although binary codes are motivated by their use as direct indices (addresses) into a hash table, codes longer than 32 bits are not being used as such, as this was thought to be ineffective. We introduce a rigorous way to build multiple hash tables on binary code substrings that enables exact k-nearest neighbor search in Hamming space. The approach is storage efficient and straightforward to implement. Theoretical analysis shows that the algorithm exhibits sub-linear run-time behavior for uniformly distributed codes. Empirical results show dramatic speedups over a linear scan baseline for datasets of up to one billion codes of 64, 128, or 256 bits.
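
    The core idea, building one hash table per disjoint substring of the binary code so that any r-neighbor must collide in at least one table within radius r/m (pigeonhole), can be sketched in a few lines. The code below is a toy Python version for small code sets, not the authors' optimized implementation.

```python
from collections import defaultdict
from itertools import combinations

def split_code(code: int, bits: int, m: int):
    """Split a `bits`-bit integer code into m disjoint substrings."""
    chunk = bits // m
    mask = (1 << chunk) - 1
    return tuple((code >> (i * chunk)) & mask for i in range(m))

def build_tables(codes, bits, m):
    """One hash table per substring, mapping substring value -> code ids."""
    tables = [defaultdict(list) for _ in range(m)]
    for idx, c in enumerate(codes):
        for t, sub in zip(tables, split_code(c, bits, m)):
            t[sub].append(idx)
    return tables

def query(q, codes, tables, bits, m, radius):
    """r-neighbor search: any code within Hamming radius r of q matches at
    least one substring of q within radius r // m (pigeonhole), so only
    those buckets are probed and then verified exactly."""
    sub_r = radius // m
    chunk = bits // m
    candidates = set()
    for t, sub in zip(tables, split_code(q, bits, m)):
        # Enumerate all substring values within sub_r bit flips of `sub`.
        for k in range(sub_r + 1):
            for flips in combinations(range(chunk), k):
                probe = sub
                for b in flips:
                    probe ^= 1 << b
                candidates.update(t.get(probe, ()))
    return [i for i in candidates if bin(codes[i] ^ q).count("1") <= radius]

# Toy example with 16-bit codes split into m = 4 substrings.
codes = [0b1010101010101010, 0b1010101010101011, 0b0101010101010101]
tables = build_tables(codes, 16, 4)
print(query(0b1010101010101010, codes, tables, 16, 4, radius=2))  # ids 0 and 1
```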

  10. Scan-Based Implementation of JPEG 2000 Extensions

    NASA Technical Reports Server (NTRS)

    Rountree, Janet C.; Webb, Brian N.; Flohr, Thomas J.; Marcellin, Michael W.

    2001-01-01

    JPEG 2000 Part 2 (Extensions) contains a number of technologies that are of potential interest in remote sensing applications. These include arbitrary wavelet transforms, techniques to limit boundary artifacts in tiles, multiple component transforms, and trellis-coded quantization (TCQ). We are investigating the addition of these features to the low-memory (scan-based) implementation of JPEG 2000 Part 1. A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version. A proposed amendment to JPEG 2000 Part 2 will effect the syntax changes required to make scan-based TCQ compatible with the standard.

  11. Optical Testing Using Portable Laser Coordinate Measuring Instruments

    NASA Technical Reports Server (NTRS)

    Khreishi, Manal; Ohl, Raymond G.; Mclean, Kyle F.; Hadjimichael, Theodore J.; Hayden, Joseph E.

    2017-01-01

    High precision, portable coordinate measuring instruments (CMI) such as laser radars (LR) and laser trackers (LT) have been used for optical system alignment and integration. The LR's ability to perform a non-contact scan of surfaces was previously utilized to characterize large spherical and aspheric mirrors. In this paper, we explore the use of a CMI as an accurate, fast, robust, and non-contact tool for prescription characterization of powered optical surfaces. Using Nikon's MV-224/350 LR and Leica's Absolute Tracker AT401/402 instruments, proof of concept measurements were performed to characterize a variety of optical components by measuring the actual and apparent, or equivalently the "direct and through" (D&T), coordinates of calibrated metrology targets. Custom macros in metrology software and other data reduction code were developed to compute surface-ray intercepts and surface slopes from the D&T shots. The calculated data is fit to an aspheric surface formula to obtain the optimum prescription. The results were compared to the nominal parameters and were crosschecked using LR scans or other approaches. We discuss potential applications across the fields of optical component fabrication and system alignment and testing.

  12. Optical Testing Using Portable Laser Coordinate Measuring Instruments

    NASA Technical Reports Server (NTRS)

    Khreishi, M.; Ohl, R.; Mclean, K.; Hadjimichael, T.; Hayden, J.

    2017-01-01

    High precision, portable coordinate measuring instruments (CMI) such as laser radars (LR) and laser trackers (LT) have been used for optical system alignment and integration. The LR's ability to perform a non-contact scan of surfaces was previously utilized to characterize large spherical and aspheric mirrors. In this paper, we explore the use of a CMI as an accurate, fast, robust, and non-contact tool for prescription characterization of powered optical surfaces. Using Nikon's MV-224/350 LR and Leica's Absolute Tracker AT401/402 instruments, proof of concept measurements were performed to characterize a variety of optical components by measuring the actual and apparent, or equivalently the "direct and through" (D&T), coordinates of calibrated metrology targets. Custom macros in metrology software and other data reduction code were developed to compute surface-ray intercepts and surface slopes from the D&T shots. The calculated data is fit to an aspheric surface formula to obtain the optimum prescription. The results were compared to the nominal parameters and were crosschecked using LR scans or other approaches. We discuss potential applications across the fields of optical component fabrication and system alignment and testing.
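
    The prescription-fitting step can be illustrated with a generic even-asphere sag model fit by nonlinear least squares. The sketch below uses synthetic surface points and SciPy's curve_fit; the sag formula, noise level, and parameter values are assumptions, not the authors' metrology macros.

```python
import numpy as np
from scipy.optimize import curve_fit

def aspheric_sag(r, R, k, a4, a6):
    """Even aspheric sag: conic term plus 4th- and 6th-order coefficients."""
    return (r**2 / (R * (1 + np.sqrt(1 - (1 + k) * r**2 / R**2)))
            + a4 * r**4 + a6 * r**6)

# Synthetic "measured" surface points, e.g. derived from direct-and-through
# target coordinates; R, k and the polynomial terms below are made-up values.
r = np.linspace(0, 40.0, 80)                    # radial coordinate (mm)
z = aspheric_sag(r, 200.0, -1.1, 1e-8, -2e-13)
z += np.random.default_rng(3).normal(0, 1e-4, r.size)   # ~0.1 um noise

popt, _ = curve_fit(aspheric_sag, r, z, p0=[150.0, -1.0, 0.0, 0.0])
print("fitted R, k, A4, A6:", popt)
```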

  13. SCaN Network Ground Station Receiver Performance for Future Service Support

    NASA Technical Reports Server (NTRS)

    Estabrook, Polly; Lee, Dennis; Cheng, Michael; Lau, Chi-Wung

    2012-01-01

    Objectives: Examine the impact of providing the newly standardized CCSDS Low Density Parity Check (LDPC) codes to the SCaN return data service on the SCaN SN and DSN ground station receivers: SN Current Receiver: Integrated Receiver (IR). DSN Current Receiver: Downlink Telemetry and Tracking (DTT) Receiver. Early Commercial-Off-The-Shelf (COTS) prototype of the SN User Service Subsystem Component Replacement (USS CR) Narrow Band Receiver. Motivate discussion of general issues of ground station hardware design to enable simple and cheap modifications for support of future services.

  14. Deconstructing processing speed deficits in schizophrenia: application of a parametric digit symbol coding test.

    PubMed

    Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C

    2010-05-01

    Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  15. Digital Equivalent Data System for XRF Labeling of Objects

    NASA Technical Reports Server (NTRS)

    Schramm, Harry F.; Kaiser, Bruce

    2005-01-01

    A digital equivalent data system (DEDS) is a system for identifying objects by means of the x-ray fluorescence (XRF) spectra of labeling elements that are encased in or deposited on the objects. As such, a DEDS is a revolutionary new major subsystem of an XRF system. A DEDS embodies the means for converting the spectral data output of an XRF scanner to an ASCII alphanumeric or barcode label that can be used to identify (or verify the assumed or apparent identity of) an XRF-scanned object. A typical XRF spectrum of interest contains peaks at photon energies associated with specific elements on the Periodic Table. The height of each spectral peak above the local background spectral intensity is proportional to the relative abundance of the corresponding element. Alphanumeric values are assigned to the relative abundances of the elements. Hence, if an object contained labeling elements in suitably chosen proportions, an alphanumeric representation of the object could be extracted from its XRF spectrum. The mixture of labeling elements, and the scheme for reading the XRF spectrum, would be compatible with one of the labeling conventions now used for bar codes and binary matrix patterns (essentially, two-dimensional bar codes that resemble checkerboards). A further benefit of such compatibility is that it would enable the conversion of the XRF spectral output to a bar- or matrix-coded label, if needed. In short, a process previously used only for material composition analysis has been reapplied to the world of identification. This new level of verification is now being used for "authentication."
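
    A toy version of the DEDS mapping, quantizing the relative abundances of a few labeling elements into digits and concatenating them into an identifier, is sketched below. The element set, the 0-9 quantization, and the label layout are assumptions for illustration only.

```python
# A toy sketch of the DEDS idea: quantize the relative abundances of chosen
# labeling elements into digits and concatenate them into an alphanumeric ID.
# The element set, the 0-9 quantization, and the ID layout are assumptions.
import numpy as np

LABEL_ELEMENTS = ["Ti", "Zn", "Zr", "Sn"]        # hypothetical labeling elements

def spectrum_to_label(peak_heights, levels=10):
    """Map background-subtracted peak heights to one digit per element."""
    heights = np.asarray(peak_heights, float)
    rel = heights / heights.max()                 # relative abundances, 0..1
    digits = np.minimum((rel * levels).astype(int), levels - 1)
    return "".join(str(d) for d in digits)

# Example: four peak heights read from an XRF spectrum.
print(spectrum_to_label([120.0, 480.0, 60.0, 300.0]))   # -> "2916"
```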

  16. DROP: Detecting Return-Oriented Programming Malicious Code

    NASA Astrophysics Data System (ADS)

    Chen, Ping; Xiao, Hai; Shen, Xiaobin; Yin, Xinchun; Mao, Bing; Xie, Li

    Return-Oriented Programming (ROP) is a new technique that helps the attacker construct malicious code mounted on x86/SPARC executables without any function call at all. Such a technique means the ROP malicious code contains no injected instructions of its own, which is different from existing attacks. Moreover, it hides the malicious code in benign code. Thus, it circumvents the approaches that prevent control flow diversion outside legitimate regions (such as W ⊕ X) and most malicious code scanning techniques (such as anti-virus scanners). However, ROP has its own intrinsic features which differ from normal program design: (1) it uses short instruction sequences ending in "ret", called gadgets, and (2) it executes the gadgets contiguously in specific memory space, such as the standard GNU libc. Based on the features of the ROP malicious code, in this paper, we present a tool, DROP, which is focused on dynamically detecting ROP malicious code. Preliminary experimental results show that DROP can efficiently detect ROP malicious code, with no false positives or negatives.
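
    The two runtime features DROP relies on (short ret-terminated gadgets executed contiguously) can be illustrated with a toy heuristic over a simplified instruction trace, as sketched below. The trace format, thresholds, and logic are illustrative assumptions, not DROP's actual binary instrumentation.

```python
def looks_like_rop(trace, max_gadget_len=5, min_chain=3):
    """Flag a runtime instruction trace as ROP-like when it contains a chain
    of at least `min_chain` consecutive short sequences that each end in 'ret'.

    A toy illustration of the two DROP heuristics (short ret-terminated
    gadgets executed contiguously); trace format and thresholds are
    assumptions, not DROP's actual implementation."""
    chain = 0
    current = 0
    for insn in trace:
        current += 1
        if insn == "ret":
            if current <= max_gadget_len:
                chain += 1                 # one more short gadget in the chain
                if chain >= min_chain:
                    return True
            else:
                chain = 0                  # a long, normal function body
            current = 0
    return False

benign = ["push", "mov", "mov", "call", "mov", "add", "ret"] * 4
ropish = ["pop", "ret", "mov", "pop", "ret", "add", "ret", "pop", "ret"]
print(looks_like_rop(benign), looks_like_rop(ropish))   # False True
```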

  17. On-board processing concepts for future satellite communications systems

    NASA Technical Reports Server (NTRS)

    Brandon, W. T. (Editor); White, B. E. (Editor)

    1980-01-01

    The initial definition of on-board processing for an advanced satellite communications system to service domestic markets in the 1990's is discussed. An exemplar system with both RF on-board switching and demodulation/remodulation baseband processing is used to identify important issues related to system implementation, cost, and technology development. Analyses of spectrum-efficient modulation, coding, and system control techniques are summarized. Implementations for an RF switch and baseband processor are described. Among the major conclusions listed is the need for high gain satellites capable of handling tens of simultaneous beams for the efficient reuse of the 2.5 GHz 30/20 frequency band. Several scanning beams are recommended in addition to the fixed beams. Low power solid state 20 GHz GaAs FET power amplifiers in the 5W range and a general purpose digital baseband processor with gigahertz logic speeds and megabits of memory are also recommended.

  18. Security authentication with a three-dimensional optical phase code using random forest classifier: an overview

    NASA Astrophysics Data System (ADS)

    Markman, Adam; Carnicer, Artur; Javidi, Bahram

    2017-05-01

    We overview our recent work [1] on utilizing three-dimensional (3D) optical phase codes for object authentication using the random forest classifier. A simple 3D optical phase code (OPC) is generated by combining multiple diffusers and glass slides. This tag is then placed on a quick-response (QR) code, which is a barcode capable of storing information and can be scanned under non-uniform illumination conditions, rotation, and slight degradation. A coherent light source illuminates the OPC and the transmitted light is captured by a CCD to record the unique signature. Feature extraction on the signature is performed and inputted into a pre-trained random-forest classifier for authentication.
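
    As a rough illustration of the classification stage, the sketch below extracts simple summary statistics from synthetic intensity signatures and trains a scikit-learn random forest to separate genuine tags from counterfeits. The features, data, and parameters are stand-ins, not those used in the cited work.

```python
# A hedged sketch of the authentication step: extract simple statistical
# features from recorded intensity signatures and classify them with a
# random forest. The feature set and data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def features(signature):
    """Simple summary statistics of a 2D intensity signature."""
    return [signature.mean(), signature.std(), signature.max(),
            np.percentile(signature, 90), np.abs(np.fft.fft2(signature)).mean()]

# Synthetic "true tag" vs "counterfeit" signatures (64x64 speckle-like images).
true_tags = [rng.normal(1.0, 0.30, (64, 64)) for _ in range(200)]
fakes     = [rng.normal(1.0, 0.45, (64, 64)) for _ in range(200)]
X = np.array([features(s) for s in true_tags + fakes])
y = np.array([1] * 200 + [0] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
```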

  19. Imaging Analysis of the Hard X-Ray Telescope ProtoEXIST2 and New Techniques for High-Resolution Coded-Aperture Telescopes

    NASA Technical Reports Server (NTRS)

    Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.

    2016-01-01

    Wide-field (greater than or approximately equal to 100 square degrees) hard X-ray coded-aperture telescopes with high angular resolution (approximately 2 arcminutes or finer) will enable a wide range of time domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes to enable rapid follow-up studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling the systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or to model the systematic uncertainty on a timescale over which the model remains invariant. We introduce two new techniques to improve detection sensitivity, which are designed for, but not limited to, a high-resolution coded-aperture system: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics based probabilistic approach to evaluate the significance of source detection without background subtraction. We illustrate these new imaging analysis techniques for a high-resolution coded-aperture telescope using the data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight, and demonstrate the performance of the new techniques using our balloon flight data in comparison with a simulated ideal Poisson background.
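
    The Poisson-probabilistic detection idea can be illustrated by asking how likely a decoded sky-pixel count is under a background-only Poisson expectation, without subtracting the background first. The sketch below is a minimal version of that test; the real analysis additionally folds in the mask pattern and systematics.

```python
import numpy as np
from scipy.stats import poisson

def detection_significance(counts_on, expected_background):
    """P-value that a coded-aperture sky pixel with `counts_on` decoded counts
    is a background fluctuation, given a Poisson background expectation,
    without subtracting the background first.

    A minimal illustration of the Poisson-probabilistic approach mentioned in
    the abstract; the real analysis folds in the mask pattern and systematics."""
    # Survival function: P(X >= counts_on) for X ~ Poisson(expected_background)
    return poisson.sf(counts_on - 1, expected_background)

print(detection_significance(25, 12.0))   # ~7e-4: unlikely to be background
```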

  20. ScanRanker: Quality Assessment of Tandem Mass Spectra via Sequence Tagging

    PubMed Central

    Ma, Ze-Qiang; Chambers, Matthew C.; Ham, Amy-Joan L.; Cheek, Kristin L.; Whitwell, Corbin W.; Aerni, Hans-Rudolf; Schilling, Birgit; Miller, Aaron W.; Caprioli, Richard M.; Tabb, David L.

    2011-01-01

    In shotgun proteomics, protein identification by tandem mass spectrometry relies on bioinformatics tools. Despite recent improvements in identification algorithms, a significant number of high quality spectra remain unidentified for various reasons. Here we present ScanRanker, an open-source tool that evaluates the quality of tandem mass spectra via sequence tagging with reliable performance in data from different instruments. The superior performance of ScanRanker enables it not only to find unassigned high quality spectra that evade identification through database search, but also to select spectra for de novo sequencing and cross-linking analysis. In addition, we demonstrate that the distribution of ScanRanker scores predicts the richness of identifiable spectra among multiple LC-MS/MS runs in an experiment, and ScanRanker scores assist the process of peptide assignment validation to increase confident spectrum identifications. The source code and executable versions of ScanRanker are available from http://fenchurch.mc.vanderbilt.edu. PMID:21520941

  1. Novel modes and adaptive block scanning order for intra prediction in AV1

    NASA Astrophysics Data System (ADS)

    Hadar, Ofer; Shleifer, Ariel; Mukherjee, Debargha; Joshi, Urvang; Mazar, Itai; Yuzvinsky, Michael; Tavor, Nitzan; Itzhak, Nati; Birman, Raz

    2017-09-01

    The demand for streaming video content is on the rise and growing exponentially. Network bandwidth is very costly and therefore there is a constant effort to improve video compression rates and enable the sending of reduced data volumes while retaining quality of experience (QoE). One basic feature that utilizes the spatial correlation of pixels for video compression is Intra-Prediction, which determines the codec's compression efficiency. Intra prediction enables significant reduction of the Intra-Frame (I frame) size and, therefore, contributes to efficient exploitation of bandwidth. In this presentation, we propose new Intra-Prediction algorithms that improve the AV1 prediction model and provide better compression ratios. Two types of methods are considered: (1) a new scanning order method that maximizes spatial correlation in order to reduce prediction error; and (2) new Intra-Prediction modes implemented in AV1. Modern video coding standards, including the AV1 codec, utilize fixed scan orders in processing blocks during intra coding. The fixed scan orders typically result in residual blocks with high prediction error, mainly in blocks with edges. This means that the fixed scan orders cannot fully exploit the content-adaptive spatial correlations between adjacent blocks, thus the bitrate after compression tends to be large. To reduce the bitrate induced by inaccurate intra prediction, the proposed approach adaptively chooses the scanning order of blocks according to the criterion of first predicting blocks with the maximum number of surrounding, already Inter-Predicted blocks. Using the modified scanning order method and the new modes has reduced the MSE by up to five times when compared to the conventional TM mode / raster scan and up to two times when compared to the conventional CALIC mode / raster scan, depending on the image characteristics (which determine the percentage of blocks predicted with Inter-Prediction, which in turn impacts the efficiency of the new scanning method). For the same cases, the PSNR was shown to improve by up to 7.4 dB and up to 4 dB, respectively. The new modes have yielded a 5% improvement in BD-Rate over traditionally used modes when run on key frames, which is expected to yield a 1% overall improvement.
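
    A simplified version of the adaptive ordering criterion, repeatedly coding the block with the most already-coded neighbors, is sketched below. It illustrates the greedy idea only; block sizes, neighborhoods, and tie-breaking in the actual AV1 implementation are not reproduced.

```python
import numpy as np

def adaptive_scan_order(inter_coded):
    """Greedy intra-block scan order: repeatedly pick the not-yet-coded block
    with the most already-coded 4-neighbors (inter-predicted blocks count as
    coded from the start), breaking ties in raster order.

    A simplified illustration of the adaptive ordering idea in the paper,
    not the AV1 implementation."""
    h, w = inter_coded.shape
    coded = inter_coded.astype(bool)
    order = []
    remaining = [(r, c) for r in range(h) for c in range(w) if not coded[r, c]]
    while remaining:
        def coded_neighbors(rc):
            r, c = rc
            nbrs = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            return sum(coded[nr, nc] for nr, nc in nbrs
                       if 0 <= nr < h and 0 <= nc < w)
        best = max(remaining, key=coded_neighbors)  # first max = raster tie-break
        order.append(best)
        coded[best] = True
        remaining.remove(best)
    return order

# 3x3 grid of blocks; 1 = already inter-predicted, 0 = needs intra prediction.
grid = np.array([[1, 0, 1],
                 [0, 0, 0],
                 [1, 0, 1]])
print(adaptive_scan_order(grid))
```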

  2. Toroidal modeling of the n = 1 intrinsic error field correction experiments in EAST

    NASA Astrophysics Data System (ADS)

    Yang, Xu; Liu, Yueqiang; Sun, Youwen; Wang, Huihui; Gu, Shuai; Jia, Manni; Li, Li; Liu, Yue; Wang, Zhirui; Zhou, Lina

    2018-05-01

    The m/n = 2/1 resonant vacuum error field (EF) in the EAST tokamak experiments, inferred from the compass coil current amplitude and phase scan for mode locking, was found to depend on the parity between the upper and lower rows of the EF correction (EFC) coils (Wang et al 2016 Nucl. Fusion 56 066011). Here m and n are the poloidal and toroidal harmonic numbers in a torus, respectively. This experimental observation implies that the compass scan results cannot be simply interpreted as reflecting the true intrinsic EF. This work aims at understanding this puzzle, based on toroidal modeling of the EFC plasma discharge in EAST using the MARS-F code (Liu et al 2000 Phys. Plasmas 7 3681). By varying the amplitude and phase of the assumed n = 1 intrinsic vacuum EF with different poloidal spectra, and by computing the plasma response to the assumed EF, the compass scan predicted 2/1 EF, based on minimizing the computed resonant electromagnetic torque, can be made to match well with that of the EFC experiments using both even and odd parity coils. Moreover, the compass scan predicted vacuum EFs are found to be significantly differing from the true intrinsic EF used as input to the MARS-F code. While the puzzling result remains to be fully resolved, the results from this study offer an improved understanding of the EFC experiments and the compass scan technique for determining the intrinsic resonant EF.

  3. TWANG-PIC, a novel gyro-averaged one-dimensional particle-in-cell code for interpretation of gyrotron experiments

    NASA Astrophysics Data System (ADS)

    Braunmueller, F.; Tran, T. M.; Vuillemin, Q.; Alberti, S.; Genoud, J.; Hogge, J.-Ph.; Tran, M. Q.

    2015-06-01

    A new gyrotron simulation code for simulating the beam-wave interaction using a monomode time-dependent self-consistent model is presented. The new code TWANG-PIC is derived from the trajectory-based code TWANG by describing the electron motion in a gyro-averaged one-dimensional Particle-In-Cell (PIC) approach. In comparison to common PIC-codes, it is distinguished by its computation speed, which makes its use in parameter scans and in experiment interpretation possible. A benchmark of the new code is presented as well as a comparative study between the two codes. This study shows that the inclusion of a time-dependence in the electron equations, as it is the case in the PIC-approach, is mandatory for simulating any kind of non-stationary oscillations in gyrotrons. Finally, the new code is compared with experimental results and some implications of the violated model assumptions in the TWANG code are disclosed for a gyrotron experiment in which non-stationary regimes have been observed and for a critical case that is of interest in high power gyrotron development.

  4. TWANG-PIC, a novel gyro-averaged one-dimensional particle-in-cell code for interpretation of gyrotron experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braunmueller, F., E-mail: falk.braunmueller@epfl.ch; Tran, T. M.; Alberti, S.

    A new gyrotron simulation code for simulating the beam-wave interaction using a monomode time-dependent self-consistent model is presented. The new code TWANG-PIC is derived from the trajectory-based code TWANG by describing the electron motion in a gyro-averaged one-dimensional Particle-In-Cell (PIC) approach. In comparison to common PIC-codes, it is distinguished by its computation speed, which makes its use in parameter scans and in experiment interpretation possible. A benchmark of the new code is presented as well as a comparative study between the two codes. This study shows that the inclusion of a time-dependence in the electron equations, as it is the case in the PIC-approach, is mandatory for simulating any kind of non-stationary oscillations in gyrotrons. Finally, the new code is compared with experimental results and some implications of the violated model assumptions in the TWANG code are disclosed for a gyrotron experiment in which non-stationary regimes have been observed and for a critical case that is of interest in high power gyrotron development.

  5. Compressive spectral testbed imaging system based on thin-film color-patterned filter arrays.

    PubMed

    Rueda, Hoover; Arguello, Henry; Arce, Gonzalo R

    2016-11-20

    Compressive spectral imaging systems can reliably capture multispectral data using far fewer measurements than traditional scanning techniques. In this paper, a thin-film patterned filter array-based compressive spectral imager is demonstrated, including its optical design and implementation. The use of a patterned filter array entails a single-step three-dimensional spatial-spectral coding on the input data cube, which provides higher flexibility on the selection of voxels being multiplexed on the sensor. The patterned filter array is designed and fabricated with micrometer pitch size thin films, referred to as pixelated filters, with three different wavelengths. The performance of the system is evaluated in terms of references measured by a commercially available spectrometer and the visual quality of the reconstructed images. Different distributions of the pixelated filters, including random and optimized structures, are explored.

  6. A Picture is Worth 1,000 Words. The Use of Clinical Images in Electronic Medical Records.

    PubMed

    Ai, Angela C; Maloney, Francine L; Hickman, Thu-Trang; Wilcox, Allison R; Ramelson, Harley; Wright, Adam

    2017-07-12

    To understand how clinicians utilize image uploading tools in a homegrown electronic health record (EHR) system, a content analysis of patient notes containing non-radiological images from the EHR was conducted. Images from 4,000 random notes from July 1, 2009 - June 30, 2010 were reviewed and manually coded. Codes were assigned to four properties of the image: (1) image type, (2) role of image uploader (e.g. MD, NP, PA, RN), (3) practice type (e.g. internal medicine, dermatology, ophthalmology), and (4) image subject. 3,815 images from image-containing notes stored in the EHR were reviewed and manually coded. Of those images, 32.8% were clinical and 66.2% were non-clinical. The most common types of the clinical images were photographs (38.0%), diagrams (19.1%), and scanned documents (14.4%). MDs uploaded 67.9% of clinical images, followed by RNs with 10.2%, and genetic counselors with 6.8%. Dermatology (34.9%), ophthalmology (16.1%), and general surgery (10.8%) uploaded the most clinical images. The content of clinical images referencing body parts varied, with 49.8% of those images focusing on the head and neck region, 15.3% focusing on the thorax, and 13.8% focusing on the lower extremities. The diversity of image types, content, and uploaders within a homegrown EHR system reflected the versatility and importance of the image uploading tool. Understanding how users utilize image uploading tools in a clinical setting highlights important considerations for designing better EHR tools and the importance of interoperability between EHR systems and other health technology.

  7. Evaluation of Key Factors Impacting Feeding Safety in the Neonatal Intensive Care Unit: A Systematic Review.

    PubMed

    Matus, Bethany A; Bridges, Kayla M; Logomarsino, John V

    2018-06-21

    Individualized feeding care plans and safe handling of milk (human or formula) are critical in promoting growth, immune function, and neurodevelopment in the preterm infant. Feeding errors and disruptions or limitations to feeding processes in the neonatal intensive care unit (NICU) are associated with negative safety events. Feeding errors include contamination of milk and delivery of incorrect or expired milk and may result in adverse gastrointestinal illnesses. The purpose of this review was to evaluate the effect(s) of centralized milk preparation, use of trained technicians, use of bar code-scanning software, and collaboration between registered dietitians and registered nurses on feeding safety in the NICU. A systematic review of the literature was completed, and 12 articles were selected as relevant to search criteria. Study quality was evaluated using the Downs and Black scoring tool. An evaluation of human studies indicated that the use of centralized milk preparation, trained technicians, bar code-scanning software, and possible registered dietitian involvement decreased feeding-associated error in the NICU. A state-of-the-art NICU includes a centralized milk preparation area staffed by trained technicians, care supported by bar code-scanning software, and utilization of a registered dietitian to improve patient safety. These resources will provide nurses more time to focus on nursing-specific neonatal care. Further research is needed to evaluate the impact of factors related to feeding safety in the NICU as well as potential financial benefits of these quality improvement opportunities.

  8. Comparison of three coding strategies for a low cost structure light scanner

    NASA Astrophysics Data System (ADS)

    Xiong, Hanwei; Xu, Jun; Xu, Chenxi; Pan, Ming

    2014-12-01

    Coded structured light is widely used for 3D scanning, and different coding strategies are adopted to suit different goals. In this paper, three coding strategies are compared, and one of them is selected to implement a low-cost structured light scanner for under €100. To reach this goal, the projector and the video camera must be as cheap as possible, which leads to some problems related to light coding. With a very cheap projector, complex intensity patterns cannot be generated; even if they could be, they could not be captured reliably by a very cheap camera. Based on Gray code, three different strategies are implemented and compared, called phase-shift, line-shift, and bit-shift, respectively. The bit-shift Gray code is the contribution of this paper, in which a simple, stable light pattern is used to generate dense (mean point spacing < 0.4 mm) and accurate (mean error < 0.1 mm) results. The full algorithm details and some examples are presented in the paper.
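
    For reference, the sketch below generates standard binary-reflected Gray-code stripe patterns and decodes a column index from the recovered bit planes. It shows only the generic Gray-code part shared by all three strategies; the paper's bit-shift variant (shifting the planes) is not reproduced here.

```python
import numpy as np

def gray_code_patterns(width, n_bits):
    """Generate binary Gray-code stripe patterns (one row per bit plane) for a
    projector `width` pixels wide; column c gets the n-bit Gray code of c.

    A generic Gray-code generator for structured light; the paper's bit-shift
    variant additionally shifts these planes, which is not reproduced here."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                       # binary-reflected Gray code
    bits = (gray[None, :] >> np.arange(n_bits - 1, -1, -1)[:, None]) & 1
    return bits.astype(np.uint8)                    # shape (n_bits, width)

def decode_gray(bits):
    """Recover the column index from a stack of decoded bit planes."""
    gray = np.zeros(bits.shape[1], dtype=int)
    for plane in bits:                              # assemble MSB first
        gray = (gray << 1) | plane
    binary = gray.copy()                            # Gray -> binary conversion
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary

patterns = gray_code_patterns(width=1024, n_bits=10)
assert np.array_equal(decode_gray(patterns), np.arange(1024))
print(patterns.shape)   # (10, 1024)
```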

  9. Visual scanning with or without spatial uncertainty and time-sharing performance

    NASA Technical Reports Server (NTRS)

    Liu, Yili; Wickens, Christopher D.

    1989-01-01

    An experiment is reported that examines the pattern of task interference between visual scanning as a sequential and selective attention process and other concurrent spatial or verbal processing tasks. A distinction is proposed between visual scanning with or without spatial uncertainty regarding the possible differential effects of these two types of scanning on interference with other concurrent processes. The experiment required the subject to perform a simulated primary tracking task, which was time-shared with a secondary spatial or verbal decision task. The relevant information that was needed to perform the decision tasks was displayed with or without spatial uncertainty. The experiment employed a 2 x 2 x 2 design with types of scanning (with or without spatial uncertainty), expected scanning distance (low/high), and codes of concurrent processing (spatial/verbal) as the three experimental factors. The results provide strong evidence that visual scanning as a spatial exploratory activity produces greater task interference with concurrent spatial tasks than with concurrent verbal tasks. Furthermore, spatial uncertainty in visual scanning is identified to be the crucial factor in producing this differential effect.

  10. Are developments in mental scanning and mental rotation related?

    PubMed Central

    Wimmer, Marina C.; Robinson, Elizabeth J.; Doherty, Martin J.

    2017-01-01

    The development and relation of mental scanning and mental rotation were examined in 4-, 6-, 8-, and 10-year-old children and adults (N = 102). Based on previous findings from adults and ageing populations, the key question was whether they develop as a set of related abilities and become increasingly differentiated or are unrelated abilities per se. Findings revealed that both mental scanning and rotation abilities develop between 4 and 6 years of age. Specifically, 4-year-olds showed no difference in accuracy between mental scanning and no-scanning trials, whereas all older children and adults made more errors in scanning trials. Additionally, only a minority of 4-year-olds showed a linear increase in response time with increasing rotation angle difference between two stimuli, in contrast to all older participants. Despite similar developmental trajectories, mental scanning and rotation performances were unrelated. Thus, adding to research findings from adults, mental scanning and rotation appear to develop as a set of unrelated abilities from the outset. Different underlying abilities such as visual working memory and spatial coding versus representing past and future events are discussed. PMID:28207810

  11. A Novel Method for Estimating Transgender Status Using Electronic Medical Records

    PubMed Central

    Roblin, Douglas; Barzilay, Joshua; Tolsma, Dennis; Robinson, Brandi; Schild, Laura; Cromwell, Lee; Braun, Hayley; Nash, Rebecca; Gerth, Joseph; Hunkeler, Enid; Quinn, Virginia P.; Tangpricha, Vin; Goodman, Michael

    2016-01-01

    Background We describe a novel algorithm for identifying transgender people and determining their male-to-female (MTF) or female-to-male (FTM) identity in electronic medical records (EMR) of an integrated health system. Methods A SAS program scanned Kaiser Permanente Georgia EMR from January 2006 through December 2014 for relevant diagnostic codes, and presence of specific keywords (e.g., “transgender” or “transsexual”) in clinical notes. Eligibility was verified by review of de-identified text strings containing targeted keywords, and if needed, by an additional in-depth review of records. Once transgender status was confirmed, FTM or MTF identity was assessed using a second SAS program and another round of text string reviews. Results Of 813,737 members, 271 were identified as possibly transgender: 137 through keywords only, 25 through diagnostic codes only, and 109 through both codes and keywords. Of these individuals, 185 (68%, 95% confidence interval [CI]: 62-74%) were confirmed as definitely transgender. The proportions (95% CIs) of definite transgender status among persons identified via keywords, diagnostic codes, and both were 45% (37-54%), 56% (35-75%), and 100% (96-100%), respectively. Of the 185 definitely transgender people, 99 (54%, 95% CI: 46-61%) were MTF, 84 (45%, 95% CI: 38-53%) were FTM. For two persons, gender identity remained unknown. Prevalence of transgender people (per 100,000 members) was 4.4 (95% CI: 2.6-7.4) in 2006 and 38.7 (95% CI: 32.4-46.2) in 2014. Conclusions The proposed method of identifying candidates for transgender health studies is low cost and relatively efficient. It can be applied in other similar health care systems. PMID:26907539

  12. The FBI wavelet/scalar quantization standard for gray-scale fingerprint image compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J.N.; Brislawn, C.M.; Hopper, T.

    1993-05-01

    The FBI has recently adopted a standard for the compression of digitized 8-bit gray-scale fingerprint images. The standard is based on scalar quantization of a 64-subband discrete wavelet transform decomposition of the images, followed by Huffman coding. Novel features of the algorithm include the use of symmetric boundary conditions for transforming finite-length signals and a subband decomposition tailored for fingerprint images scanned at 500 dpi. The standard is intended for use in conjunction with ANSI/NBS-CLS 1-1993, American National Standard Data Format for the Interchange of Fingerprint Information, and the FBI's Integrated Automated Fingerprint Identification System.

  13. The FBI wavelet/scalar quantization standard for gray-scale fingerprint image compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J.N.; Brislawn, C.M.; Hopper, T.

    1993-01-01

    The FBI has recently adopted a standard for the compression of digitized 8-bit gray-scale fingerprint images. The standard is based on scalar quantization of a 64-subband discrete wavelet transform decomposition of the images, followed by Huffman coding. Novel features of the algorithm include the use of symmetric boundary conditions for transforming finite-length signals and a subband decomposition tailored for fingerprint images scanned at 500 dpi. The standard is intended for use in conjunction with ANSI/NBS-CLS 1-1993, American National Standard Data Format for the Interchange of Fingerprint Information, and the FBI's Integrated Automated Fingerprint Identification System.

  14. Automated data capture from free-text radiology reports to enhance accuracy of hospital inpatient stroke codes.

    PubMed

    Flynn, Robert W V; Macdonald, Thomas M; Schembri, Nicola; Murray, Gordon D; Doney, Alexander S F

    2010-08-01

    Much potentially useful clinical information for pharmacoepidemiological research is contained in unstructured free-text documents and is not readily available for analysis. Routine health data such as Scottish Morbidity Records (SMR01) frequently use generic 'stroke' codes. Free-text Computerised Radiology Information System (CRIS) reports have the potential to provide this missing detail. We aimed to increase the number of stroke-type-specific diagnoses by augmenting SMR01 with data derived from CRIS reports and to assess the accuracy of this methodology. SMR01 codes describing first-ever-stroke admissions in Tayside, Scotland from 1994 to 2005 were linked to CRIS CT-brain scan reports occurring within 14 days of admission. Software was developed to parse the text and elicit details of stroke type using keyword matching. An algorithm was iteratively developed to differentiate intracerebral haemorrhage (ICH) from ischaemic stroke (IS) against a training set of reports with pathophysiologically precise SMR01 codes. This algorithm was then applied to CRIS reports associated with generic SMR01 codes. To establish the accuracy of the algorithm a sample of 150 ICH and 150 IS reports were independently classified by a stroke physician. There were 8419 SMR01 coded first-ever strokes. The proportion of patients with pathophysiologically clear diagnoses doubled from 2745 (32.6%) to 5614 (66.7%). The positive predictive value was 94.7% (95%CI 89.8-97.3) for IS and 76.7% (95%CI 69.3-82.7) for haemorrhagic stroke. A free-text processing approach was acceptably accurate at identifying IS, but not ICH. This approach could be adapted to other studies where radiology reports may be informative. 2010 John Wiley & Sons, Ltd.
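
    A toy version of the keyword-matching idea, classifying a CT-brain report as haemorrhagic or ischaemic from indicative phrases with simple negation handling, is sketched below. The keyword lists, negation rule, and precedence are illustrative assumptions, not the published, iteratively tuned algorithm.

```python
import re

# Toy keyword matcher: classify a CT-brain report as haemorrhagic (ICH) or
# ischaemic (IS) stroke from indicative phrases. Keyword lists, the negation
# rule, and the ICH-first precedence are illustrative assumptions.
HAEMORRHAGE_TERMS = [r"intracerebral ha?emorrhage", r"ha?ematoma", r"acute bleed"]
ISCHAEMIC_TERMS = [r"infarct\w*", r"ischa?emic change", r"low attenuation"]
NEGATION = r"\bno (?:evidence of )?"

def classify_report(text):
    text = text.lower()
    def hit(patterns):
        # A pattern counts only if it is not immediately preceded by a negation.
        return any(re.search(p, text) and not re.search(NEGATION + p, text)
                   for p in patterns)
    if hit(HAEMORRHAGE_TERMS):
        return "ICH"
    if hit(ISCHAEMIC_TERMS):
        return "IS"
    return "unclassified"

print(classify_report("Acute infarct in the left MCA territory. No haemorrhage."))
print(classify_report("Large right parietal haematoma with mass effect."))
```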

  15. The use of computer-generated color graphic images for transient thermal analysis. [for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, C. L. W.; Meissner, F. T.; Hall, J. B.

    1979-01-01

    Color computer graphics techniques were investigated as a means of rapidly scanning and interpreting large sets of transient heating data. The data presented were generated to support the conceptual design of a heat-sink thermal protection system (TPS) for a hypersonic research airplane. Color-coded vector and raster displays of the numerical geometry used in the heating calculations were employed to analyze skin thicknesses and surface temperatures of the heat-sink TPS under a variety of trajectory flight profiles. Both vector and raster displays proved to be effective means for rapidly identifying heat-sink mass concentrations, regions of high heating, and potentially adverse thermal gradients. The color-coded (raster) surface displays are a very efficient means for displaying surface-temperature and heating histories, and thereby the more stringent design requirements can quickly be identified. The related hardware and software developments required to implement both the vector and the raster displays for this application are also discussed.

  16. Assessment of Quadrivalent Human Papillomavirus Vaccine Safety Using the Self-Controlled Tree-Temporal Scan Statistic Signal-Detection Method in the Sentinel System.

    PubMed

    Yih, W Katherine; Maro, Judith C; Nguyen, Michael; Baker, Meghan A; Balsbaugh, Carolyn; Cole, David V; Dashevsky, Inna; Mba-Jonas, Adamma; Kulldorff, Martin

    2018-06-01

    The self-controlled tree-temporal scan statistic-a new signal-detection method-can evaluate whether any of a wide variety of health outcomes are temporally associated with receipt of a specific vaccine, while adjusting for multiple testing. Neither health outcomes nor postvaccination potential periods of increased risk need be prespecified. Using US medical claims data in the Food and Drug Administration's Sentinel system, we employed the method to evaluate adverse events occurring after receipt of quadrivalent human papillomavirus vaccine (4vHPV). Incident outcomes recorded in emergency department or inpatient settings within 56 days after first doses of 4vHPV received by 9- through 26.9-year-olds in 2006-2014 were identified using International Classification of Diseases, Ninth Revision, diagnosis codes and analyzed by pairing the new method with a standard hierarchical classification of diagnoses. On scanning diagnoses of 1.9 million 4vHPV recipients, 2 statistically significant categories of adverse events were found: cellulitis on days 2-3 after vaccination and "other complications of surgical and medical procedures" on days 1-3 after vaccination. Cellulitis is a known adverse event. Clinically informed investigation of electronic claims records of the patients with "other complications" did not suggest any previously unknown vaccine safety problem. Considering that thousands of potential short-term adverse events and hundreds of potential risk intervals were evaluated, these findings add significantly to the growing safety record of 4vHPV.

  17. JOVE NASA-FIT program: Microgravity and aeronomy projects

    NASA Technical Reports Server (NTRS)

    Patterson, James D.; Mantovani, James G.; Rassoul, Hamid K.

    1994-01-01

    This semi-annual status report is divided into two sections: Scanning Tunneling Microscopy Lab and Aeronomy Lab. The Scanning Tunneling Microscopy (STM) research involves studying solar cell materials using the STM built at Florida Tech with a portion of our initial JOVE equipment funding. One result of the participation in the FSEC project will be the design and construction of a portable STM system. This could serve as a prototype STM system which might be used on the Space Shuttle during a Spacelab mission, or onboard the proposed Space Station. The scanning tunneling microscope can only image the surface structure of electrically conductive crystals; building an atomic force microscope (AFM) will allow the surface structure of any sample to be imaged, regardless of its conductivity. With regard to the Aeronomy Lab, a total of four different mesospheric oxygen emission codes were created to calculate the intensity along the line of sight of the shuttle observations for the 2972 Å, Herzberg I, Herzberg II, and Chamberlain bands. The thermosphere-ionosphere coupling project was completed with two major accomplishments: collection of 500 data points on the modulation of neutral wind with geophysical variables, and establishment of constraints on the behavior of the height of the ionosphere as a result of interaction between geophysical and geometrical factors. The magnetotail plasma project has been centered around familiarization with the subject in the form of a literature search and preprocessing of IMP-8 data.

  18. DVB-S2 Experiment over NASA's Space Network

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Evans, Michael A.; Tollis, Nicholas S.

    2017-01-01

    The commercial DVB-S2 standard was successfully demonstrated over NASA's Space Network (SN) and the Tracking and Data Relay Satellite System (TDRSS) during testing conducted September 20-22, 2016. This test was a joint effort between NASA Glenn Research Center (GRC) and Goddard Space Flight Center (GSFC) to evaluate the performance of DVB-S2 as an alternative to traditional NASA SN waveforms. Two distinct sets of tests were conducted: one was sourced from the Space Communication and Navigation (SCaN) Testbed, an external payload on the International Space Station, and the other was sourced from GRC's S-band ground station to emulate a Space Network user through TDRSS. In both cases, a commercial off-the-shelf (COTS) receiver made by Newtec was used to receive the signal at the White Sands Complex. Using the SCaN Testbed, peak data rates of 5.7 Mbps were demonstrated. Peak data rates of 33 Mbps were demonstrated over the GRC S-band ground station through a 10 MHz channel over TDRSS, using 32-amplitude phase shift keying (APSK) and a rate 8/9 low density parity check (LDPC) code. Advanced features of the DVB-S2 standard were evaluated, including variable and adaptive coding and modulation (VCM/ACM), as well as an adaptive digital pre-distortion (DPD) algorithm. These features provided additional data throughput and increased link performance reliability. This testing has shown that commercial standards are a viable, low-cost alternative for future Space Network users.
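
    The reported 33 Mbps is consistent with a simple back-of-the-envelope estimate; the sketch below assumes a 0.35 pulse-shaping roll-off to fit the 10 MHz channel and ignores DVB-S2 framing and pilot overhead, so the numbers are illustrative rather than the actual test configuration.

      # Rough DVB-S2 information-rate estimate (framing/pilot overhead ignored).
      bandwidth_hz = 10e6                               # allocated TDRSS channel
      rolloff = 0.35                                    # assumed root-raised-cosine roll-off
      symbol_rate = bandwidth_hz / (1 + rolloff)        # ~7.4 Msym/s
      bits_per_symbol = 5                               # 32-APSK
      code_rate = 8 / 9                                 # LDPC rate 8/9

      info_rate = symbol_rate * bits_per_symbol * code_rate
      print(f"approximate information rate: {info_rate / 1e6:.1f} Mbps")   # ~32.9 Mbps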

  19. Workshop on the Physics and Chemistry of Mercury Cadmium Telluride and Related II-VI Compounds Held in San Diego, California on October 3, 4, 5, 1989

    DTIC Science & Technology

    1989-11-01

    spatially scanned to change the position of the laser beam on the diode. The diode leakage current at 100 mV reverse bias and at 77 K was measured as a... by damage incurred at all the positions the beam has passed until this point during the scan. The current scan is transformed into a false color (or a...

  20. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    PubMed

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

    Quick Response (QR) Codes are standard in supply management and seen with increasing frequency in advertisements. They are now present regularly in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS] and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, how QR codes differ from Web links, and how QR codes facilitate the distribution of educational content.
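
    As a minimal sketch of how such a code can be produced for a slide, the snippet below uses the open-source Python qrcode package (one free generator among many, not necessarily the tool the authors used); the URL is a placeholder.

      # pip install qrcode[pil]
      import qrcode

      # Encode a link to supplementary educational content (hypothetical URL).
      img = qrcode.make("https://example.org/tumor-board/case-review-video")
      img.save("tumor_board_case_qr.png")   # paste this PNG into the presentation slide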

  1. Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.

    PubMed

    Limongi, Roberto; Silva, Angélica M

    2016-11-01

    The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.

  2. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography.

    PubMed

    Hamilton, Liberty S; Chang, David L; Lee, Morgan B; Chang, Edward F

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users.
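
    The package documents its own interface; purely as a generic illustration of the electrode-warping step such pipelines perform, the sketch below pushes electrode coordinates through an assumed 4 x 4 subject-to-atlas affine transform. It is not img_pipe code, and the coordinates and transform are placeholders.

      import numpy as np

      def warp_electrodes(coords_mm, affine_subj_to_atlas):
          """Map an N x 3 array of electrode coordinates through a 4 x 4 affine."""
          homogeneous = np.c_[coords_mm, np.ones(len(coords_mm))]   # N x 4
          warped = homogeneous @ affine_subj_to_atlas.T
          return warped[:, :3]

      # Placeholder example: identity rotation plus a small translation.
      affine = np.eye(4)
      affine[:3, 3] = [2.0, -1.5, 0.5]
      electrodes = np.array([[31.2, -18.4, 52.0], [28.7, -15.9, 49.3]])
      print(warp_electrodes(electrodes, affine))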

  3. Monitoring eye movements to investigate the picture superiority effect in spatial memory.

    PubMed

    Cattaneo, Zaira; Rosen, Mitchell; Vecchi, Tomaso; Pelz, Jeff B

    2008-01-01

    Spatial memory is usually better for iconic than for verbal material. Our aim was to assess whether this effect is related to the way iconic and verbal targets are viewed when people have to memorize their locations. Eye movements were recorded while participants memorized the locations of images or words. Images received fewer, but longer, gazes than words. Longer gazes on images might reflect greater attention devoted to images due to their higher sensorial distinctiveness and/or the generation, for images, of an additional phonological code beyond the immediately available visual code. We found that words were scanned mainly from left to right, while a more heterogeneous scanning strategy characterized encoding of images. This suggests that iconic configurations tend to be maintained as global integrated representations in which all the item/location pairs are simultaneously present, whilst verbal configurations are maintained through more sequential processes.

  4. Efficient Transition State Optimization of Periodic Structures through Automated Relaxed Potential Energy Surface Scans.

    PubMed

    Plessow, Philipp N

    2018-02-13

    This work explores how constrained linear combinations of bond lengths can be used to optimize transition states in periodic structures. Scanning of constrained coordinates is a standard approach for molecular codes with localized basis functions, where a full set of internal coordinates is used for optimization. Common plane-wave codes for periodic boundary conditions rely almost exclusively on Cartesian coordinates. An implementation of constrained linear combinations of bond lengths with Cartesian coordinates is described. Along with optimization of the value of the constrained coordinate toward the transition state, this allows transition-state optimization within a single calculation. The approach is suitable for transition states that can be well described in terms of broken and formed bonds. In particular, the implementation is shown to be effective and efficient in the optimization of transition states in zeolite-catalyzed reactions, which have high relevance in industrial processes.
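
    A toy relaxed scan over a constrained linear combination of bond lengths can be written in a few lines; the sketch below uses an invented two-coordinate model potential and SciPy, so it only illustrates the scanning idea and does not represent the paper's plane-wave implementation.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Toy potential in two bond lengths r1, r2 (arbitrary units) with
      # reactant/product minima at r1 - r2 = -1 and +1 and a saddle in between.
      def V(r1, r2):
          q = r1 - r2                    # constrained coordinate (difference of bond lengths)
          s = r1 + r2                    # coordinate relaxed at each scan step
          return (q**2 - 1.0)**2 + 0.5 * (s - 2.5 - 0.2 * q)**2

      def relaxed_energy(q):
          # Hold q fixed and minimize the energy over the remaining coordinate s.
          res = minimize_scalar(lambda s: V((s + q) / 2.0, (s - q) / 2.0))
          return res.fun

      qs = np.linspace(-1.3, 1.3, 27)
      profile = [relaxed_energy(q) for q in qs]
      i_ts = int(np.argmax(profile))     # maximum of the relaxed profile approximates the TS
      print(f"approximate transition state near q = {qs[i_ts]:+.2f}")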

  5. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography

    PubMed Central

    Hamilton, Liberty S.; Chang, David L.; Lee, Morgan B.; Chang, Edward F.

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users. PMID:29163118

  6. Lidar performance analysis

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1994-01-01

    Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function. This program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standards. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.

  7. Physicochemical analog for modeling superimposed and coded memories

    NASA Astrophysics Data System (ADS)

    Ensanian, Minas

    1992-07-01

    The mammalian brain is distinguished by a life-time of memories being stored within the same general region of physicochemical space, and having two extraordinary features. First, memories to varying degrees are superimposed, as well as coded. Second, instantaneous recall of past events can often be affected by relatively simple, and seemingly unrelated, sensory clues. For the purposes of attempting to mathematically model such complex behavior, and for gaining additional insights, it would be highly advantageous to be able to simulate or mimic similar behavior in a nonbiological entity where some analogical parameters of interest can reasonably be controlled. It has recently been discovered that in nonlinear accumulative metal fatigue, memories (related to mechanical deformation) can be superimposed and coded in the crystal lattice, and that memory, that is, the total number of stress cycles, can be recalled (determined) by scanning not the surfaces but the `edges' of the objects. The new scanning technique known as electrotopography (ETG) now makes the state space modeling of metallic networks possible. The author provides an overview of the new field and outlines the areas that are of immediate interest to the science of artificial neural networks.

  8. Inversions of synthetic umbral flashes: Effects of scanning time on the inferred atmospheres

    NASA Astrophysics Data System (ADS)

    Felipe, T.; Socas-Navarro, H.; Przybylski, D.

    2018-06-01

    Context. The use of instruments that record narrowband images at selected wavelengths is a common approach in solar observations. They allow scanning of a spectral line by sampling the Stokes profiles with two-dimensional images at each line position, but require a compromise between spectral resolution and temporal cadence. The interpretation and inversion of spectropolarimetric data generally neglect changes in the solar atmosphere during the scanning of line profiles. Aims: We evaluate the impact of the time-dependent acquisition of various wavelengths on the inversion of spectropolarimetric profiles from chromospheric lines during umbral flashes. Methods: Numerical simulations of nonlinear wave propagation in a sunspot model were performed with the code MANCHA. Synthetic Stokes parameters in the Ca II 8542 Å line in NLTE were computed for an umbral flash event using the code NICOLE. Artificial profiles with the same wavelength coverage and temporal cadence from reported observations were constructed and inverted. The inferred atmospheric stratifications were compared with the original simulated models. Results: The inferred atmospheres provide a reasonable characterization of the thermodynamic properties of the atmosphere during most of the phases of the umbral flash. The Stokes profiles present apparent wavelength shifts and other spurious deformations at the early stages of the flash, when the shock wave reaches the formation height of the Ca II 8542 Å line. These features are misinterpreted by the inversion code, which can return unrealistic atmospheric models from a good fit of the Stokes profiles. The misguided results include flashed atmospheres with strong downflows, even though the simulation exhibits upflows during the umbral flash, and large variations in the magnetic field strength. Conclusions: Our analyses validate the inversion of Stokes profiles acquired by sequentially scanning certain selected wavelengths of a line profile, even in the case of rapidly changing chromospheric events such as umbral flashes. However, the inversion results are unreliable during a short period at the development phase of the flash.

  9. An object-oriented programming system for the integration of internet-based bioinformatics resources.

    PubMed

    Beveridge, Allan

    2006-01-01

    The Internet consists of a vast inhomogeneous reservoir of data. Developing software that can integrate a wide variety of different data sources is a major challenge that must be addressed for the realisation of the full potential of the Internet as a scientific research tool. This article presents a semi-automated object-oriented programming system for integrating web-based resources. We demonstrate that the current Internet standards (HTML, CGI [common gateway interface], Java, etc.) can be exploited to develop a data retrieval system that scans existing web interfaces and then uses a set of rules to generate new Java code that can automatically retrieve data from the Web. The validity of the software has been demonstrated by testing it on several biological databases. We also examine the current limitations of the Internet and discuss the need for the development of universal standards for web-based data.

  10. The Architecture Design of Detection and Calibration System for High-voltage Electrical Equipment

    NASA Astrophysics Data System (ADS)

    Ma, Y.; Lin, Y.; Yang, Y.; Gu, Ch; Yang, F.; Zou, L. D.

    2018-01-01

    With the construction of the Material Quality Inspection Center of the Shandong electric power company, the Electric Power Research Institute has taken on more work in quality analysis and laboratory calibration for high-voltage electrical equipment, making the construction of an information system urgent. In this paper we design a consolidated system that implements electronic management and online process automation for material sampling, test apparatus detection, and field testing. These three tasks use QR code scanning, online Word editing, and electronic signatures. The techniques simplify the complex processes of warehouse management and test report transfer, and largely reduce manual procedures. The construction of the standardized detection information platform realizes integrated management of high-voltage electrical equipment from networking and operation to periodic detection. According to an evaluation of system operation, report transfer is twice as fast, and data queries are easier and faster.

  11. Implementation of a web-based medication tracking system in a large academic medical center.

    PubMed

    Calabrese, Sam V; Williams, Jonathan P

    2012-10-01

    Pharmacy workflow efficiencies achieved through the use of an electronic medication-tracking system are described. Medication dispensing turnaround times at the inpatient pharmacy of a large hospital were evaluated before and after transition from manual medication tracking to a Web-based tracking process involving sequential bar-code scanning and real-time monitoring of medication status. The transition was carried out in three phases: (1) a workflow analysis, including the identification of optimal points for medication scanning with hand-held wireless devices, (2) the phased implementation of an automated solution and associated hardware at a central dispensing pharmacy and three satellite locations, and (3) postimplementation data collection to evaluate the impact of the new tracking system and areas for improvement. Relative to the manual tracking method, electronic medication tracking allowed the capture of far more data points, enabling the pharmacy team to delineate the time required for each step of the medication dispensing process and to identify the steps most likely to involve delays. A comparison of baseline and postimplementation data showed substantial reductions in overall medication turnaround times with the use of the Web-based tracking system (time reductions of 45% and 22% at the central and satellite sites, respectively). In addition to more accurate projections and documentation of turnaround times, the Web-based tracking system has facilitated quality-improvement initiatives. Implementation of an electronic tracking system for monitoring the delivery of medications provided a comprehensive mechanism for calculating turnaround times and allowed the pharmacy to identify bottlenecks within the medication distribution system. Altering processes removed these bottlenecks and decreased delivery turnaround times.
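
    A minimal sketch of the bookkeeping such a system enables is shown below, assuming each bar-code scan is logged as an (order id, workflow step, timestamp) event; the step names and times are hypothetical, not the hospital's actual workflow.

      from datetime import datetime

      # Hypothetical scan log for one order: (order_id, workflow_step, timestamp).
      scans = [
          ("RX1001", "order_verified", datetime(2012, 3, 5, 9, 0)),
          ("RX1001", "dose_prepared", datetime(2012, 3, 5, 9, 18)),
          ("RX1001", "checked", datetime(2012, 3, 5, 9, 25)),
          ("RX1001", "delivered", datetime(2012, 3, 5, 9, 52)),
      ]

      def step_durations(events):
          """Minutes spent between consecutive scan points for one order."""
          events = sorted(events, key=lambda e: e[2])
          return [(f"{a} -> {b}", (t1 - t0).total_seconds() / 60)
                  for (_, a, t0), (_, b, t1) in zip(events, events[1:])]

      for step, minutes in step_durations(scans):
          print(f"{step:32s} {minutes:5.1f} min")
      print("total turnaround:", (scans[-1][2] - scans[0][2]).total_seconds() / 60, "min")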

  12. Research Performed within the Non-Destructive Evaluation Team at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Burns, Erin A.

    2004-01-01

    Non-destructive testing is essential in many fields of manufacturing and research in order to perform reliable examination of potentially damaged materials and parts without destroying the inherent structure of the materials. Thus, the Non-Destructive Evaluation (NDE) Team at NASA Glenn Research Center partakes in various projects to improve materials testing equipment as well as analyze materials, material defects, and material deficiencies. Due to the array of projects within the NDE Team at this time, five research aims were supplemental to some current projects. A literature survey of NDE and testing methodologies as related to rocks was performed. Also, Mars Expedition Rover technology was assessed to understand the requirements for instrumentation in harsh space environments (e.g. temperature). Potential instrumentation and technologies were also considered and documented. The literature survey provided background and potential sources for a proposal to acquire funding for ultrasonic instrumentation on board a future Mars expedition. The laboratory uses a Santec Systems AcousticScope AS200 acoustography system. LabVIEW code was written within the current program in order to improve the current performance of the acoustography system. A sample of Reinforced Carbon/Carbon (RCC) material from the leading edge of the space shuttle underwent various non-destructive tests (guided wave scanning, thermography, computed tomography, real time x-ray, etc.) in order to characterize its structure and examine possible defects. Guided wave scan data of a ceramic matrix composite (CMC) panel was reanalyzed utilizing image correlations and signal processing variables. Additional guided wave scans and thermography were also performed on the CMC panel. These reevaluated data and images will be used in future presentations and publications. An additional axis for the guided wave scanner was designed, constructed, and implemented. This additional axis allowed incremental spacing of the previously fixed transducers for ultrasonic velocity measurements.

  13. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  14. 2010 Workplace and Gender Relations Survey of Active Duty Members. Administration, Datasets, and Codebook

    DTIC Science & Technology

    2011-04-01

    from the survey litho code list if a survey form was sent, or independently if only a letter was sent. Ticket Numbers for Web Survey Access: Prior... The variables BATCH, SERIAL, and LITHO uniquely identify each returned survey. LITHO is the lithocode scanned from the survey; BATCH and SERIAL are the...

  15. Evaluation and application of a fast module in a PLC based interlock and control system

    NASA Astrophysics Data System (ADS)

    Zaera-Sanz, M.

    2009-08-01

    The LHC Beam Interlock system requires a controller performing a simple matrix function to collect the different beam dump requests. To satisfy the expected safety level of the Interlock, the system should be robust and reliable. The PLC is a promising candidate to fulfil both aspects, but is too slow to meet the expected response time, which is of the order of microseconds. Siemens has introduced a so-called fast module (FM352-5 Boolean Processor). It provides independent and extremely fast control of a process within a larger control system, using an onboard processor, a Field Programmable Gate Array (FPGA), to execute code in parallel, which results in extremely fast scan times. It is therefore interesting to investigate its features and to evaluate it as a possible candidate for the beam interlock system. This paper presents the results of this study, which could also be useful for other applications requiring fast processing with a PLC.
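
    As an illustration of the kind of matrix/mask logic such an interlock controller evaluates on every scan cycle, the sketch below raises a beam-dump request whenever any unmasked user system withdraws its beam permit; the system names and masking scheme are assumptions, not the actual LHC Beam Interlock logic or the FM352-5 programming model.

      # Illustrative interlock evaluation (toy model).
      USER_SYSTEMS = ["quench_protection", "beam_loss_monitors", "vacuum", "rf"]

      def dump_request(permits, maskable, masks):
          """permits/maskable/masks: dicts of booleans keyed by user system."""
          for name in USER_SYSTEMS:
              masked = maskable[name] and masks[name]
              if not permits[name] and not masked:
                  return True              # an unmasked permit was withdrawn -> dump the beam
          return False

      permits = {"quench_protection": True, "beam_loss_monitors": False,
                 "vacuum": True, "rf": True}
      maskable = {name: name == "rf" for name in USER_SYSTEMS}
      masks = {name: False for name in USER_SYSTEMS}
      print(dump_request(permits, maskable, masks))    # True: BLM permit withdrawn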

  16. Soul on Silicon.

    ERIC Educational Resources Information Center

    Kurzweil, Raymond C.

    1994-01-01

    Summarizes recent advances in computer simulation and "reverse engineering" technologies, highlighting the Human Genome Project to scan the human genetic code; artificial retina chips to copy the human retina's neural organization; high-speed, high-resolution Magnetic Resonance Imaging scanners; and the virtual book. Discusses…

  17. 3D laser scanning for quality control and assurance in bridge deck construction.

    DOT National Transportation Integrated Search

    2014-08-01

    The inspection of installations of rebar and other embedded components in bridge deck construction is a tedious task for field inspectors, requiring considerable field time for measurement and verification against code requirements. The verifica...

  18. Outpatients flow management and ophthalmic electronic medical records system in university hospital using Yahgee Document View.

    PubMed

    Matsuo, Toshihiko; Gochi, Akira; Hirakawa, Tsuyoshi; Ito, Tadashi; Kohno, Yoshihisa

    2010-10-01

    General electronic medical records systems remain insufficient for ophthalmology outpatient clinics from the viewpoint of dealing with many ophthalmic examinations and images in a large number of patients. Filing systems for documents and images by Yahgee Document View (Yahgee, Inc.) were introduced on the platform of a general electronic medical records system (Fujitsu, Inc.). An outpatient flow management system and an electronic medical records system for ophthalmology were constructed. All images from ophthalmic appliances were transported to Yahgee Image by the MaxFile gateway system (P4 Medic, Inc.). The flow of outpatients going through examinations such as visual acuity testing was monitored by the list "Ophthalmology Outpatients List" by Yahgee Workflow in addition to the list "Patients Reception List" by Fujitsu. Patients' identification numbers were scanned with bar code readers attached to ophthalmic appliances. Dual monitors were placed in doctors' rooms to show Fujitsu Medical Records on the left-hand monitor and ophthalmic charts of Yahgee Document on the right-hand monitor. The manually input visual acuity data and the automatically exported autorefractometry and non-contact tonometry data on a new template, MaxFile ED, were then automatically transported to designated boxes on the ophthalmic charts of Yahgee Document. Images such as fundus photographs, fluorescein angiograms, and optical coherence tomographic and ultrasound scans were viewed by Yahgee Image and copy-and-pasted to assigned boxes on the ophthalmic charts. Orders such as appointments, drug prescriptions, fee and diagnosis input, central laboratory tests, and surgical theater and ward room reservations were placed using functions of the Fujitsu electronic medical records system. The combination of the Fujitsu electronic medical records and Yahgee Document View systems enabled the University Hospital to examine the same number of outpatients as prior to the implementation of the computerized filing system.

  19. SU-E-T-594: Preliminary Active Scanning Results of KHIMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, C; Yang, T; Chang, S

    Purpose: To verify the design criteria for heavy ion beam irradiation, development of a prototype active scanning system was proposed. The active scanning system consists of scanning magnets, power supplies, beam monitors, an energy modulation system, and an irradiation control system. Methods: Each component of the active scanning system was first designed for a carbon beam. For fast ramping, a laminated yoke was proposed. To measure the incoming dose and profile, plate- and strip-type ion chambers were designed. A ridge filter and a range shifter were also manufactured. The scanning system was then modified to accept a 45 MeV proton beam because of the absence of a carbon ion beam in Korea. The system was installed in a beam line at MC-50, KIRAMS. The irradiation control system and planning software were also provided. Results: The scanning experiment was performed by drawing the KHIMA logo on GaF film. The logo was scanned with 237 scanning points through time-normalized intensity modulation. A grid-point scan was also performed to measure the scanning resolution and intensity resolution. Conclusion: A prototype active scanning system was successfully designed and manufactured, and an initial experiment to print a drawing on GaF film through the scanning system was completed. More experiments will be required to specify the system performance.

  20. Fast scanning mode and its realization in a scanning acoustic microscope

    NASA Astrophysics Data System (ADS)

    Ju, Bing-Feng; Bai, Xiaolong; Chen, Jian

    2012-03-01

    The scanning speed of the two-dimensional stage dominates the efficiency of mechanical scanning measurement systems. This paper focused on a detailed scanning time analysis of conventional raster and spiral scan modes and then proposed two fast alternative scanning modes. Performed on a self-developed scanning acoustic microscope (SAM), the measured images obtained by using the conventional scan mode and the fast scan modes are compared. The total scanning time is reduced by 29% with the two proposed fast scan modes. They offer a better solution for high-speed scanning without sacrificing system stability and do not introduce additional difficulties to the configuration of scanning measurement systems. They can easily be applied to mechanical scanning measurement systems with different driving actuators, such as piezoelectric, linear-motor, and dc-motor stages. The proposed fast raster and square spiral scan modes are realized in the SAM but are not specially designed for it. Therefore, they have universal adaptability and can be applied to other scanning measurement systems with two-dimensional mechanical scanning stages, such as the atomic force microscope or the scanning tunneling microscope.

  1. A system to track skin dose for neuro-interventional cone-beam computed tomography (CBCT)

    NASA Astrophysics Data System (ADS)

    Vijayan, Sarath; Xiong, Zhenyu; Rudin, Stephen; Bednarek, Daniel R.

    2016-03-01

    The skin-dose tracking system (DTS) provides a color-coded illustration of the cumulative skin-dose distribution on a closely matching 3D graphic of the patient during fluoroscopic interventions in real-time for immediate feedback to the interventionist. The skin-dose tracking utility of DTS has been extended to include cone-beam computed tomography (CBCT) of neurointerventions. While the DTS was developed to track the entrance skin dose including backscatter, a significant part of the dose in CBCT is contributed by exit primary radiation and scatter due to the many overlapping projections during the rotational scan. The variation of backscatter inside and outside the collimated beam was measured with radiochromic film and a curve was fit to obtain a scatter spread function that could be applied in the DTS. Likewise, the exit dose distribution was measured with radiochromic film for a single projection and a correction factor was determined as a function of path length through the head. Both of these sources of skin dose are added for every projection in the CBCT scan to obtain a total dose mapping over the patient graphic. Results show the backscatter to follow a sigmoidal falloff near the edge of the beam, extending outside the beam as far as 8 cm. The exit dose measured for a cylindrical CTDI phantom was nearly 10% of the entrance peak skin dose for the central ray. The dose mapping performed by the DTS for a CBCT scan was compared to that measured with radiochromic film and a CTDI-head phantom with good agreement.
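
    A schematic of how a measured falloff outside the collimated field could be folded into a per-projection dose accumulation is sketched below; the sigmoid width, field size, and per-projection dose are invented numbers, standing in for the film-measured scatter spread function and exit-dose correction used by the DTS.

      import numpy as np

      def backscatter_weight(dist_outside_cm, width_cm=2.0):
          """Sigmoidal falloff with distance outside the beam edge (toy parameters)."""
          return 1.0 / (1.0 + np.exp(4.0 * dist_outside_cm / width_cm - 4.0))

      # Accumulate dose on a 1-D strip of skin points over many projections; points
      # inside the collimated field get full weight, points outside get the sigmoid.
      skin_cm = np.linspace(-10, 10, 201)
      total_dose = np.zeros_like(skin_cm)
      for angle in np.linspace(0, 2 * np.pi, 180, endpoint=False):
          field_center = 3.0 * np.cos(angle)                   # toy beam motion on the skin
          dist_outside = np.abs(skin_cm - field_center) - 5.0  # 10 cm wide field (toy)
          weight = np.where(dist_outside <= 0, 1.0, backscatter_weight(dist_outside))
          total_dose += 0.05 * weight                          # 0.05 mGy per projection (toy)
      print(f"peak accumulated dose: {total_dose.max():.1f} mGy (toy numbers)")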

  2. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.

  3. A New Approach for the Semi-Automatic Texture Generation of the Buildings Facades from Terrestrial Laser Scanner Data

    NASA Astrophysics Data System (ADS)

    Oniga, E.

    2012-07-01

    The result of terrestrial laser scanning is an impressive number of spatial points, each of them characterized by position (the X, Y and Z co-ordinates), by the value of the laser reflectance, and by their real color, expressed as RGB (Red, Green, Blue) values. The color code for each LIDAR point is taken from the georeferenced digital images, taken with a high resolution panoramic camera incorporated in the scanner system. In this article I propose a new algorithm for semi-automatic texture generation, using the color information, i.e. the RGB values of every point acquired by terrestrial laser scanning technology, and the 3D surfaces defining the building facades, generated with the Leica Cyclone software. In the first step, the operator defines the limiting value, i.e. the minimum distance between a point and the closest surface. The second step consists of calculating the distances, or the perpendiculars drawn from each point to the closest surface. In the third step we associate the points, whose 3D coordinates are known, with the closest surface, depending on the limiting value. The fourth step consists of computing the Voronoi diagram for the points that belong to a surface. The final step automatically associates the RGB value of the color code with the corresponding polygon of the Voronoi diagram. The advantage of using this algorithm is that we can obtain, in a semi-automatic manner, a photorealistic 3D model of the building.
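
    Under simplifying assumptions (planar facade patches, and nearest-neighbor color assignment, which is equivalent to coloring by Voronoi cell in the facade plane), steps two through five can be condensed as in the sketch below; the parameter names and values are illustrative, not those of the published implementation.

      import numpy as np
      from scipy.spatial import cKDTree

      def texture_facade(points_xyz, points_rgb, plane_origin, u_axis, v_axis,
                         normal, max_dist=0.05, resolution=0.01, size=(4.0, 3.0)):
          """Associate scan points with one planar facade patch and color a texture
          grid by its nearest point (i.e. by Voronoi cell in the facade plane)."""
          # Steps 2-3: perpendicular distance to the plane, keep points within the limit.
          d = (points_xyz - plane_origin) @ normal
          keep = np.abs(d) <= max_dist
          uv = np.c_[(points_xyz[keep] - plane_origin) @ u_axis,
                     (points_xyz[keep] - plane_origin) @ v_axis]
          rgb = points_rgb[keep]
          # Steps 4-5: every texel takes the color of its nearest retained point.
          tree = cKDTree(uv)
          us = np.arange(0, size[0], resolution)
          vs = np.arange(0, size[1], resolution)
          grid = np.stack(np.meshgrid(us, vs), axis=-1).reshape(-1, 2)
          _, idx = tree.query(grid)
          return rgb[idx].reshape(len(vs), len(us), 3)    # texture image for the patch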

  4. Automated landmark extraction for orthodontic measurement of faces using the 3-camera photogrammetry methodology.

    PubMed

    Deli, Roberto; Di Gioia, Eliana; Galantucci, Luigi Maria; Percoco, Gianluca

    2010-01-01

    To set up a three-dimensional photogrammetric scanning system for precise landmark measurements, without any physical contact, using a low-cost and noninvasive digital photogrammetric solution, to support several needs in clinical orthodontics and/or surgical diagnosis. Thirty coded targets were directly applied onto the subject's face on the soft tissue landmarks, and then 3 simultaneous photos were acquired using photogrammetry under room lighting conditions. For comparison, a dummy head was digitized both with the photogrammetric technique and with the laser scanner Minolta Vivid 910i (Konica Minolta, Tokyo, Japan). The precision of the landmark measurements ranged between 0.017 and 0.029 mm. The system automatically measures the spatial position of face landmarks, from which distances and angles can be obtained. The facial measurements were compared with those obtained using laser scanning and a manual caliper. The adopted method gives higher precision than the others (0.022-mm mean value on points and 0.038-mm mean value on linear distances on a dummy head), is simple, and can be used easily as a standard routine. The study demonstrated the validity of photogrammetry for accurate digitization of human face landmarks. This research points out the potential of this low-cost photogrammetry approach for medical digitization.

  5. Using Quick Response Codes in the Classroom: Quality Outcomes.

    PubMed

    Zurmehly, Joyce; Adams, Kellie

    2017-10-01

    With smart device technology emerging, educators are challenged with redesigning teaching strategies using technology to allow students to participate dynamically and provide immediate answers. To facilitate integration of technology and to actively engage students, quick response codes were included in a medical surgical lecture. Quick response codes are two-dimensional square patterns that enable the coding or storage of more than 7000 characters that can be accessed via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understand, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response use to examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, on open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.

  6. SINE_scan: an efficient tool to discover short interspersed nuclear elements (SINEs) in large-scale genomic datasets.

    PubMed

    Mao, Hongliang; Wang, Hao

    2017-03-01

    Short Interspersed Nuclear Elements (SINEs) are transposable elements (TEs) that amplify through a copy-and-paste mode via RNA intermediates. The computational identification of new SINEs is challenging because of their weak structural signals and rapid diversification in sequences. Here we report SINE_Scan, a highly efficient program to predict SINE elements in genomic DNA sequences. SINE_Scan integrates hallmarks of SINE transposition, copy number and structural signals to identify a SINE element. SINE_Scan outperforms the previously published de novo SINE discovery program. It shows high sensitivity and specificity in 19 plant and animal genome assemblies, whose sizes vary from 120 Mb to 3.5 Gb. It identifies numerous new families and substantially increases the estimation of the abundance of SINEs in these genomes. The code of SINE_Scan is freely available at http://github.com/maohlzj/SINE_Scan, implemented in PERL and supported on Linux. Contact: wangh8@fudan.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  7. SINE_scan: an efficient tool to discover short interspersed nuclear elements (SINEs) in large-scale genomic datasets

    PubMed Central

    Mao, Hongliang

    2017-01-01

    Abstract Motivation: Short Interspersed Nuclear Elements (SINEs) are transposable elements (TEs) that amplify through a copy-and-paste mode via RNA intermediates. The computational identification of new SINEs is challenging because of their weak structural signals and rapid diversification in sequences. Results: Here we report SINE_Scan, a highly efficient program to predict SINE elements in genomic DNA sequences. SINE_Scan integrates hallmarks of SINE transposition, copy number and structural signals to identify a SINE element. SINE_Scan outperforms the previously published de novo SINE discovery program. It shows high sensitivity and specificity in 19 plant and animal genome assemblies, whose sizes vary from 120 Mb to 3.5 Gb. It identifies numerous new families and substantially increases the estimation of the abundance of SINEs in these genomes. Availability and Implementation: The code of SINE_Scan is freely available at http://github.com/maohlzj/SINE_Scan, implemented in PERL and supported on Linux. Contact: wangh8@fudan.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28062442

  8. Partitioning of genetic variation between regulatory and coding gene segments: the predominance of software variation in genes encoding introvert proteins.

    PubMed

    Mitchison, A

    1997-01-01

    In considering genetic variation in eukaryotes, a fundamental distinction can be made between variation in regulatory (software) and coding (hardware) gene segments. For quantitative traits the bulk of variation, particularly that near the population mean, appears to reside in regulatory segments. The main exceptions to this rule concern proteins which handle extrinsic substances, here termed extrovert proteins. The immune system includes an unusually large proportion of this exceptional category, but even so its chief source of variation may well be polymorphism in regulatory gene segments. The main evidence for this view emerges from genome scanning for quantitative trait loci (QTL), which in the case of the immune system points to a major contribution of pro-inflammatory cytokine genes. Further support comes from sequencing of major histocompatibility complex (Mhc) class II promoters, where a high level of polymorphism has been detected. These Mhc promoters appear to act, in part at least, by gating the back-signal from T cells into antigen-presenting cells. Both these forms of polymorphism are likely to be sustained by the need for flexibility in the immune response. Future work on promoter polymorphism is likely to benefit from the input from genome informatics.

  9. Modeling human faces with multi-image photogrammetry

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola

    2002-03-01

    Modeling and measurement of the human face have been increasing in importance for various purposes. Laser scanning, coded light range digitizers, image-based approaches and digital stereo photogrammetry are the methods currently employed in medical applications, computer animation, video surveillance, teleconferencing and virtual reality to produce three dimensional computer models of the human face. The requirements differ depending on the application. Ours are primarily high measurement accuracy and automation of the process. The method presented in this paper is based on multi-image photogrammetry. The equipment, the method, and the results achieved with this technique are described here. The process is composed of five steps: acquisition of multi-images, calibration of the system, establishment of corresponding points in the images, computation of their 3-D coordinates and generation of a surface model. The images captured by five CCD cameras arranged in front of the subject are digitized by a frame grabber. The complete system is calibrated using a reference object with coded target points, which can be measured fully automatically. To facilitate the establishment of correspondences in the images, texture in the form of random patterns can be projected from two directions onto the face. The multi-image matching process, based on a geometrically constrained least squares matching algorithm, produces a dense set of corresponding points in the five images. Neighborhood filters are then applied to the matching results to remove the errors. After filtering the data, the three-dimensional coordinates of the matched points are computed by forward intersection using the results of the calibration process; the achieved mean accuracy is about 0.2 mm in the sagittal direction and about 0.1 mm in the lateral direction. The last step of data processing is the generation of a surface model from the point cloud and the application of smoothing filters. Moreover, a color texture image can be draped over the model to achieve a photorealistic visualization. The advantage of the presented method over laser scanning and coded light range digitizers is the acquisition of the source data in a fraction of a second, allowing the measurement of human faces with higher accuracy and the possibility to measure dynamic events like the speech of a person.
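
    As a minimal sketch of the forward-intersection step, the snippet below triangulates one matched point from several calibrated cameras by linear least squares (a DLT-style formulation); the projection matrices and image coordinates are placeholders, and the paper's constrained least squares matching is not reproduced.

      import numpy as np

      def forward_intersection(proj_mats, image_pts):
          """Triangulate one 3D point from two or more views by linear least squares.
          proj_mats: 3 x 4 camera projection matrices from the calibration step.
          image_pts: matching (x, y) image coordinates of the point in each view."""
          rows = []
          for P, (x, y) in zip(proj_mats, image_pts):
              rows.append(x * P[2] - P[0])
              rows.append(y * P[2] - P[1])
          _, _, vt = np.linalg.svd(np.vstack(rows))
          X = vt[-1]                                   # homogeneous solution
          return X[:3] / X[3]

      # Placeholder example: two ideal cameras observing a point at (0.1, 0.2, 1.5).
      P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
      P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
      X_true = np.array([0.1, 0.2, 1.5, 1.0])
      pts = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P1, P2)]
      print(forward_intersection([P1, P2], pts))       # ~ [0.1, 0.2, 1.5]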

  10. The use of porcine corrosion casts for teaching human anatomy.

    PubMed

    Eberlova, Lada; Liska, Vaclav; Mirka, Hynek; Tonar, Zbynek; Haviar, Stanislav; Svoboda, Milos; Benes, Jan; Palek, Richard; Emingr, Michal; Rosendorf, Jachym; Mik, Patrik; Leupen, Sarah; Lametschwandtner, Alois

    2017-09-01

    In teaching and learning human anatomy, anatomical autopsy and prosected specimens have always been indispensable. However, alternative methods must often be used to demonstrate particularly delicate structures. Corrosion casting of porcine organs with Biodur E20 ® Plus is valuable for teaching and learning both gross anatomy and, uniquely, the micromorphology of cardiovascular, respiratory, digestive, and urogenital systems. Assessments of casts with a stereomicroscope and/or scanning electron microscope as well as highlighting cast structures using color coding help students to better understand how the structures that they have observed as two-dimensional images actually exist in three dimensions, and students found using the casts to be highly effective in their learning. Reconstructions of cast hollow structures from (micro-)computed tomography scans and videos facilitate detailed analyses of branching patterns and spatial arrangements in cast structures, aid in the understanding of clinically relevant structures and provide innovative visual aids. The casting protocol and teaching manual we offer can be adjusted to different technical capabilities and might also be found useful for veterinary or other biological science classes. Copyright © 2017 Elsevier GmbH. All rights reserved.

  11. Mapping and correction of the CMM workspace error with the use of an electronic gyroscope and neural networks--practical application.

    PubMed

    Swornowski, Pawel J

    2013-01-01

    The article presents the application of neural networks in determining and correcting the deformation of a coordinate measuring machine (CMM) workspace. The information about the CMM errors is acquired using an ADXRS401 electronic gyroscope. A test device (PS-20 module) was built and integrated with a commercial measurement system based on the SP25M passive scanning probe and with a PH10M module (Renishaw). The proposed solution was tested on a Kemco 600 CMM and on a DEA Global Clima CMM. In the former case, correction of the CMM errors was performed using the source code of WinIOS software owned by The Institute of Advanced Manufacturing Technology, Cracow, Poland, and in the latter on an external PC. Optimum parameters of full and simplified mapping of a given layer of the CMM workspace were determined for practical applications. The proposed method can be employed for the interim check (ISO 10360-2 procedure) or to detect local CMM deformations occurring when the CMM works at high scanning speeds (>20 mm/s). © Wiley Periodicals, Inc.

  12. TH-AB-209-12: Tissue Equivalent Phantom with Excised Human Tissue for Assessing Clinical Capabilities of Coherent Scatter Imaging Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albanese, K; Morris, R; Spencer, J

    Purpose: Previously we reported the development of anthropomorphic tissue-equivalent scatter phantoms of the human breast. Here we present the first results from the scatter imaging of the tissue equivalent breast phantoms for breast cancer diagnosis. Methods: A breast phantom was designed to assess the capability of coded aperture coherent x-ray scatter imaging to classify different types of breast tissue (adipose, fibroglandular, tumor). The phantom geometry was obtained from a prone breast geometry scanned on a dedicated breast CT system. The phantom was 3D printed using the segmented DICOM breast CT data. The 3D breast phantom was filled with lard (as a surrogate for adipose tissue) and scanned in different geometries alongside excised human breast tissues (obtained from lumpectomy and mastectomy procedures). The raw data were reconstructed using a model-based reconstruction algorithm and yielded the location and form factor (i.e., momentum transfer (q) spectrum) of the materials that were imaged. The measured material form factors were then compared to the ground truth measurements acquired by x-ray diffraction (XRD) imaging. Results: Our scatter imaging system was able to define the location and composition of the various materials and tissues within the phantom. Cancerous breast tissue was detected and classified through automated spectral matching and an 86% correlation threshold. The total scan time for the sample was approximately 10 minutes and approaches workflow times for clinical use in intra-operative or other diagnostic tasks. Conclusion: This work demonstrates the first results from an anthropomorphic tissue equivalent scatter phantom to characterize a coherent scatter imaging system. The functionality of the system shows promise in applications such as intra-operative margin detection or virtual biopsy in the diagnosis of breast cancer. Future work includes using additional patient-derived tissues (e.g., human fat), and modeling additional organs (e.g., lung).

  13. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning

    NASA Astrophysics Data System (ADS)

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: the Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects: 15 normal subjects, 15 AMD patients, and 15 DME patients; and a clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing: 168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively. For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.
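
    A schematic of the sparse-coding-plus-linear-SVM pattern the method builds on is sketched below using scikit-learn on random placeholder data; the retina alignment, descriptor extraction, spatial pyramid, and tuned parameters of the published pipeline are not reproduced here.

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)
      # Placeholder "descriptors": rows stand in for features from OCT image patches.
      X_train = rng.normal(size=(600, 64))
      y_train = rng.integers(0, 3, size=600)    # 0 = normal, 1 = AMD, 2 = DME (toy labels)

      # Learn a dictionary and encode each sample as sparse coefficients.
      dico = MiniBatchDictionaryLearning(n_components=128, alpha=1.0, random_state=0)
      codes_train = dico.fit(X_train).transform(X_train)

      # Multiclass linear SVM on the sparse codes (no spatial pyramid in this toy).
      clf = LinearSVC(dual=False).fit(codes_train, y_train)
      X_test = rng.normal(size=(10, 64))
      print(clf.predict(dico.transform(X_test)))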

  14. An efficient coding algorithm for the compression of ECG signals using the wavelet transform.

    PubMed

    Rajoub, Bashar A

    2002-04-01

    A wavelet-based electrocardiogram (ECG) data compression algorithm is proposed in this paper. The ECG signal is first preprocessed, the discrete wavelet transform (DWT) is then applied to the preprocessed signal. Preprocessing guarantees that the magnitudes of the wavelet coefficients be less than one, and reduces the reconstruction errors near both ends of the compressed signal. The DWT coefficients are divided into three groups, each group is thresholded using a threshold based on a desired energy packing efficiency. A binary significance map is then generated by scanning the wavelet decomposition coefficients and outputting a binary one if the scanned coefficient is significant, and a binary zero if it is insignificant. Compression is achieved by 1) using a variable length code based on run length encoding to compress the significance map and 2) using direct binary representation to represent the significant coefficients. The ability of the coding algorithm to compress ECG signals is investigated; the results were obtained by compressing and decompressing the test signals. The proposed algorithm is compared with direct-based and wavelet-based compression algorithms and showed superior performance. A compression ratio of 24:1 was achieved for MIT-BIH record 117 with a percent root mean square difference as low as 1.08%.
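
    A condensed sketch of the thresholding, significance map, and run-length steps is given below using PyWavelets; it collapses the three coefficient groups into a single global energy-packing-efficiency threshold and omits the variable-length/Huffman stage, so it only illustrates the structure of the algorithm.

      import numpy as np
      import pywt
      from itertools import groupby

      def compress_ecg_sketch(signal, wavelet="bior4.4", level=5, epe=0.999):
          """Threshold DWT coefficients to a target energy packing efficiency,
          build a binary significance map, and run-length encode it."""
          coeffs = pywt.wavedec(np.asarray(signal, dtype=float), wavelet, level=level)
          flat = np.concatenate(coeffs)
          energy = np.sort(flat ** 2)[::-1]
          cum = np.cumsum(energy) / energy.sum()
          n_keep = int(np.searchsorted(cum, epe)) + 1
          threshold = np.sqrt(energy[n_keep - 1])
          sig_map = (np.abs(flat) >= threshold).astype(np.uint8)
          significant = flat[sig_map == 1]              # coefficients kept for direct coding
          rle = [(int(bit), sum(1 for _ in grp)) for bit, grp in groupby(sig_map)]
          return rle, significant, [len(c) for c in coeffs]   # layout needed to rebuild

      rle, kept, layout = compress_ecg_sketch(np.sin(np.linspace(0, 20 * np.pi, 4096)))
      print(len(kept), "significant of", sum(n for _, n in rle), "coefficients")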

  15. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning.

    PubMed

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects—15 normal subjects, 15 AMD patients, and 15 DME patients; and clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing—168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively. For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.

  16. Embed dynamic content in your poster.

    PubMed

    Hutchins, B Ian

    2013-01-29

    A new technology has emerged that will facilitate the presentation of dynamic or otherwise inaccessible data on posters at scientific meetings. Video, audio, or other digital files hosted on mobile-friendly sites can be linked to through a quick response (QR) code, a two-dimensional barcode that can be scanned by smartphones, which then display the content. This approach is more affordable than acquiring tablet computers for playing dynamic content and can reach many users at large conferences. This resource details how to host videos, generate QR codes, and view the associated files on mobile devices.
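
    As a concrete illustration of the workflow, a hosted link can be turned into a printable QR code in a few lines, assuming the third-party qrcode package (with Pillow) is installed; the URL and file name below are placeholders:

        import qrcode

        poster_link = "https://example.org/my-dynamic-figure"   # hypothetical hosting URL for the video
        img = qrcode.make(poster_link)   # build a QR code image that encodes the link
        img.save("poster_qr.png")        # print this image on the poster for attendees to scan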

  17. Automatic Registration of Scanned Satellite Imagery with a Digital Map Data Base.

    DTIC Science & Technology

    1980-11-01


  18. Techniques used for the analysis of oculometer eye-scanning data obtained from an air traffic control display

    NASA Technical Reports Server (NTRS)

    Crawford, Daniel J.; Burdette, Daniel W.; Capron, William R.

    1993-01-01

    The methodology and techniques used to collect and analyze look-point position data from a real-time ATC display-format comparison experiment are documented. That study compared the delivery precision and controller workload of three final approach spacing aid display formats. Using an oculometer, controller lookpoint position data were collected, associated with gaze objects (e.g., moving aircraft) on the ATC display, and analyzed to determine eye-scan behavior. The equipment involved and algorithms for saving, synchronizing with the ATC simulation output, and filtering the data are described. Target (gaze object) and cross-check scanning identification algorithms are also presented. Data tables are provided of total dwell times, average dwell times, and cross-check scans. Flow charts, block diagrams, file record descriptors, and source code are included. The techniques and data presented are intended to benefit researchers in other studies that incorporate non-stationary gaze objects and oculometer equipment.
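
    A toy sketch of the dwell-time bookkeeping such an analysis involves: time-stamped look-point samples that have already been associated with gaze objects are accumulated into total and average dwell times per object. The sample data, sampling interval, and object names are illustrative assumptions, not the study's data formats:

        from collections import defaultdict

        # (time in seconds, gaze object) pairs from an oculometer; None marks samples on no object.
        samples = [(0.00, "AC101"), (0.02, "AC101"), (0.04, None),
                   (0.06, "AC102"), (0.08, "AC102"), (0.10, "AC101")]

        totals = defaultdict(float)
        dwells = defaultdict(int)
        prev_obj, dwell_start = None, None
        for t, obj in samples + [(samples[-1][0] + 0.02, None)]:   # sentinel closes the last dwell
            if obj != prev_obj:
                if prev_obj is not None:
                    totals[prev_obj] += t - dwell_start
                    dwells[prev_obj] += 1
                prev_obj, dwell_start = obj, t

        for obj in totals:
            print(obj, "total", round(totals[obj], 2), "s, average",
                  round(totals[obj] / dwells[obj], 3), "s")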

  19. Effectiveness and usability of Scanning Wizard software: a tool for enhancing switch scanning.

    PubMed

    Koester, Heidi Horstmann; Simpson, Richard C

    2017-11-24

    Scanning Wizard software helps scanning users improve the setup of their switch and scanning system. This study evaluated Scanning Wizard's effectiveness and usability. Ten people who use switch scanning and ten practitioners used Scanning Wizard in the initial session. Usability was high, based on survey responses averaging over 4.5 out of 5, and qualitative feedback was very positive. Five switch users were able to complete the multi-week protocol, using settings on their own scanning system that were recommended from the Scanning Wizard session. Using these revised settings, text entry rates improved by an average of 71%, ranging from 29% to 172% improvement. Results suggest that Scanning Wizard is a useful tool for improving the configuration of scanning systems for people who use switch scanning to communicate. Implications for Rehabilitation Some individuals with severe physical impairments use switch scanning for spoken and written communication. Scanning Wizard software helps scanning users improve the setup of their switch and scanning system. This study demonstrated high usability of Scanning Wizard (with 10 switch user-practitioner teams) and increased text entry rate by an average of 71% (for five switch users). Results suggest that Scanning Wizard is a useful tool for improving the configuration of scanning systems for people who use switch scanning to communicate.

  20. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.

  1. Identifying subassemblies by ultrasound to prevent fuel handling error in sodium fast reactors: First test performed in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paumel, Kevin; Lhuillier, Christian

    2015-07-01

    Identifying subassemblies by ultrasound is a method that is being considered to prevent handling errors in sodium fast reactors. It is based on the reading of a code (aligned notches) engraved on the subassembly head by an emitting/receiving ultrasonic sensor. This reading is carried out in sodium with high temperature transducers. The resulting one-dimensional C-scan can be likened to a binary code expressing the subassembly type and number. The first test performed in water investigated two parameters: width and depth of the notches. The code remained legible for notches as thin as 1.6 mm wide. The impact of the depth seems minor in the range under investigation. (authors)
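
    A toy decoder for the scheme described above: the one-dimensional C-scan amplitude profile is sampled at the known notch positions and thresholded into a binary code that identifies the subassembly. The amplitude values, threshold, and code length are purely illustrative assumptions:

        import numpy as np

        c_scan = np.array([0.9, 0.9, 0.2, 0.9, 0.2, 0.2, 0.9, 0.9])  # echo amplitude per notch position
        notch_threshold = 0.5                       # an engraved notch reduces the reflected amplitude
        bits = (c_scan < notch_threshold).astype(int)   # 1 where a notch is present
        subassembly_id = int("".join(map(str, bits)), 2)
        print("code:", bits, "-> id", subassembly_id)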

  2. Modeling the Deterioration of Engine and Low Pressure Compressor Performance During a Roll Back Event Due to Ice Accretion

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Jorgenson, Philip C. E.; Jones, Scott M.

    2014-01-01

    The main focus of this study is to apply a computational tool for the flow analysis of the engine that has been tested with ice crystal ingestion in the Propulsion Systems Laboratory (PSL) of NASA Glenn Research Center. A data point was selected for analysis during which the engine experienced a full roll back event due to the ice accretion on the blades and flow path of the low pressure compressor. The computational tool consists of the Numerical Propulsion System Simulation (NPSS) engine system thermodynamic cycle code, and an Euler-based compressor flow analysis code, that has an ice particle melt estimation code with the capability of determining the rate of sublimation, melting, and evaporation through the compressor blade rows. Decreasing the performance characteristics of the low pressure compressor (LPC) within the NPSS cycle analysis resulted in matching the overall engine performance parameters measured during testing at data points in short time intervals through the progression of the roll back event. Detailed analysis of the fan-core and LPC with the compressor flow analysis code simulated the effects of ice accretion by increasing the aerodynamic blockage and pressure losses through the low pressure compressor until achieving a match with the NPSS cycle analysis results, at each scan. With the additional blockages and losses in the LPC, the compressor flow analysis code results were able to numerically reproduce the performance that was determined by the NPSS cycle analysis, which was in agreement with the PSL engine test data. The compressor flow analysis indicated that the blockage due to ice accretion in the LPC exit guide vane stators caused the exit guide vane (EGV) to be nearly choked, significantly reducing the air flow rate into the core. This caused the LPC to eventually be in stall due to increasing levels of diffusion in the rotors and high incidence angles in the inlet guide vane (IGV) and EGV stators. The flow analysis indicating compressor stall is substantiated by the video images of the IGV taken during the PSL test, which showed water on the surface of the IGV flowing upstream out of the engine, indicating flow reversal, which is characteristic of a stalled compressor.

  3. Obstacles to Industrial Implementation of Scanning Systems

    Treesearch

    Anders Astrom; Olog Broman; John Graffman; Anders Gronlund; Armas Jappinene; Jari Luostarinen; Jan Nystrom; Daniel L. Schmoldt

    1998-01-01

    Initially the group discussed what is meant by scanning systems. An operational definition was adopted: in the current context, scanning systems are taken to mean nontraditional scanning, whereas traditional scanning is defined as scanning that has been industrially operational and relatively common for several years (a mature technology). For example,...

  4. High energy x-ray phase contrast CT using glancing-angle grating interferometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarapata, A., E-mail: adrian.sarapata@tum.de; Stayman, J. W.; Siewerdsen, J. H.

    Purpose: The authors present initial progress toward a clinically compatible x-ray phase contrast CT system, using glancing-angle x-ray grating interferometry to provide high contrast soft tissue images at dose levels (estimated by computer simulation) comparable to conventional absorption-based CT. Methods: DPC-CT scans of a joint phantom and of soft tissues were performed in order to answer several important questions from a clinical setup point of view. A comparison between high and low fringe visibility systems is presented. The standard phase stepping method was compared with sliding window interlaced scanning. Using estimated dose values obtained with a Monte-Carlo code, the authors studied the dependence of the phase image contrast on exposure time and dose. Results: Using a glancing angle interferometer at high x-ray energy (∼45 keV mean value) in combination with a conventional x-ray tube, the authors achieved fringe visibility values of nearly 50%, which had not been reported before. High fringe visibility is shown to be an indispensable parameter for a potential clinical scanner. Sliding window interlaced scanning proved to have higher SNRs and CNRs in a region of interest and to also be a crucial part of a low dose CT system. DPC-CT images of a soft tissue phantom at exposures in the range typical for absorption-based CT of musculoskeletal extremities were obtained. Assuming a human knee as the CT target, good soft tissue phase contrast could be obtained at an estimated absorbed dose level around 8 mGy, similar to conventional CT. Conclusions: DPC-CT with glancing-angle interferometers provides improved soft tissue contrast over absorption CT even at clinically compatible dose levels (estimated by a Monte-Carlo computer simulation). Further steps in image processing, data reconstruction, and spectral matching could make the technique fully clinically compatible. Nevertheless, due to its increased scan time and complexity, the technique should be thought of not as replacing, but as complementary to, conventional CT, to be used in specific applications.

  5. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  6. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.
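
    A toy forward model of the coded-aperture temporal compression idea described above: T video frames are modulated by T binary mask patterns and summed into a single coded snapshot, so the temporal dimension is folded into one spatial measurement. The volume size and the random (rather than physically translated) masks are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        T, H, W = 8, 64, 64
        video = rng.random((T, H, W))                # the (x, y, t) volume to be compressed
        masks = rng.integers(0, 2, size=(T, H, W))   # per-frame coded-aperture patterns

        snapshot = np.sum(masks * video, axis=0)     # one 2-D measurement encodes T frames
        print(snapshot.shape)                        # (64, 64): temporal info folded into space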

  7. Error-correction coding for digital communications

    NASA Astrophysics Data System (ADS)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  8. BioShuttle-mediated Plasmid Transfer

    PubMed Central

    Braun, Klaus; von Brasch, Leonie; Pipkorn, Ruediger; Ehemann, Volker; Jenne, Juergen; Spring, Herbert; Debus, Juergen; Didinger, Bernd; Rittgen, Werner; Waldeck, Waldemar

    2007-01-01

    An efficient gene transfer into target tissues and cells is needed for safe and effective treatment of genetic diseases like cancer. In this paper, we describe the development of a transport system and show its ability for transporting plasmids. This non-viral peptide-based BioShuttle-mediated transfer system consists of a nuclear localization address sequence realizing the delivery of the plasmid phNIS-IRES-EGFP coding for two independent reporter genes into nuclei of HeLa cells. The quantification of the transfer efficiency was achieved by measurements of the sodium iodide symporter activity. EGFP gene expression was measured with Confocal Laser Scanning Microscopy and quantified with biostatistical methods by analysis of the frequency of the amplitude distribution in the CLSM images. The results demonstrate that the “BioShuttle”-Technology is an appropriate tool for an effective transfer of genetic material carried by a plasmid. PMID:18026568

  9. Development and Application of Syndromic Surveillance for Severe Weather Events Following Hurricane Sandy.

    PubMed

    Tsai, Stella; Hamby, Teresa; Chu, Alvin; Gleason, Jessie A; Goodrow, Gabrielle M; Gu, Hui; Lifshitz, Edward; Fagliano, Jerald A

    2016-06-01

    Following Hurricane Superstorm Sandy, the New Jersey Department of Health (NJDOH) developed indicators to enhance syndromic surveillance for extreme weather events in EpiCenter, an online system that collects and analyzes real-time chief complaint emergency department (ED) data and classifies each visit by indicator or syndrome. These severe weather indicators were finalized by using 2 steps: (1) key word inclusion by review of chief complaints from cases where diagnostic codes met selection criteria and (2) key word exclusion by evaluating cases with key words of interest that lacked selected diagnostic codes. Graphs compared 1-month, 3-month, and 1-year periods of 8 Hurricane Sandy-related severe weather event indicators against the same period in the following year. Spikes in overall ED visits were observed immediately after the hurricane for carbon monoxide (CO) poisoning, the 3 disrupted outpatient medical care indicators, asthma, and methadone-related substance use. Zip code level scan statistics indicated clusters of CO poisoning and increased medicine refill needs during the 2 weeks after Hurricane Sandy. CO poisoning clusters were identified in areas with power outages of 4 days or longer. This endeavor gave the NJDOH a clearer picture of the effects of Hurricane Sandy and yielded valuable state preparation information to monitor the effects of future severe weather events. (Disaster Med Public Health Preparedness. 2016;10:463-471).

  10. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
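
    A minimal sketch of a byte-oriented, unit-memory, rate-1/2 convolutional encoder of the kind discussed above: each 8-bit input symbol m_i produces 16 code bits c_i = G0*m_i + G1*m_(i-1) over GF(2), so the encoder remembers exactly one previous byte. The generator matrices below are random placeholders rather than a designed code, and neither the Reed-Solomon outer code nor a decoder is shown:

        import numpy as np

        rng = np.random.default_rng(0)
        G0 = rng.integers(0, 2, size=(16, 8), dtype=np.uint8)   # hypothetical generator matrices
        G1 = rng.integers(0, 2, size=(16, 8), dtype=np.uint8)

        def bits(byte):
            """Unpack one byte into an 8-element 0/1 vector."""
            return np.array([(byte >> k) & 1 for k in range(8)], dtype=np.uint8)

        def encode(message: bytes) -> np.ndarray:
            """Encode a byte string; returns a 0/1 array with 16 code bits per input byte."""
            prev = np.zeros(8, dtype=np.uint8)          # encoder state: the previous byte
            out = []
            for b in message:
                m = bits(b)
                out.append((G0 @ m + G1 @ prev) % 2)    # GF(2) linear combination of current and previous symbol
                prev = m
            return np.concatenate(out)

        codeword = encode(b"SSC")
        print(codeword.shape)    # (48,): three input bytes -> 48 code bits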

  11. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L. N.

    1976-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte oriented unit memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  12. Evaluation of tactual displays for flight control

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Tanner, R. B.; Triggs, T. J.

    1973-01-01

    Manual tracking experiments were conducted to determine the suitability of tactual displays for presenting flight-control information in multitask situations. Although tracking error scores are considerably greater than scores obtained with a continuous visual display, preliminary results indicate that inter-task interference effects are substantially less with the tactual display in situations that impose high visual scanning workloads. The single-task performance degradation found with the tactual display appears to be a result of the coding scheme rather than the use of the tactual sensory mode per se. Analysis with the state-variable pilot/vehicle model shows that reliable predictions of tracking errors can be obtained for wide-band tracking systems once the pilot-related model parameters have been adjusted to reflect the pilot-display interaction.

  13. Simulation of the injection damping and resonance correction systems for the HEB of the SSC

    NASA Astrophysics Data System (ADS)

    Li, M.; Zhang, P.; Machida, S.

    1993-12-01

    An injection damping and resonance correction system for the High Energy Booster (HEB) of the Superconducting Super Collider (SSC) was investigated by means of multiparticle tracking. For the injection damping study, the code Simpsons is modified to utilize two beam position monitors (BPMs) and two dampers. At least 1024 particles at 200 GeV/c, with a Gaussian distribution in 6-D phase space, are injected into the HEB with certain injection offsets. The whole bunch of particles is then kicked in proportion to the BPM signals, with some upper limit. Tracking these particles up to several hundred turns while the damping system is acting shows the turn-by-turn emittance growth, which is caused by the tune spread due to nonlinearity of the lattice and residual chromaticity with synchrotron oscillations. For the resonance correction study, the operating tune is scanned as a function of time so that the bunch goes through a resonance. The performance of the resonance correction system is demonstrated. We optimize the system parameters to satisfy the emittance budget of the HEB, taking into account realistic hardware requirements.
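
    A toy model of the centroid damping loop described above: each turn the bunch centroid advances by the betatron phase, a BPM reading proportional to the centroid angle at the kicker (i.e., a pickup a quarter betatron oscillation upstream) drives a corrective kick, and the kick is clipped at an upper limit as in the study. All numerical values are illustrative assumptions, not HEB parameters:

        import numpy as np

        tune = 0.28                       # fractional betatron tune (arbitrary choice)
        phase = 2 * np.pi * tune
        gain, max_kick = 0.3, 0.05        # damper gain and kick saturation (arbitrary units)

        x, xp = 1.0, 0.0                  # injected bunch centroid offset in normalized coordinates
        amplitude = []
        for turn in range(300):
            # One-turn linear map: a simple rotation in normalized phase space.
            x, xp = (np.cos(phase) * x + np.sin(phase) * xp,
                     -np.sin(phase) * x + np.cos(phase) * xp)
            # Damper kick proportional to the BPM signal, clipped at the hardware limit.
            xp += np.clip(-gain * xp, -max_kick, max_kick)
            amplitude.append(np.hypot(x, xp))

        print("amplitude after turn 1:", round(amplitude[0], 3),
              " after turn 300:", round(amplitude[-1], 5))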

  14. Interpretive Reporting of Protein Electrophoresis Data by Microcomputer

    PubMed Central

    Talamo, Thomas S.; Losos, Frank J.; Kessler, G. Frederick

    1982-01-01

    A microcomputer-based system for interpretive reporting of protein electrophoretic data has been developed. Data for serum, urine and cerebrospinal fluid protein electrophoreses as well as immunoelectrophoresis can be entered. Patient demographic information is entered through the keyboard followed by manual entry of total and fractionated protein levels obtained after densitometer scanning of the electrophoretic strip. The patterns are then coded, interpreted, and final reports generated. In most cases interpretation time is less than one second. Misinterpretation by computer is uncommon and can be corrected by edit functions within the system. These discrepancies between computer and pathologist interpretation are automatically stored in a data file for later review and possible program modification. Any or all previous tests on a patient may be reviewed with graphic display of the electrophoretic pattern. The system has been in use for several months and is presently well accepted by both laboratory and clinical staff. It also allows rapid storage, retrieval and analysis of protein electrophoretic data.

  15. Electronic patient registration and tracking at mass vaccination clinics: a clinical study.

    PubMed

    Billittier, Anthony J; Lupiani, Patrick; Masterson, Gary; Masterson, Tim; Zak, Christopher

    2003-01-01

    To protect the citizens of the United States from the use of dangerous biological agents, the Centers for Disease Control and Prevention (CDC) has been actively preparing to deal with the consequences of such an attack. Their plans include the deployment of mass immunization clinics to handle postevent vaccinations. As part of the planning efforts by the Western New York Public Health Alliance, a Web-based electronic patient registration and tracking system was developed and tested at a recent trial smallpox vaccination clinic. Initial goals were to determine what the pitfalls and benefits of using such a system might be in comparison to other methods of data collection. This exercise demonstrated that use of an electronic system capable of scanning two-dimensional bar codes was superior to both paper-based and optical character recognition (OCR) methods of data collection and management. Major improvements in speed and/or accuracy were evident in all areas of the clinic, especially in patient registration, vaccine tracking and postclinic data analysis.

  16. Development of a web-based CT dose calculator: WAZA-ARI.

    PubMed

    Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M

    2011-09-01

    A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on the modern techniques for the radiation transport simulation and for software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System. In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for GE LightSpeed 16 was compared with the dose values calculated similarly using MIRD and ICRP Adult Male phantoms. There are some differences due to the phantom configuration, demonstrating the significance of the dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.

  17. Technology-Enhanced Formative Assessment of Plant Identification

    NASA Astrophysics Data System (ADS)

    Conejo, Ricardo; Garcia-Viñas, Juan Ignacio; Gastón, Aitor; Barros, Beatriz

    2016-04-01

    Developing plant identification skills is an important part of the curriculum of any botany course in higher education. Frequent practice with dried and fresh plants is necessary to recognize the diversity of forms, states, and details that a species can present. We have developed a web-based assessment system for mobile devices that is able to pose appropriate questions according to the location of the student. A student's location can be obtained using the device position or by scanning a QR code attached to a dried plant sheet in a herbarium or to a fresh plant in an arboretum. The assessment questions are complemented with elaborated feedback that, according to the students' responses, provides indications of possible mistakes and correct answers. Three experiments were designed to measure the effectiveness of the formative assessment using dried and fresh plants. Three questionnaires were used to evaluate the system performance from the students' perspective. The results clearly indicate that formative assessment is objectively effective compared to traditional methods and that the students' attitudes towards the system were very positive.

  18. Computer Utilization by Schools: An Example.

    ERIC Educational Resources Information Center

    Tondow, Murray

    1968-01-01

    The Educational Data Services Department of the Palo Alto Unified School District is responsible for implementing data processing needs to improve the quality of education in Palo Alto, California. Information from the schools enters the Department data library to be scanned, coded, and corrected prior to IBM 1620 computer input. Operating 17…

  19. Scanning sky monitor (SSM) onboard AstroSat

    NASA Astrophysics Data System (ADS)

    Ramadevi, M. C.; Seetha, S.; Bhattacharya, Dipankar; Ravishankar, B. T.; Sitaramamurthy, N.; Meena, G.; Sharma, M. Ramakrishna; Kulkarni, Ravi; Babu, V. Chandra; Kumar; Singh, Brajpal; Jain, Anand; Yadav, Reena; Vaishali, S.; Ashoka, B. N.; Agarwal, Anil; Balaji, K.; Nagesh, G.; Kumar, Manoj; Gaan, Dhruti Ranjan; Kulshresta, Prashanth; Agarwal, Pankaj; Sebastian, Mathew; Rajarajan, A.; Radhika, D.; Nandi, Anuj; Girish, V.; Agarwal, Vivek Kumar; Kushwaha, Ankur; Iyer, Nirmal Kumar

    2017-10-01

    The Scanning Sky Monitor (SSM) onboard AstroSat is an X-ray sky monitor in the soft X-ray band designed with a large field of view to detect and locate transient X-ray sources and alert the astronomical community about interesting phenomena in the X-ray sky. SSM comprises position-sensitive proportional counters with a 1D coded mask for imaging. There are three detector units mounted on a platform capable of rotation, which helps cover about 50% of the sky in one full rotation. This paper discusses the instrument in detail and presents a few immediate results obtained after launch.
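
    A toy illustration of one-dimensional coded-mask imaging of the kind SSM relies on: the detector records a shifted shadow of the mask cast by a source, and cross-correlating the counts with a balanced decoding pattern recovers the source position. The random mask, source strength, and background level are illustrative assumptions (the flight mask is a designed pattern):

        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 255, 90                                  # mask elements, true source position
        mask = rng.integers(0, 2, size=n)               # open (1) / closed (0) mask elements

        # Detector counts: the mask shadow shifted by the source position, plus background and noise.
        counts = rng.poisson(1000.0 * np.roll(mask, -p) + 5.0)

        decoding = 2 * mask - 1                         # balanced decoding pattern
        decoded = np.array([np.sum(counts * np.roll(decoding, -s)) for s in range(n)])
        print("true position", p, "-> reconstructed", int(np.argmax(decoded)))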

  20. PWMScan: a fast tool for scanning entire genomes with a position-specific weight matrix.

    PubMed

    Ambrosini, Giovanna; Groux, Romain; Bucher, Philipp

    2018-03-05

    Transcription factors (TFs) regulate gene expression by binding to specific short DNA sequences of 5-20 bp, thereby controlling the rate of transcription of genetic information from DNA to messenger RNA. We present PWMScan, a fast web-based tool to scan server-resident genomes for matches to a user-supplied position weight matrix (PWM) or TF binding site model from a public database. The web server and source code are available at http://ccg.vital-it.ch/pwmscan and https://sourceforge.net/projects/pwmscan, respectively. Contact: giovanna.ambrosini@epfl.ch. Supplementary data are available at Bioinformatics online.
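
    A minimal sketch of the kind of PWM scanning PWMScan performs, assuming a log-odds position weight matrix is already in hand; the toy matrix, sequence, and cutoff below are made-up values, and the real tool scans whole genomes with far more efficient matching:

        import numpy as np

        BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
        # Toy 4x4 log-odds PWM (rows A, C, G, T; columns are motif positions).
        pwm = np.array([[ 1.2, -0.5, -1.0,  0.8],
                        [-0.7,  1.0, -0.9, -0.6],
                        [-0.8, -0.4,  1.3, -0.5],
                        [-0.9, -0.6, -1.1,  0.7]])
        cutoff = 2.5

        def scan(sequence, pwm, cutoff):
            """Return (position, score) for every window scoring at least `cutoff`."""
            w = pwm.shape[1]
            hits = []
            for i in range(len(sequence) - w + 1):
                window = sequence[i:i + w]
                score = sum(pwm[BASES[b], j] for j, b in enumerate(window))
                if score >= cutoff:
                    hits.append((i, round(score, 2)))
            return hits

        print(scan("TTACGTACGTAGGT", pwm, cutoff))   # reports both ACGT matches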

  1. Utility of a scanning densitometer in analyzing remotely sensed imagery

    NASA Technical Reports Server (NTRS)

    Dooley, J. T.

    1976-01-01

    The utility of a scanning densitometer for analyzing imagery in the NASA Lewis Research Center's regional remote sensing program was evaluated. Uses studied include: (1) quick-look screening of imagery by means of density slicing, magnification, color coding, and edge enhancement; (2) preliminary category classification of both low- and high-resolution data bases; and (3) quantitative measurement of the extent of features within selected areas. The densitometer was capable of providing fast, convenient, and relatively inexpensive preliminary analysis of aerial and satellite photography and scanner imagery involving land cover, water quality, strip mining, and energy conservation.

  2. Fracture modes in notched angleplied composite laminates

    NASA Technical Reports Server (NTRS)

    Irvine, T. B.; Ginty, C. A.

    1984-01-01

    The Composite Durability Structural Analysis (CODSTRAN) computer code is used to determine composite fracture. Fracture modes in solid and notched, unidirectional and angleplied graphite/epoxy composites were determined by using CODSTRAN. Experimental verification included both nondestructive (ultrasonic C-Scanning) and destructive (scanning electron microscopy) techniques. The fracture modes were found to be a function of ply orientations and whether the composite is notched or unnotched. Delaminations caused by stress concentrations around notch tips were also determined. Results indicate that the composite mechanics, structural analysis, laminate analysis, and fracture criteria modules embedded in CODSTRAN are valid for determining composite fracture modes.

  3. Tempest Simulations of Collisionless Damping of the Geodesic-Acoustic Mode in Edge-Plasma Pedestals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X. Q.; Xiong, Z.; Nevins, W. M.

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  4. Tempest Simulations of Collisionless Damping of the Geodesic-Acoustic Mode in Edge-Plasma Pedestals

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Gao, Z.; Nevins, W. M.; McKee, G. R.

    2008-05-01

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  5. TEMPEST simulations of collisionless damping of the geodesic-acoustic mode in edge-plasma pedestals.

    PubMed

    Xu, X Q; Xiong, Z; Gao, Z; Nevins, W M; McKee, G R

    2008-05-30

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  6. Using health-system-wide data to understand hepatitis B virus prophylaxis and reactivation outcomes in patients receiving rituximab.

    PubMed

    Schmajuk, Gabriela; Tonner, Chris; Trupin, Laura; Li, Jing; Sarkar, Urmimala; Ludwig, Dana; Shiboski, Stephen; Sirota, Marina; Dudley, R Adams; Murray, Sara; Yazdany, Jinoos

    2017-03-01

    Hepatitis B virus (HBV) reactivation in the setting of rituximab use is a potentially fatal but preventable safety event. The rate of HBV screening and proportion of patients at risk who receive antiviral prophylaxis in patients initiating rituximab is unknown. We analyzed electronic health record (EHR) data from 2 health systems, a university center and a safety net health system, including diagnosis grouper codes, problem lists, medications, laboratory results, procedures codes, clinical encounter notes, and scanned documents. We identified all patients who received rituximab between 6/1/2012 and 1/1/2016. We calculated the proportion of rituximab users with inadequate screening for HBV according to the Centers for Disease Control guidelines for detecting latent HBV infection before their first rituximab infusion during the study period. We also assessed the proportion of patients with positive hepatitis B screening tests who were prescribed antiviral prophylaxis. Finally, we characterized safety failures and adverse events. We included 926 patients from the university and 132 patients from the safety net health system. Sixty-one percent of patients from the university had adequate screening for HBV compared with 90% from the safety net. Among patients at risk for reactivation based on results of HBV testing, 66% and 92% received antiviral prophylaxis at the university and safety net, respectively. We found wide variations in hepatitis B screening practices among patients receiving rituximab, resulting in unnecessary risks to patients. Interventions should be developed to improve patient safety procedures in this high-risk patient population.

  7. A character string scanner

    NASA Technical Reports Server (NTRS)

    Enison, R. L.

    1971-01-01

    A computer program called Character String Scanner (CSS) is presented. It is designed to search a data set for any specified group of characters and then to flag this group. The output of the CSS program is a listing of the data set being searched with the specified group of characters being flagged by asterisks. Therefore, one may readily identify specific keywords, groups of keywords or specified lines of code internal to a computer program, in a program output, or in any other specific data set. Possible applications of this program include the automatic scan of an output data set for pertinent keyword data, the editing of a program to change the appearance of a certain word or group of words, and the conversion of a set of code to a different set of code.
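
    A small sketch of the scan-and-flag listing the CSS program produces: each line of a data set is printed, and any line containing the target character string gets a row of asterisks beneath the matched characters. The in-code listing and target string are placeholder assumptions, and only the first match per line is flagged:

        target = "GO TO"                           # the character group to flag (placeholder)
        listing = ["      READ (5,100) N",         # stand-in for the data set being searched
                   "      IF (N .LT. 0) GO TO 999",
                   "  999 STOP"]

        for line in listing:                       # print each line, flagging matches with asterisks
            print(line)
            col = line.find(target)
            if col >= 0:
                print(" " * col + "*" * len(target))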

  8. Determination of tire cross-sectional geometric characteristics from a digitally scanned image

    NASA Astrophysics Data System (ADS)

    Danielson, Kent T.

    1995-08-01

    A semi-automated procedure is described for the accurate determination of geometrical characteristics using a scanned image of the tire cross-section. The procedure can be useful for cases when CAD drawings are not available or when a description of the actual cured tire is desired. Curves representing the perimeter of the tire cross-section are determined by an edge tracing scheme, and the plyline and cord-end positions are determined by locations of color intensities. The procedure provides an accurate description of the perimeter of the tire cross-section and the locations of plylines and cord-ends. The position, normals, and curvatures of the cross-sectional surface are included in this description. The locations of the plylines provide the necessary information for determining the ply thicknesses and relative position to a reference surface. Finally, the locations of the cord-ends provide a means to calculate the cord-ends per inch (epi). Menu driven software has been developed to facilitate the procedure using the commercial code, PV-Wave by Visual Numerics, Inc., to display the images. From a single user interface, separate modules are executed for image enhancement, curve fitting the edge trace of the cross-sectional perimeter, and determining the plyline and cord-end locations. The code can run on SUN or SGI workstations and requires the use of a mouse to specify options or identify items on the scanned image.

  9. Determination of tire cross-sectional geometric characteristics from a digitally scanned image

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.

    1995-01-01

    A semi-automated procedure is described for the accurate determination of geometrical characteristics using a scanned image of the tire cross-section. The procedure can be useful for cases when CAD drawings are not available or when a description of the actual cured tire is desired. Curves representing the perimeter of the tire cross-section are determined by an edge tracing scheme, and the plyline and cord-end positions are determined by locations of color intensities. The procedure provides an accurate description of the perimeter of the tire cross-section and the locations of plylines and cord-ends. The position, normals, and curvatures of the cross-sectional surface are included in this description. The locations of the plylines provide the necessary information for determining the ply thicknesses and relative position to a reference surface. Finally, the locations of the cord-ends provide a means to calculate the cord-ends per inch (epi). Menu driven software has been developed to facilitate the procedure using the commercial code, PV-Wave by Visual Numerics, Inc., to display the images. From a single user interface, separate modules are executed for image enhancement, curve fitting the edge trace of the cross-sectional perimeter, and determining the plyline and cord-end locations. The code can run on SUN or SGI workstations and requires the use of a mouse to specify options or identify items on the scanned image.

  10. Anomalous transport in the H-mode pedestal of Alcator C-Mod discharges

    NASA Astrophysics Data System (ADS)

    Pankin, A. Y.; Hughes, J. W.; Greenwald, M. J.; Kritz, A. H.; Rafiq, T.

    2017-02-01

    Anomalous transport in the H-mode pedestal region of five Alcator C-Mod discharges, representing a collisionality scan, is analyzed. The understanding of anomalous transport in the pedestal region is important for the development of a comprehensive model for the H-mode pedestal slope. In this research, a possible role of the drift resistive inertial ballooning modes (DRIBM) (Rafiq et al 2010 Phys. Plasmas 17 082511) in the edge of Alcator C-Mod discharges is analyzed. The stability analysis, carried out using the TRANSP code, indicates that the DRIBM modes are strongly unstable in Alcator C-Mod discharges with large electron collisionality. An improved interpretive analysis of H-mode pedestal experimental data is carried out utilizing the additive flux minimization technique (Pankin et al 2013 Phys. Plasmas 20 102501) together with the guiding-center neoclassical kinetic XGC0 code. The neoclassical and neutral physics are simulated in the XGC0 code and the anomalous fluxes are computed using the additive flux minimization technique. The anomalous fluxes are reconstructed and compared with each other for the collisionality scan Alcator C-Mod discharges. It is found that the electron thermal anomalous diffusivities at the pedestal top increase with the electron collisionality. This dependence can also point to the drift resistive inertial ballooning modes as the modes that drive the anomalous transport in the plasma edge of highly collisional discharges.

  11. Wide-field reflective scanning optical systems

    NASA Technical Reports Server (NTRS)

    Abel, I. R.

    1973-01-01

    Catoptric optical scanning system provides relatively fast line-scan rate for two-dimensional coverage. Rapid scan rates require low focal ratios between components and smallest possible masses. System is relatively free from monochromatic defects and chromatic aberrations.

  12. Low sidelobe level and high time resolution for metallic ultrasonic testing with linear-chirp-Golay coded excitation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaying; Gang, Tie; Ye, Chaofeng; Cong, Sen

    2018-04-01

    Linear-chirp-Golay (LCG)-coded excitation combined with pulse compression is proposed in this paper to improve the time resolution and suppress sidelobes in ultrasonic testing. The LCG-coded excitation is a binary complementary Golay pair with a linear-chirp signal applied to every sub-pulse. Compared with conventional excitation, a common ultrasonic testing method that uses a brief narrow pulse as the excitation signal, the performance of the LCG-coded excitation, in terms of time resolution improvement and sidelobe suppression, is studied via numerical and experimental investigations. The numerical simulations are implemented using the MATLAB k-Wave toolbox. It is seen from the simulation results that the time resolution of LCG excitation is 35.5% higher and the peak sidelobe level (PSL) is 57.6 dB lower than those of linear-chirp excitation with 2.4 MHz chirp bandwidth and 3 μs time duration. In the B-scan experiment, the time resolution of LCG excitation is higher and its PSL is lower than those of conventional brief pulse excitation and chirp excitation. In terms of time resolution, the LCG-coded signal performs better than the chirp signal. Moreover, the impact of chirp bandwidth on the LCG-coded signal is less than that on the chirp signal. In addition, the sidelobe of the LCG-coded signal is lower than that of the chirp signal with pulse compression.
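
    A minimal numerical check of the sidelobe-cancellation property exploited above: the autocorrelations of a Golay complementary pair sum to a single spike (2N at zero lag, zero elsewhere), which is why the compressed output has such a low sidelobe level. The pair length is an arbitrary choice, and the linear chirp applied to each sub-pulse in the LCG excitation is omitted for brevity:

        import numpy as np

        def golay_pair(order):
            """Build a complementary pair of length 2**order by the standard recursion."""
            a, b = np.array([1.0]), np.array([1.0])
            for _ in range(order):
                a, b = np.concatenate([a, b]), np.concatenate([a, -b])
            return a, b

        a, b = golay_pair(5)                              # length-32 complementary pair
        compressed = (np.correlate(a, a, "full") +        # matched filtering of each code
                      np.correlate(b, b, "full"))         # and summing the two outputs
        peak = compressed[len(a) - 1]
        print("peak:", peak, "max sidelobe:", np.max(np.abs(np.delete(compressed, len(a) - 1))))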

  13. High-resolution diffusion tensor imaging of the human kidneys using a free-breathing, multi-slice, targeted field of view approach

    PubMed Central

    Chan, Rachel W; Von Deuster, Constantin; Stoeck, Christian T; Harmer, Jack; Punwani, Shonit; Ramachandran, Navin; Kozerke, Sebastian; Atkinson, David

    2014-01-01

    Fractional anisotropy (FA) obtained by diffusion tensor imaging (DTI) can be used to image the kidneys without any contrast media. FA of the medulla has been shown to correlate with kidney function. It is expected that higher spatial resolution would improve the depiction of small structures within the kidney. However, the achievement of high spatial resolution in renal DTI remains challenging as a result of respiratory motion and susceptibility to diffusion imaging artefacts. In this study, a targeted field of view (TFOV) method was used to obtain high-resolution FA maps and colour-coded diffusion tensor orientations, together with measures of the medullary and cortical FA, in 12 healthy subjects. Subjects were scanned with two implementations (dual and single kidney) of a TFOV DTI method. DTI scans were performed during free breathing with a navigator-triggered sequence. Results showed high consistency in the greyscale FA, colour-coded FA and diffusion tensors across subjects and between dual- and single-kidney scans, which have in-plane voxel sizes of 2 × 2 mm2 and 1.2 × 1.2 mm2, respectively. The ability to acquire multiple contiguous slices allowed the medulla and cortical FA to be quantified over the entire kidney volume. The mean medulla and cortical FA values were 0.38 ± 0.017 and 0.21 ± 0.019, respectively, for the dual-kidney scan, and 0.35 ± 0.032 and 0.20 ± 0.014, respectively, for the single-kidney scan. The mean FA between the medulla and cortex was significantly different (p < 0.001) for both dual- and single-kidney implementations. High-spatial-resolution DTI shows promise for improving the characterization and non-invasive assessment of kidney function. © 2014 The Authors. NMR in Biomedicine published by John Wiley & Sons, Ltd. PMID:25219683
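
    A short sketch of how fractional anisotropy and the colour-coded principal diffusion direction are obtained from a fitted diffusion tensor; the tensor entries below are made-up values for a single medulla-like voxel, not data from the study:

        import numpy as np

        D = np.array([[1.9, 0.1, 0.0],          # diffusion tensor (units of 1e-3 mm^2/s)
                      [0.1, 1.1, 0.0],
                      [0.0, 0.0, 1.0]])

        evals, evecs = np.linalg.eigh(D)
        md = evals.mean()                                         # mean diffusivity
        fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))

        principal = evecs[:, np.argmax(evals)]                    # main diffusion direction
        rgb = np.abs(principal) * fa                              # conventional FA-weighted colour coding
        print("FA =", round(fa, 3), " RGB =", np.round(rgb, 3))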

  14. High-resolution diffusion tensor imaging of the human kidneys using a free-breathing, multi-slice, targeted field of view approach.

    PubMed

    Chan, Rachel W; Von Deuster, Constantin; Stoeck, Christian T; Harmer, Jack; Punwani, Shonit; Ramachandran, Navin; Kozerke, Sebastian; Atkinson, David

    2014-11-01

    Fractional anisotropy (FA) obtained by diffusion tensor imaging (DTI) can be used to image the kidneys without any contrast media. FA of the medulla has been shown to correlate with kidney function. It is expected that higher spatial resolution would improve the depiction of small structures within the kidney. However, the achievement of high spatial resolution in renal DTI remains challenging as a result of respiratory motion and susceptibility to diffusion imaging artefacts. In this study, a targeted field of view (TFOV) method was used to obtain high-resolution FA maps and colour-coded diffusion tensor orientations, together with measures of the medullary and cortical FA, in 12 healthy subjects. Subjects were scanned with two implementations (dual and single kidney) of a TFOV DTI method. DTI scans were performed during free breathing with a navigator-triggered sequence. Results showed high consistency in the greyscale FA, colour-coded FA and diffusion tensors across subjects and between dual- and single-kidney scans, which have in-plane voxel sizes of 2 × 2 mm² and 1.2 × 1.2 mm², respectively. The ability to acquire multiple contiguous slices allowed the medulla and cortical FA to be quantified over the entire kidney volume. The mean medulla and cortical FA values were 0.38 ± 0.017 and 0.21 ± 0.019, respectively, for the dual-kidney scan, and 0.35 ± 0.032 and 0.20 ± 0.014, respectively, for the single-kidney scan. The mean FA between the medulla and cortex was significantly different (p < 0.001) for both dual- and single-kidney implementations. High-spatial-resolution DTI shows promise for improving the characterization and non-invasive assessment of kidney function. © 2014 The Authors. NMR in Biomedicine published by John Wiley & Sons, Ltd.

  15. Channel coding in the space station data system network

    NASA Technical Reports Server (NTRS)

    Healy, T.

    1982-01-01

    A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly support the requirement for the utilization of coding for the communications channel. The designers of the space station data system have to consider the use of channel coding.

  16. A statewide teleradiology system reduces radiation exposure and charges in transferred trauma patients.

    PubMed

    Watson, Justin J J; Moren, Alexis; Diggs, Brian; Houser, Ben; Eastes, Lynn; Brand, Dawn; Bilyeu, Pamela; Schreiber, Martin; Kiraly, Laszlo

    2016-05-01

    Trauma transfer patients routinely undergo repeat imaging because of inefficiencies within the radiology system. In 2009, the virtual private network (VPN) telemedicine system was adopted throughout Oregon allowing virtual image transfer between hospitals. The startup cost was a nominal $3,000 per hospital. A retrospective review from 2007 to 2012 included 400 randomly selected adult trauma transfer patients based on a power analysis (200 pre/200 post). The primary outcome evaluated was reduction in repeat computed tomography (CT) scans. Secondary outcomes included cost savings, emergency department (ED) length of stay (LOS), and spared radiation. All data were analyzed using Mann-Whitney U and chi-square tests. P less than .05 indicated significance. Spared radiation was calculated as a weighted average per body region, and savings was calculated using charges obtained from Oregon Health and Science University radiology current procedural terminology codes. Four-hundred patients were included. Injury Severity Score, age, ED and overall LOS, mortality, trauma type, and gender were not statistically different between groups. The percentage of patients with repeat CT scans decreased after VPN implementation: CT abdomen (13.2% vs 2.8%, P < .01) and cervical spine (34.4% vs 18.2%, P < .01). Post-VPN, the total charges saved in 2012 for trauma transfer patients was $333,500, whereas the average radiation dose spared per person was 1.8 mSv. Length of stay in the ED for patients with Injury Severity Score less than 15 transferring to the ICU was decreased (P < .05). Implementation of a statewide teleradiology network resulted in fewer total repeat CT scans, significant savings, decrease in radiation exposure, and decreased LOS in the ED for patients with less complex injuries. The potential for health care savings by widespread adoption of a VPN is significant. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Low-Speed Fingerprint Image Capture System User's Guide, June 1, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitus, B.R.; Goddard, J.S.; Jatko, W.B.

    1993-06-01

    The Low-Speed Fingerprint Image Capture System (LS-FICS) uses a Sun workstation controlling a Lenzar ElectroOptics Opacity 1000 imaging system to digitize fingerprint card images to support the Federal Bureau of Investigation's (FBI's) Automated Fingerprint Identification System (AFIS) program. The system also supports the operations performed by the Oak Ridge National Laboratory (ORNL)-developed Image Transmission Network (ITN) prototype card scanning system. The input to the system is a single FBI fingerprint card of the agreed-upon standard format and a user-specified identification number. The output is a file formatted to be compatible with the National Institute of Standards and Technology (NIST) draft standard for fingerprint data exchange dated June 10, 1992. These NIST compatible files contain the required print and text images. The LS-FICS is designed to provide the FBI with the capability of scanning fingerprint cards into a digital format. The FBI will replicate the system to generate a data base of test images. The Host Workstation contains the image data paths and the compression algorithm. A local area network interface, disk storage, and tape drive are used for the image storage and retrieval, and the Lenzar Opacity 1000 scanner is used to acquire the image. The scanner is capable of resolving 500 pixels/in. in both x and y directions. The print images are maintained in full 8-bit gray scale and compressed with an FBI-approved wavelet-based compression algorithm. The text fields are downsampled to 250 pixels/in. and 2-bit gray scale. The text images are then compressed using a lossless Huffman coding scheme. The text fields retrieved from the output files are easily interpreted when displayed on the screen. Detailed procedures are provided for system calibration and operation. Software tools are provided to verify proper system operation.

  18. Automated surveillance of 911 call data for detection of possible water contamination incidents.

    PubMed

    Haas, Adam J; Gibbons, Darcy; Dangel, Chrissy; Allgeier, Steve

    2011-03-30

    Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5 year period were studied. During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls indicating alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. The 911 surveillance system provides timely notification of possible public health events, but did have limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event.

  19. Automated surveillance of 911 call data for detection of possible water contamination incidents

    PubMed Central

    2011-01-01

    Background Drinking water contamination, with the capability to affect large populations, poses a significant risk to public health. In recent water contamination events, the impact of contamination on public health appeared in data streams monitoring health-seeking behavior. While public health surveillance has traditionally focused on the detection of pathogens, developing methods for detection of illness from fast-acting chemicals has not been an emphasis. Methods An automated surveillance system was implemented for Cincinnati's drinking water contamination warning system to monitor health-related 911 calls in the city of Cincinnati. Incident codes indicative of possible water contamination were filtered from all 911 calls for analysis. The 911 surveillance system uses a space-time scan statistic to detect potential water contamination incidents. The frequency and characteristics of the 911 alarms over a 2.5 year period were studied. Results During the evaluation, 85 alarms occurred, although most occurred prior to the implementation of an additional alerting constraint in May 2009. Data were available for analysis approximately 48 minutes after calls were placed, indicating that alarms may be generated 1-2 hours after a rapid increase in call volume. Most alerts occurred in areas of high population density. The average alarm area was 9.22 square kilometers. The average number of cases in an alarm was nine calls. Conclusions The 911 surveillance system provides timely notification of possible public health events but has limitations. While the alarms contained incident codes and location of the caller, additional information such as medical status was not available to assist in validating the cause of the alarm. Furthermore, users indicated that a better understanding of 911 system functionality is necessary to understand how it would behave in an actual water contamination event. PMID:21450105

  20. Laser direct marking applied to rasterizing miniature Data Matrix Code on aluminum alloy

    NASA Astrophysics Data System (ADS)

    Li, Xia-Shuang; He, Wei-Ping; Lei, Lei; Wang, Jian; Guo, Gai-Fang; Zhang, Teng-Yun; Yue, Ting

    2016-03-01

    Precise miniaturization of 2D Data Matrix (DM) codes on aluminum alloy formed by raster-mode laser direct part marking is demonstrated. The characteristic edge over-burn effects, which render vector-mode laser direct part marking inadequate for producing precise and readable miniature codes, are minimized with raster-mode laser marking. To obtain the control mechanism for the contrast and print growth of miniature DM codes in the raster laser marking process, a temperature field model of long-pulse laser interaction with the material is established. From the experimental results, laser average power and Q frequency have an important effect on the contrast and print growth of miniature DM codes, and the thresholds of laser average power and Q frequency for an identifiable miniature DM code are 3.6 W and 110 kHz, respectively, which matches the model well within normal operating conditions. In addition, an empirical model of the correlation between laser marking parameters and module size is also obtained, and the optimal processing parameter values for identifiable miniature DM codes of various given data sizes are provided. It is also found that increasing the number of repeat scans effectively improves the surface finish of the bores and the appearance consistency of the modules, which benefits readability. The reading quality of miniature DM codes is greatly improved by ultrasonic cleaning in water, which avoids the interference of color speckles surrounding the modules.

  1. The use of information technology to enhance patient safety and nursing efficiency.

    PubMed

    Lee, Tso-Ying; Sun, Gi-Tseng; Kou, Li-Tseng; Yeh, Mei-Ling

    2017-10-23

    Issues in patient safety and nursing efficiency have long been of concern. Advancing the role of nursing informatics is seen as the best way to address this. The aim of this study was to determine if the use, outcomes and satisfaction with a nursing information system (NIS) improved patient safety and the quality of nursing care in a hospital in Taiwan. This study adopts a quasi-experimental design. Nurses and patients were surveyed by questionnaire and data retrieval before and after the implementation of NIS in terms of blood drawing, nursing process, drug administration, bar code scanning, shift handover, and information and communication integration. Physiologic values were easier to read and interpret; it took less time to complete electronic records (3.7 vs. 9.1 min); the number of errors in drug administration was reduced (0.08% vs. 0.39%); bar codes reduced the number of errors in blood drawing (0 vs. 10) and transportation of specimens (0 vs. 0.42%); satisfaction with electronic shift handover increased significantly; there was a reduction in nursing turnover (14.9% vs. 16%); patient satisfaction increased significantly (3.46 vs. 3.34). Introduction of NIS improved patient safety and nursing efficiency and increased nurse and patient satisfaction. Medical organizations must continually improve the nursing information system if they are to provide patients with high quality service in a competitive environment.

  2. TOPAS Tool for Particle Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perl, Joseph

    2013-05-30

    TOPAS lets users simulate the passage of subatomic particles moving through any kind of radiation therapy treatment system, can import a patient geometry, can record dose and other quantities, has advanced graphics, and is fully four-dimensional (3D plus time) to handle the most challenging time-dependent aspects of modern cancer treatments. TOPAS unlocks the power of the most accurate particle transport simulation technique, the Monte Carlo (MC) method, while removing the painstaking coding work such methods used to require. Research physicists can use TOPAS to improve delivery systems towards safer and more effective radiation therapy treatments, easily setting up and running complex simulations that previously used to take months of preparation. Clinical physicists can use TOPAS to increase accuracy while reducing side effects, simulating patient-specific treatment plans at the touch of a button. TOPAS is designed as a “user code” layered on top of the Geant4 Simulation Toolkit. TOPAS includes the standard Geant4 toolkit, plus additional code to make Geant4 easier to control and to extend Geant4 functionality. TOPAS aims to make proton simulation both “reliable” and “repeatable.” “Reliable” means both accurate physics and a high likelihood to simulate precisely what the user intended to simulate, reducing issues of wrong units, wrong materials, wrong scoring locations, etc. “Repeatable” means not just getting the same result from one simulation to another, but being able to easily restore a previously used setup and reducing sources of error when a setup is passed from one user to another. The TOPAS control system incorporates key lessons from safety management, proactively removing possible sources of user error such as line-ordering mistakes in control files. TOPAS has been used to model proton therapy treatment examples including the UCSF eye treatment head, the MGH stereotactic alignment in radiosurgery treatment head and the MGH gantry treatment heads in passive scattering and scanning modes, and has demonstrated dose calculation based on patient-specific CT data.

  3. Medical comorbidity in narcolepsy: findings from the Burden of Narcolepsy Disease (BOND) study.

    PubMed

    Black, J; Reaven, N L; Funk, S E; McGaughey, K; Ohayon, M M; Guilleminault, C; Ruoff, C

    2017-05-01

    The objective of this study was to evaluate medical comorbidity patterns in patients with a narcolepsy diagnosis in the United States. This was a retrospective medical claims data analysis. Truven Health Analytics MarketScan® Research Databases were accessed to identify individuals ≥18 years of age with ≥1 diagnosis code for narcolepsy (International Classification of Diseases (ICD)-9, 347.0, 347.00, 347.01, 347.1, 347.10, or 347.11) continuously insured between 2006 and 2010, and controls without narcolepsy matched 5:1 on age, gender, region, and payer. Narcolepsy and control subjects were compared for frequency of comorbid conditions, identified by the appearance of >1 diagnosis code(s) mapped to a Clinical Classification System (CCS) level 1 category any time during the study period, and on specific subcategories, including recognized narcolepsy comorbidities of obstructive sleep apnea (OSA) and depression. The final study group included 9312 subjects with narcolepsy and 46,559 controls (each group: average age, 46.1 years; 59% female). As compared with controls, patients with narcolepsy showed a statistically significant excess prevalence in all the CCS multilevel categories, the only exceptions being conditions originating in the perinatal period and pregnancy/childbirth complications. The greatest excess prevalence in the narcolepsy cohort was seen for mental illness (31.1% excess prevalence; odds ratio (OR) 3.8, 95% confidence interval (CI) 3.6, 4.0), followed by diseases of the digestive system (21.4% excess prevalence; OR 2.7, 95% CI 2.5, 2.8) and nervous system/sense organs (excluding narcolepsy; 20.7% excess prevalence; OR 3.7, 95% CI 3.4, 3.9). In this claims analysis, a narcolepsy diagnosis was associated with a wide range of comorbid medical illness claims, at significantly higher rates than matched controls. Copyright © 2016. Published by Elsevier B.V.

  4. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.

  5. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  6. The Exchange Data Communication System based on Centralized Database for the Meat Industry

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yuichi; Taniguchi, Yoji; Terada, Shuji; Komoda, Norihisa

    We propose applying an EDI system that is based on a centralized database and supports conversion of code data to the meat industry. This system makes it possible to share exchange data on beef among enterprises from producers to retailers by using Web EDI technology. In order to convert codes efficiently, direct conversion of a sender's code to a receiver's code using a code map is used. The system implementing this function went into operation in September 2004, and twelve enterprises, including retailers, processing traders, and wholesalers, were using it as of June 2005. In this system, the number of code maps, which determines the introductory cost of the code conversion function, was lower than the theoretical value and close to the case in which a standard code is used as an intermediary.
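
    As an illustration of the direct code-map conversion described above, a minimal sketch follows; the partner names and item codes are hypothetical, and a production EDI system would hold the maps in the centralized database rather than in memory. Direct maps of this kind grow with the number of trading-partner pairs, which is why the number of code maps, compared with mediation through a standard code, is the cost figure discussed above.

        # Minimal sketch of direct code conversion via per-trading-partner code maps.
        # Partner names and item codes are purely illustrative.
        CODE_MAPS = {
            ("producer_A", "retailer_B"): {"BEEF-001": "RB-7701", "BEEF-002": "RB-7702"},
            ("producer_A", "wholesaler_C"): {"BEEF-001": "WC-120", "BEEF-002": "WC-121"},
        }

        def convert_code(sender, receiver, item_code):
            """Translate a sender's item code directly into the receiver's own code."""
            try:
                return CODE_MAPS[(sender, receiver)][item_code]
            except KeyError:
                raise ValueError(f"no mapping for {item_code} between {sender} and {receiver}")

        print(convert_code("producer_A", "retailer_B", "BEEF-001"))  # -> RB-7701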

  7. Development of scanning holographic display using MEMS SLM

    NASA Astrophysics Data System (ADS)

    Takaki, Yasuhiro

    2016-10-01

    Holography is an ideal three-dimensional (3D) display technique, because it produces 3D images that naturally satisfy human 3D perception including physiological and psychological factors. However, its electronic implementation is quite challenging because ultra-high resolution is required for display devices to provide sufficient screen size and viewing zone. We have developed holographic display techniques to enlarge the screen size and the viewing zone by use of microelectromechanical systems spatial light modulators (MEMS-SLMs). Because MEMS-SLMs can generate hologram patterns at a high frame rate, the time-multiplexing technique is utilized to virtually increase the resolution. Three kinds of scanning systems have been combined with MEMS-SLMs: the screen scanning system, the viewing-zone scanning system, and the 360-degree scanning system. The screen scanning system reduces the hologram size to enlarge the viewing zone and the reduced hologram patterns are scanned on the screen to increase the screen size: the color display system with a screen size of 6.2 in. and a viewing zone angle of 11° was demonstrated. The viewing-zone scanning system increases the screen size and the reduced viewing zone is scanned to enlarge the viewing zone: a screen size of 2.0 in. and a viewing zone angle of 40° were achieved. The two-channel system increased the screen size to 7.4 in. The 360-degree scanning system increases the screen size and the reduced viewing zone is scanned circularly: the display system having a flat screen with a diameter of 100 mm was demonstrated, which generates 3D images viewed from any direction around the flat screen.

  8. QR Codes in the Library: "It's Not Your Mother's Barcode!"

    ERIC Educational Resources Information Center

    Dobbs, Cheri

    2011-01-01

    Barcode scanning has become more than just fun. Now libraries and businesses are leveraging barcode technology as an innovative tool to market their products and ideas. Developed and popularized in Japan, these Quick Response (QR) or two-dimensional barcodes allow marketers to provide interactive content in an otherwise static environment. In this…

  9. A flexible 3D laser scanning system using a robotic arm

    NASA Astrophysics Data System (ADS)

    Fei, Zixuan; Zhou, Xiang; Gao, Xiaofei; Zhang, Guanliang

    2017-06-01

    In this paper, we present a flexible 3D scanning system based on a MEMS scanner mounted on an industrial arm with a turntable. This system has 7 degrees of freedom and is able to conduct a full-field scan from any angle, making it suitable for scanning objects with complex shapes. Existing non-contact 3D scanning systems usually use a laser scanner that projects a fixed stripe and is mounted on a Coordinate Measuring Machine (CMM) or an industrial robot; these systems cannot perform path planning without CAD models. The 3D scanning system presented in this paper can scan objects without CAD models, and we introduce this path planning method in the paper. We also propose a practical approach to calibrating the hand-eye system based on binocular stereo vision and analyze the errors of the hand-eye calibration.
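
    The abstract does not give the calibration equations; as a minimal sketch of how such a system fuses its data, the following chains the arm pose and the hand-eye transform to map scanner points into the robot base frame (the identity-rotation poses are hypothetical placeholders).

        import numpy as np

        def to_homogeneous(R, t):
            """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
            T = np.eye(4)
            T[:3, :3], T[:3, 3] = R, t
            return T

        # Hypothetical poses: base <- end-effector (from the arm controller)
        # and end-effector <- scanner (from the hand-eye calibration).
        T_base_ee = to_homogeneous(np.eye(3), np.array([0.40, 0.10, 0.55]))
        T_ee_cam = to_homogeneous(np.eye(3), np.array([0.00, 0.00, 0.08]))

        def scanner_points_to_base(points_cam):
            """Map Nx3 points measured in the scanner frame into the robot base frame."""
            pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])  # Nx4
            return (T_base_ee @ T_ee_cam @ pts_h.T).T[:, :3]

        print(scanner_points_to_base(np.array([[0.0, 0.0, 0.30]])))  # -> [[0.4, 0.1, 0.93]]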

  10. ArtDeco: a beam-deconvolution code for absolute cosmic microwave background measurements

    NASA Astrophysics Data System (ADS)

    Keihänen, E.; Reinecke, M.

    2012-12-01

    We present a method for beam-deconvolving cosmic microwave background (CMB) anisotropy measurements. The code takes as input the time-ordered data along with the corresponding detector pointings and known beam shapes, and produces as output the harmonic coefficients a^T_lm, a^E_lm, and a^B_lm of the observed sky. From these one can derive temperature and Q and U polarisation maps. The method is applicable to absolute CMB measurements with wide sky coverage, and is independent of the scanning strategy. We tested the code with extensive simulations, mimicking the resolution and data volume of the Planck 30 GHz and 70 GHz channels, but with exaggerated beam asymmetry. We applied it to multipoles up to l = 1700 and examined the results in both pixel space and harmonic space. We also tested the method in the presence of white noise. The code is released under the terms of the GNU General Public License and can be obtained from http://sourceforge.net/projects/art-deco/

  11. 3D for Geosciences: Interactive Tangibles and Virtual Models

    NASA Astrophysics Data System (ADS)

    Pippin, J. E.; Matheney, M.; Kitsch, N.; Rosado, G.; Thompson, Z.; Pierce, S. A.

    2016-12-01

    Point cloud processing provides a method of studying and modelling geologic features relevant to geoscience systems and processes. Here, software including Skanect, MeshLab, Blender, PDAL, and PCL is used in conjunction with 3D scanning hardware, including a Structure scanner and a Kinect camera, to create and analyze point cloud images of small-scale topography, karst features, tunnels, and structures at high resolution. This project successfully scanned internal karst features ranging from small stalactites to large rooms, as well as an external waterfall feature. For comparison purposes, multiple scans of the same object were merged into single object files both automatically, using commercial software, and manually, using open-source libraries and code. Files in .ply format were manually converted into numeric data sets and analyzed for similar regions between files in order to match them together. We can assume a numeric process would be more powerful and efficient than the manual method; however, it could lack other useful features that GUIs may have. The digital models have applications in mining as an efficient means of replacing topographic functions such as measuring distances and areas. Additionally, it is possible to make simulation models such as drilling templates and calculations related to 3D spaces. Advantages of using the methods described here include the relatively quick time to obtain data and the easy transport of the equipment. With regard to open-pit mining, obtaining precise 3D images of large surfaces and georeferencing the scan data to interactive maps would be a high-value tool. The digital 3D images obtained from scans may be saved as printable files to create tangible, physical 3D-printed models based on scientific information, as well as digital "worlds" able to be navigated virtually. The data, models, and algorithms explored here can be used to convey complex scientific ideas to a range of professionals and audiences.
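
    The .ply-to-numeric conversion mentioned above can be sketched for the simple ASCII case; the reader below assumes x, y, z are the first three vertex properties and is only an illustration, not the project's actual tooling.

        import numpy as np

        def read_ascii_ply_vertices(path):
            """Read vertex x, y, z from an ASCII .ply file into an Nx3 float array.
            Assumes x, y, z are the first three vertex properties (a common layout)."""
            with open(path) as f:
                n_vertices = 0
                for line in f:                      # parse the header up to end_header
                    token = line.split()
                    if token[:2] == ["element", "vertex"]:
                        n_vertices = int(token[2])
                    if token and token[0] == "end_header":
                        break
                return np.array(
                    [next(f).split()[:3] for _ in range(n_vertices)], dtype=float
                )

        # Example use: distance between two picked points of a scanned feature.
        # pts = read_ascii_ply_vertices("waterfall_scan.ply")   # hypothetical file name
        # print(np.linalg.norm(pts[0] - pts[100]))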

  12. Design of ACM system based on non-greedy punctured LDPC codes

    NASA Astrophysics Data System (ADS)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method which showed good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is lowered. Simulations show that the proposed ACM system achieves increasingly significant coding gain together with higher throughput.
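
    For orientation only, the two quantities such an ACM controller works with, the effective rate of a punctured code and the rate selected for a given channel estimate, can be sketched as follows; the mother-code dimensions and SNR thresholds are placeholders, not values from the paper.

        from fractions import Fraction

        # Candidate rates of the rate-compatible punctured family (2/3 ... 5/6).
        # The SNR switching thresholds (dB) are illustrative placeholders.
        RATE_TABLE = [
            (Fraction(2, 3), 2.0),
            (Fraction(3, 4), 3.0),
            (Fraction(4, 5), 3.8),
            (Fraction(5, 6), 4.5),
        ]

        def punctured_rate(k, n_mother, n_punctured):
            """Effective rate after puncturing n_punctured parity bits of an (n, k) mother code."""
            return Fraction(k, n_mother - n_punctured)

        def select_rate(snr_db):
            """Pick the highest rate whose switching threshold the estimated SNR still clears."""
            feasible = [r for r, thr in RATE_TABLE if snr_db >= thr]
            return max(feasible) if feasible else None  # None -> stay at the most robust mode

        print(punctured_rate(k=3720, n_mother=5580, n_punctured=620))  # hypothetical sizes -> 3/4
        print(select_rate(4.0))                                        # -> 4/5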

  13. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  14. Evaluation of the effect scan pattern has on the trueness and precision of six intraoral digital impression systems.

    PubMed

    Mennito, Anthony S; Evans, Zachary P; Lauer, Abigail W; Patel, Ravi B; Ludlow, Mark E; Renne, Walter G

    2018-03-01

    Clinicians have been slow to adopt digital impression technologies, possibly due to perceived technique sensitivities involved in data acquisition. This research has two aims: to determine whether scan pattern and sequence affect the accuracy of the three-dimensional (3D) model created from the digital impression, and to compare the 5 imaging systems with regard to their scanning accuracy for sextant impressions. Six digital intraoral impression systems were used to scan a typodont sextant with optical properties similar to natural teeth. The impressions were taken using five different scan patterns and the resulting digital models were overlaid on a master digital model to determine the accuracy of each scanner performing each scan pattern. Furthermore, regardless of scan pattern, each digital impression system was evaluated for accuracy against the other systems in the same manner. No differences of significance were noted in the accuracy of 3D models created using six distinct scan patterns with one exception involving the CEREC Omnicam. Planmeca Planscan was determined to be the truest scanner while 3Shape Trios was determined to be the most precise for sextant impression making. Scan pattern does not significantly affect the accuracy of the resulting digital model for sextant scanning. Companies that make digital impression systems often recommend a scan pattern specific for their system. However, every clinical scanning scenario is different and may require a different approach. Knowing how important scan pattern is with regard to accuracy would be helpful for guiding a growing number of practitioners who are utilizing this technology. © 2018 Wiley Periodicals, Inc.

  15. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    PubMed

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because of the antibiotic-related and pathogenic activities that microbes carry. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic genes are carried in genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for ongoing genome sequencing projects. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembling tool, a functional annotation pipeline, and a high-performance GI prediction module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which provides functional annotation and highly probable GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information, including possible GIs, coding/non-coding sequences and functional analysis, from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
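
    The abstract does not describe the GI-GPS features themselves; purely as a sketch of windowed genomic profile scanning feeding an SVM, one could compute crude compositional features per window and classify them, as below (window size, features, and training labels are all assumptions).

        import numpy as np
        from sklearn.svm import SVC

        def window_features(seq, window=5000, step=2500):
            """Crude compositional profile per window: GC content and GC skew.
            Real GI predictors use richer k-mer profiles; this is only illustrative."""
            feats, starts = [], []
            for start in range(0, max(len(seq) - window + 1, 1), step):
                w = seq[start:start + window]
                g, c = w.count("G"), w.count("C")
                gc = (g + c) / len(w)
                skew = (g - c) / (g + c) if (g + c) else 0.0
                feats.append([gc, skew])
                starts.append(start)
            return np.array(feats), starts

        # Hypothetical training windows labelled 1 (genomic island) or 0 (backbone).
        X_train = np.array([[0.62, 0.05], [0.61, 0.04], [0.45, -0.02], [0.47, 0.00]])
        y_train = np.array([1, 1, 0, 0])
        clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

        X_query, starts = window_features("ATGC" * 5000)    # toy draft-genome sequence
        labels = clf.predict(X_query)                       # 1 marks putative GI windows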

  16. Global modeling of thermospheric airglow in the far ultraviolet

    NASA Astrophysics Data System (ADS)

    Solomon, Stanley C.

    2017-07-01

    The Global Airglow (GLOW) model has been updated and extended to calculate thermospheric emissions in the far ultraviolet, including sources from daytime photoelectron-driven processes, nighttime recombination radiation, and auroral excitation. It can be run using inputs from empirical models of the neutral atmosphere and ionosphere or from numerical general circulation models of the coupled ionosphere-thermosphere system. It uses a solar flux module, photoelectron generation routine, and the Nagy-Banks two-stream electron transport algorithm to simultaneously handle energetic electron distributions from photon and auroral electron sources. It contains an ion-neutral chemistry module that calculates excited and ionized species densities and the resulting airglow volume emission rates. This paper describes the inputs, algorithms, and code structure of the model and demonstrates example outputs for daytime and auroral cases. Simulations of far ultraviolet emissions by the atomic oxygen doublet at 135.6 nm and the molecular nitrogen Lyman-Birge-Hopfield bands, as viewed from geostationary orbit, are shown, and model calculations are compared to limb-scan observations by the Global Ultraviolet Imager on the TIMED satellite. The GLOW model code is provided to the community through an open-source academic research license.

  17. Encrypted holographic data storage based on orthogonal-phase-code multiplexing.

    PubMed

    Heanue, J F; Bashaw, M C; Hesselink, L

    1995-09-10

    We describe an encrypted holographic data-storage system that combines orthogonal-phase-code multiplexing with a random-phase key. The system offers the security advantages of random-phase coding but retains the low cross-talk performance and the minimum code storage requirements typical in an orthogonal-phase-code-multiplexing system.

  18. Interframe vector wavelet coding technique

    NASA Astrophysics Data System (ADS)

    Wus, John P.; Li, Weiping

    1997-01-01

    Wavelet coding is often used to divide an image into multi-resolution wavelet coefficients which are quantized and coded. By 'vectorizing' scalar wavelet coding and combining this with vector quantization (VQ), vector wavelet coding (VWC) can be implemented. Using a finite number of states, finite-state vector quantization (FSVQ) takes advantage of the similarity between frames by incorporating memory into the video coding system. Lattice VQ eliminates the potential mismatch that could occur using pre-trained VQ codebooks. It also eliminates the need for codebook storage in the VQ process, thereby creating a more robust coding system. Therefore, by using the VWC coding method in conjunction with the FSVQ system and lattice VQ, the formulation of a high-quality, very low bit rate coding system is proposed. A coding system using a simple FSVQ scheme, in which the current state is determined only by the previous channel symbol, is developed. To achieve a higher degree of compression, a tree-like FSVQ system is implemented. The groupings are done in this tree-like structure from the lower subbands to the higher subbands in order to exploit the nature of subband analysis in terms of the parent-child relationship. Class A and Class B video sequences from the MPEG-IV testing evaluations are used in the evaluation of this coding method.
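
    As a simplified illustration of vector-quantizing wavelet coefficients (plain VQ with a fixed toy codebook, not the paper's finite-state or lattice schemes), 2x2 coefficient blocks can be mapped to their nearest codewords:

        import numpy as np

        def nearest_codeword(vectors, codebook):
            """Index of the nearest codeword (Euclidean) for each input vector."""
            d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            return d.argmin(axis=1)

        # Toy "subband": quantize 2x2 blocks of coefficients with a tiny codebook.
        coeffs = np.random.randn(8, 8)
        blocks = coeffs.reshape(4, 2, 4, 2).transpose(0, 2, 1, 3).reshape(-1, 4)  # 2x2 -> 4-vectors
        codebook = np.array([[0, 0, 0, 0], [1, 1, 1, 1],
                             [-1, -1, -1, -1], [1, -1, 1, -1]], dtype=float)
        indices = nearest_codeword(blocks, codebook)            # channel symbols to transmit
        reconstructed = codebook[indices].reshape(4, 4, 2, 2).transpose(0, 2, 1, 3).reshape(8, 8)
        print(np.mean((coeffs - reconstructed) ** 2))           # quantization distortion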

  19. INCIDENCE AND PREVALENCE OF ACROMEGALY IN THE UNITED STATES: A CLAIMS-BASED ANALYSIS.

    PubMed

    Broder, Michael S; Chang, Eunice; Cherepanov, Dasha; Neary, Maureen P; Ludlam, William H

    2016-11-01

    Acromegaly, a rare endocrine disorder, results from excessive growth hormone secretion, leading to multisystem-associated morbidities. Using 2 large nationwide databases, we estimated the annual incidence and prevalence of acromegaly in the U.S. We used 2008 to 2013 data from the Truven Health MarketScan® Commercial Claims and Encounters Database and IMS Health PharMetrics healthcare insurance claims databases, with health plan enrollees <65 years of age. Study patients had ≥2 claims with acromegaly (International Classification of Diseases, 9th Revision, Clinical Modification Code [ICD-9CM] 253.0), or 1 claim with acromegaly and 1 claim for pituitary tumor, pituitary surgery, or cranial stereotactic radiosurgery. Annual incidence was calculated for each year from 2009 to 2013, and prevalence in 2013. Estimates were stratified by age and sex. Incidence was up to 11.7 cases per million person-years (PMPY) in MarketScan and 9.6 cases PMPY in PharMetrics. Rates were similar by sex but typically lowest in ≤17 year olds and higher in >24 year olds. The prevalence estimates were 87.8 and 71.0 per million per year in MarketScan and PharMetrics, respectively. Prevalence consistently increased with age but was similar by sex in each database. The current U.S. incidence of acromegaly may be up to 4 times higher and prevalence may be up to 50% higher than previously reported in European studies. Our findings correspond with the estimates reported by a recent U.S. study that used a single managed care database, supporting the robustness of these estimates in this population. Our study indicates there are approximately 3,000 new cases of acromegaly per year, with a prevalence of about 25,000 acromegaly patients in the U.S. Abbreviations: CT = computed tomography; GH = growth hormone; IGF-1 = insulin-like growth factor 1; ICD-9-CM Code = International Classification of Diseases, 9th Revision, Clinical Modification Codes; MRI = magnetic resonance imaging; PMPY = per million person-years.
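
    The incidence unit used above, cases per million person-years (PMPY), reduces to a one-line calculation; the case count and enrolment denominator below are placeholders, since the abstract does not report the denominators.

        def incidence_pmpy(new_cases, person_years):
            """Incidence per million person-years (PMPY)."""
            return new_cases / person_years * 1_000_000

        # Illustrative numbers only: the real denominators come from the MarketScan and
        # PharMetrics enrolment files.
        print(round(incidence_pmpy(new_cases=1170, person_years=100_000_000), 1))  # 11.7 PMPY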

  20. Pneu-Scan - A novel, lightweight two-axis telemetry tracking system

    NASA Astrophysics Data System (ADS)

    Sullivan, A.

    The development of Pneu-Scan, a conically scanning tracking antenna feed for telemetry applications, is described. Pneu-Scan has the advantage of being pneumatically driven, thereby eliminating the need for a heavy electric drive motor. Air from the dehydrator/pressurizer system is used to drive the Pneu-Scan pedestal at a scan speed which is proportional to the continuously varying pressure. The S-band tracking feed of the Pneu-Scan is less than five inches in diameter and is considerably lighter than single-channel monopulse (SCM) feeds. Aperture blocking of Pneu-Scan is more than two times smaller than conventional SCM designs. The antenna reflector of the Pneu-Scan system is a lightweight 5-foot graphite-epoxy parabolic reflector positioned by an elevation-over-azimuth pedestal. The elevation assembly is surrounded by an inflatable rotodome which rotates with azimuth. The rotating sphere was designed to have a minimum wind-induced torque, thereby minimizing the required drive power. The weight of the entire system is less than 135 pounds. The principal characteristics of the Pneu-Scan system are summarized in a table.

  1. Scanning Seismic Intrusion Detector

    NASA Technical Reports Server (NTRS)

    Lee, R. D.

    1982-01-01

    Scanning seismic intrusion detector employs array of automatically or manually scanned sensors to determine approximate location of intruder. Automatic-scanning feature enables one operator to tend system of many sensors. Typical sensors used with new system are moving-coil seismic pickups. Detector finds uses in industrial security systems.

  2. High rate concatenated coding systems using bandwidth efficient trellis inner codes

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1989-01-01

    High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.

  3. A Comparative Study of Alternative Controls and Displays for Use by the Severely Physically Handicapped

    NASA Technical Reports Server (NTRS)

    Williams, D.; Simpson, C.; Barker, M.

    1984-01-01

    A modification of a row/column scanning system was investigated in order to increase the speed and accuracy with which communication aids can be accessed with one or two switches. A selection algorithm was developed and programmed in BASIC to automatically select individuals with the characteristic difficulty in controlling time-dependent control and display systems. Four systems were compared: (1) row/column directed scan (2 switches); (2) row/column auto scan (1 switch); (3) row auto scan (1 switch); and (4) column auto scan (1 switch). For this sample population, there were no significant differences among systems for scan time to select the correct target. The row/column auto scan system resulted in significantly more errors than any of the other three systems. Thus, the most widely prescribed system for severely physically disabled individuals turns out, for this group, to have a higher error rate and no faster communication rate than three other systems that have been considered inappropriate for this group.
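
    The timing trade-off among these access methods can be illustrated with the ideal, error-free selection time of single-switch row/column auto scanning; the dwell period and board size below are assumptions, not the study's parameters.

        def row_column_autoscan_time(n_rows, n_cols, target_row, target_col, dwell=1.0):
            """Ideal (error-free) selection time for single-switch row/column auto scanning:
            rows are highlighted one per dwell period until the user hits the switch,
            then columns within the chosen row. Indices are zero-based."""
            assert 0 <= target_row < n_rows and 0 <= target_col < n_cols
            return (target_row + 1) * dwell + (target_col + 1) * dwell

        # Example: a 5 x 8 communication board with a 1-second dwell time.
        print(row_column_autoscan_time(5, 8, target_row=2, target_col=6))  # 10.0 seconds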

  4. System and method for compressive scanning electron microscopy

    DOEpatents

    Reed, Bryan W

    2015-01-13

    A scanning transmission electron microscopy (STEM) system is disclosed. The system may make use of an electron beam scanning system configured to generate a plurality of electron beam scans over substantially an entire sample, with each scan varying in electron-illumination intensity over a course of the scan. A signal acquisition system may be used for obtaining at least one of an image, a diffraction pattern, or a spectrum from the scans, the image, diffraction pattern, or spectrum representing only information from at least one of a select subplurality or linear combination of all pixel locations comprising the image. A dataset may be produced from the information. A subsystem may be used for mathematically analyzing the dataset to predict actual information that would have been produced by each pixel location of the image.

  5. Studying NASA's Transition to Ka-Band Communications for Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard; Mortensen, Dale; Welch, Bryan; Downey, Joseph; Evans, Mike

    2014-01-01

    As the S-band spectrum becomes crowded, future space missions will need to consider moving command and telemetry services to Ka-band. NASA's Space Communications and Navigation (SCaN) Testbed provides a software-defined radio (SDR) platform that is capable of supporting investigation of this service transition. The testbed contains two S-band SDRs and one Ka-band SDR. Over the past year, SCaN Testbed has demonstrated Ka-band communications capabilities with NASA's Tracking and Data Relay Satellite System (TDRSS) using both open- and closed-loop antenna tracking profiles. A number of technical areas need to be addressed for successful transition to Ka-band. The smaller antenna beamwidth at Ka-band increases the criticality of antenna pointing, necessitating closed loop tracking algorithms and new techniques for received power estimation. Additionally, the antenna pointing routines require enhanced knowledge of spacecraft position and attitude for initial acquisition, versus an S-band antenna. Ka-band provides a number of technical advantages for bulk data transfer. Unlike at S-band, a larger bandwidth may be available for space missions, allowing increased data rates. The potential for high rate data transfer can also be extended for direct-to-ground links through use of variable or adaptive coding and modulation. Specific examples of Ka-band research from SCaN Testbed's first year of operation will be cited, such as communications link performance with TDRSS, and the effects of truss flexure on antenna pointing.

  6. Studying NASA's Transition to Ka-Band Communications for Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Chelmins, David T.; Reinhart, Richard C.; Mortensen, Dale; Welch, Bryan; Downey, Joseph; Evans, Michael

    2014-01-01

    As the S-band spectrum becomes crowded, future space missions will need to consider moving command and telemetry services to Ka-band. NASA's Space Communications and Navigation (SCaN) Testbed provides a software-defined radio (SDR) platform that is capable of supporting investigation of this service transition. The testbed contains two S-band SDRs and one Ka-band SDR. Over the past year, SCaN Testbed has demonstrated Ka-band communications capabilities with NASA's Tracking and Data Relay Satellite System (TDRSS) using both open- and closed-loop antenna tracking profiles. A number of technical areas need to be addressed for successful transition to Ka-band. The smaller antenna beamwidth at Ka-band increases the criticality of antenna pointing, necessitating closed loop tracking algorithms and new techniques for received power estimation. Additionally, the antenna pointing routines require enhanced knowledge of spacecraft position and attitude for initial acquisition, versus an S-band antenna. Ka-band provides a number of technical advantages for bulk data transfer. Unlike at S-band, a larger bandwidth may be available for space missions, allowing increased data rates. The potential for high rate data transfer can also be extended for direct-to-ground links through use of variable or adaptive coding and modulation. Specific examples of Ka-band research from SCaN Testbed's first year of operation will be cited, such as communications link performance with TDRSS, and the effects of truss flexure on antenna pointing.

  7. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  8. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  9. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  10. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  11. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  12. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements.
    Program summary:
    Catalogue identifier: AFBT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU GPLv3
    No. of lines in distributed program, including test data, etc.: 913552
    No. of bytes in distributed program, including test data, etc.: 270876249
    Distribution format: tar.gz
    Programming language: CUDA/C, MATLAB.
    Computer: Intel x64 CPU, GPU supporting CUDA technology.
    Operating system: 64-bit Windows 7 Professional.
    Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized.
    RAM: Dependent on user's parameters, typically between several gigabytes and several tens of gigabytes
    Classification: 6.5, 18.
    Nature of problem: Speed up of data processing in optical coherence microscopy
    Solution method: Utilization of GPU for massively parallel data processing
    Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data)
    Running time: 1.8 s for one B-scan (150 × faster in comparison to the CPU data processing time)

  13. Microwave metamaterials—from passive to digital and programmable controls of electromagnetic waves

    NASA Astrophysics Data System (ADS)

    Cui, Tie Jun

    2017-08-01

    Since 2004, my group at Southeast University has been carrying out research into microwave metamaterials, which are classified into three categories: metamaterials based on the effective medium model, plasmonic metamaterials for spoof surface plasmon polaritons (SPPs), and coding and programmable metamaterials. For effective-medium metamaterials, we have developed a general theory to accurately describe effective permittivity and permeability in semi-analytical forms, from which we have designed and realized a three-dimensional (3D) wideband ground-plane invisibility cloak, a free-space electrostatic invisibility cloak, an electromagnetic black hole, optical/radar illusions, and radially anisotropic zero-index metamaterial for omni-directional radiation and a nearly perfect power combination of source array, etc. We have also considered the engineering applications of microwave metamaterials, such as a broadband and low-loss 3D transformation-optics lens for wide-angle scanning, a 3D planar gradient-index lens for high-gain radiations, and a random metasurface for reducing radar cross sections. In the area of plasmonic metamaterials, we proposed an ultrathin, narrow, and flexible corrugated metallic strip to guide SPPs with a small bending loss and radiation loss, from which we designed and realized a series of SPP passive devices (e.g. power divider, coupler, filter, and resonator) and active devices (e.g. amplifier and duplexer). We also showed a significant feature of the ultrathin SPP waveguide in overcoming the challenge of signal integrity in traditional integrated circuits, which will help build a high-performance SPP wireless communication system. In the area of coding and programmable metamaterials, we proposed a new measure to describe a metamaterial from the viewpoint of information theory. We have illustrated theoretically and experimentally that coding metamaterials composed of digital units can be controlled by coding sequences, leading to different functions. We realized that when the digital state of a coding unit is controlled by a field programmable gate array, the programmable metamaterial, which is capable of manipulating electromagnetic waves in real time, can generate many different functions.

  14. A novel concatenated code based on the improved SCG-LDPC code for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xie, Ya; Wang, Lin; Huang, Sheng; Wang, Yong

    2013-01-01

    Based on the optimization and improvement of the construction method for the systematically constructed Gallager (SCG) (4, k) code, a novel SCG low-density parity-check (SCG-LDPC) (3969, 3720) code suitable for optical transmission systems is constructed. The novel SCG-LDPC(6561,6240) code with a code rate of 95.1% is constructed by increasing the length of the SCG-LDPC(3969,3720) code, so that the code rate of the LDPC code can better meet the high requirements of optical transmission systems. The novel concatenated code is then constructed by concatenating the SCG-LDPC(6561,6240) code with the BCH(127,120) code, whose code rate is 94.5%. The simulation results and analyses show that the net coding gain (NCG) of the BCH(127,120)+SCG-LDPC(6561,6240) concatenated code is 2.28 dB and 0.48 dB higher than those of the classic RS(255,239) code and the SCG-LDPC(6561,6240) code, respectively, at a bit error rate (BER) of 10^-7.
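
    The quoted rates follow from simple ratios (reading the 94.5% figure as the rate of the BCH component); a quick check, including the overall rate of the serial concatenation, which the abstract does not state:

        from fractions import Fraction

        ldpc_rate = Fraction(6240, 6561)          # SCG-LDPC(6561, 6240)
        bch_rate = Fraction(120, 127)             # BCH(127, 120)
        concatenated_rate = ldpc_rate * bch_rate  # overall rate of the serial concatenation

        print(float(ldpc_rate))          # ~0.951  (95.1%)
        print(float(bch_rate))           # ~0.945  (94.5%)
        print(float(concatenated_rate))  # ~0.899  overall information rate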

  15. Trace-shortened Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Solomon, G.

    1994-01-01

    Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of, at most, 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.

  16. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
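
    The decoder interaction described above is essentially a control loop; a minimal sketch follows, in which the inner and outer decoder calls are hypothetical stand-ins rather than the authors' algorithms.

        def interactive_concatenated_decode(rx, turbo_iteration, rs_decode, max_iters=8):
            """Control flow of an interactive concatenated decoder.
            turbo_iteration(rx, state) runs one inner turbo iteration and returns
            (soft_output, state); rs_decode(soft_output) attempts reliability-based
            outer decoding and returns (ok, codeword). Both are hypothetical callables."""
            state = None
            for i in range(1, max_iters + 1):
                soft_output, state = turbo_iteration(rx, state)  # one more inner iteration
                ok, codeword = rs_decode(soft_output)            # outer soft-decision attempt
                if ok:                                           # outer success stops the inner decoder
                    return codeword, i
            return None, max_iters                               # declare failure after the cap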

  17. TEMPEST Simulations of Collisionless Damping of Geodesic-Acoustic Mode in Edge Plasma Pedestal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X Q; Xiong, Z; Nevins, W M

    The fully nonlinear (full-f) 4D TEMPEST gyrokinetic continuum code produces the frequency and collisionless damping of the GAM and zonal flow with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε-scan and the tokamak safety factor q-scan in homogeneous plasmas. The TEMPEST simulation shows that the GAM exists in the edge plasma pedestal for steep density and temperature gradients, and that an initial GAM relaxes to the standard neoclassical residual, rather than the Rosenbluth-Hinton residual, due to the presence of ion-ion collisions. The enhanced GAM damping explains experimental BES measurements of the edge q scaling of the GAM amplitude.

  18. TEMPEST Simulations of Collisionless Damping of Geodesic-Acoustic Mode in Edge Plasma Pedestal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X; Xiong, Z; Nevins, W

    The fully nonlinear 4D TEMPEST gyrokinetic continuum code produces the frequency and collisionless damping of the geodesic-acoustic mode (GAM) and zonal flow with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε-scan and the tokamak safety factor q-scan in homogeneous plasmas. The TEMPEST simulation shows that the GAM exists in the edge plasma pedestal for steep density and temperature gradients, and that an initial GAM relaxes to the standard neoclassical residual, rather than the Rosenbluth-Hinton residual, due to the presence of ion-ion collisions. The enhanced GAM damping explains experimental BES measurements of the edge q scaling of the GAM amplitude.

  19. Infrared zone-scanning system.

    PubMed

    Belousov, Aleksandr; Popov, Gennady

    2006-03-20

    Challenges encountered in designing an infrared viewing optical system that uses a small linear detector array based on a zone-scanning approach are discussed. Scanning is performed by a rotating refractive polygon prism with tilted facets, which, along with high-speed line scanning, makes the scanning gear as simple as possible. A method of calculation of a practical optical system to compensate for aberrations during prism rotation is described.

  20. Gene finding in metatranscriptomic sequences.

    PubMed

    Ismail, Wazim Mohammed; Ye, Yuzhen; Tang, Haixu

    2014-01-01

    Metatranscriptomic sequencing is a highly sensitive bioassay of functional activity in a microbial community, providing complementary information to the metagenomic sequencing of the community. The acquisition of the metatranscriptomic sequences will enable us to refine the annotations of the metagenomes, and to study the gene activities and their regulation in complex microbial communities and their dynamics. In this paper, we present TransGeneScan, a software tool for finding genes in assembled transcripts from metatranscriptomic sequences. By incorporating several features of metatranscriptomic sequencing, including strand-specificity, short intergenic regions, and putative antisense transcripts into a Hidden Markov Model, TransGeneScan can predict a sense transcript containing one or multiple genes (in an operon) or an antisense transcript. We tested TransGeneScan on a mock metatranscriptomic data set containing three known bacterial genomes. The results showed that TransGeneScan performs better than metagenomic gene finders (MetaGeneMark and FragGeneScan) on predicting protein coding genes in assembled transcripts, and achieves comparable or even higher accuracy than gene finders for microbial genomes (Glimmer and GeneMark). These results imply that, with the assistance of metatranscriptomic sequencing, we can obtain a broad and precise picture of the genes (and their functions) in a microbial community. TransGeneScan is available as open-source software on SourceForge at https://sourceforge.net/projects/transgenescan/.

  1. Analysis of random point images with the use of symbolic computation codes and generalized Catalan numbers

    NASA Astrophysics Data System (ADS)

    Reznik, A. L.; Tuzikov, A. V.; Solov'ev, A. A.; Torgov, A. V.

    2016-11-01

    Original codes and combinatorial-geometrical computational schemes are presented, which are developed and applied for finding exact analytical formulas that describe the probability of errorless readout of random point images recorded by a scanning aperture with a limited number of threshold levels. Combinatorial problems encountered in the course of the study and associated with the new generalization of Catalan numbers are formulated and solved. An attempt is made to find the explicit analytical form of these numbers, which is, on the one hand, a necessary stage of solving the basic research problem and, on the other hand, an independent self-consistent problem.
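
    The generalized Catalan numbers introduced in the paper are not given in closed form in the abstract; for orientation, the ordinary Catalan numbers they generalize are C_n = binomial(2n, n) / (n + 1) and can be computed directly:

        from math import comb

        def catalan(n):
            """Ordinary Catalan number C_n = binomial(2n, n) / (n + 1)."""
            return comb(2 * n, n) // (n + 1)

        print([catalan(n) for n in range(8)])  # [1, 1, 2, 5, 14, 42, 132, 429]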

  2. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
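
    As a rough illustration of the bookkeeping behind such a concatenated system, the sketch below computes the rate and symbol-error-correcting capability of the RS(255,223) outer code and the overall rate once an inner code is added; the inner-code rate used here is a placeholder assumption, not a value from the study.

      # Rate and error-correcting capability of a concatenated coding system
      # with an RS(255,223) outer code; the inner modulation block code rate
      # below is a placeholder assumption.
      RS_N, RS_K = 255, 223
      outer_rate = RS_K / RS_N              # ~0.8745
      t_symbols = (RS_N - RS_K) // 2        # corrects up to t = 16 symbol errors

      inner_rate = 2 / 3                    # hypothetical inner code rate
      overall_rate = outer_rate * inner_rate

      print(f"outer rate = {outer_rate:.4f}, t = {t_symbols} symbols, "
            f"overall rate = {overall_rate:.4f}")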

  3. Design and implementation of a 3D-MR/CT geometric image distortion phantom/analysis system for stereotactic radiosurgery.

    PubMed

    Damyanovich, A Z; Rieker, M; Zhang, B; Bissonnette, J-P; Jaffray, D A

    2018-03-27

    The design, construction and application of a multimodality, 3D magnetic resonance/computed tomography (MR/CT) image distortion phantom and analysis system for stereotactic radiosurgery (SRS) is presented. The phantom is characterized by (1) a 1 × 1 × 1 (cm)3 MRI/CT-visible 3D-Cartesian grid; (2) 2002 grid vertices that are 3D-intersections of MR-/CT-visible 'lines' in all three orthogonal planes; (3) a 3D-grid that is MR-signal positive/CT-signal negative; (4) a vertex distribution sufficiently 'dense' to characterize geometrical parameters properly, and (5) a grid/vertex resolution consistent with SRS localization accuracy. When positioned correctly, successive 3D-vertex planes along any orthogonal axis of the phantom appear as 1 × 1 (cm)2-2D grids, whereas between vertex planes, images are defined by 1 × 1 (cm)2-2D arrays of signal points. Image distortion is evaluated using a centroid algorithm that automatically identifies the center of each 3D-intersection and then calculates the deviations dx, dy, dz and dr for each vertex point; the results are presented as a color-coded 2D or 3D distribution of deviations. The phantom components and 3D-grid are machined to sub-millimeter accuracy, making the device uniquely suited to SRS applications; as such, we present it here in a form adapted for use with a Leksell stereotactic frame. Imaging reproducibility was assessed via repeated phantom imaging across ten back-to-back scans; 80%-90% of the differences in vertex deviations dx, dy, dz and dr between successive 3 T MRI scans were found to be ⩽0.05 mm for both axial and coronal acquisitions, and over >95% of the differences were observed to be ⩽0.05 mm for repeated CT scans, clearly demonstrating excellent reproducibility. Applications of the 3D-phantom/analysis system are presented, using a 32-month time-course assessment of image distortion/gradient stability and statistical control chart for 1.5 T and 3 T GE TwinSpeed MRI systems.

  4. Design and implementation of a 3D-MR/CT geometric image distortion phantom/analysis system for stereotactic radiosurgery

    NASA Astrophysics Data System (ADS)

    Damyanovich, A. Z.; Rieker, M.; Zhang, B.; Bissonnette, J.-P.; Jaffray, D. A.

    2018-04-01

    The design, construction and application of a multimodality, 3D magnetic resonance/computed tomography (MR/CT) image distortion phantom and analysis system for stereotactic radiosurgery (SRS) is presented. The phantom is characterized by (1) a 1 × 1 × 1 (cm)3 MRI/CT-visible 3D-Cartesian grid; (2) 2002 grid vertices that are 3D-intersections of MR-/CT-visible ‘lines’ in all three orthogonal planes; (3) a 3D-grid that is MR-signal positive/CT-signal negative; (4) a vertex distribution sufficiently ‘dense’ to characterize geometrical parameters properly, and (5) a grid/vertex resolution consistent with SRS localization accuracy. When positioned correctly, successive 3D-vertex planes along any orthogonal axis of the phantom appear as 1 × 1 (cm)2-2D grids, whereas between vertex planes, images are defined by 1 × 1 (cm)2-2D arrays of signal points. Image distortion is evaluated using a centroid algorithm that automatically identifies the center of each 3D-intersection and then calculates the deviations dx, dy, dz and dr for each vertex point; the results are presented as a color-coded 2D or 3D distribution of deviations. The phantom components and 3D-grid are machined to sub-millimeter accuracy, making the device uniquely suited to SRS applications; as such, we present it here in a form adapted for use with a Leksell stereotactic frame. Imaging reproducibility was assessed via repeated phantom imaging across ten back-to-back scans; 80%–90% of the differences in vertex deviations dx, dy, dz and dr between successive 3 T MRI scans were found to be  ⩽0.05 mm for both axial and coronal acquisitions, and over  >95% of the differences were observed to be  ⩽0.05 mm for repeated CT scans, clearly demonstrating excellent reproducibility. Applications of the 3D-phantom/analysis system are presented, using a 32-month time-course assessment of image distortion/gradient stability and statistical control chart for 1.5 T and 3 T GE TwinSpeed MRI systems.
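
    The per-vertex deviation analysis described above reduces to simple array arithmetic once the measured 3D-intersection centroids have been matched to the nominal 1 cm grid. The Python sketch below assumes that matching has already been done; the arrays, units and error magnitudes are illustrative only and do not come from the paper.

      # Per-vertex deviations dx, dy, dz and dr between measured centroids and
      # nominal grid vertices (coordinates in mm); the data are synthetic.
      import numpy as np

      def vertex_deviations(measured_mm, nominal_mm):
          d = measured_mm - nominal_mm          # (N, 3) array of (dx, dy, dz)
          dr = np.linalg.norm(d, axis=1)        # total deviation per vertex
          return d, dr

      nominal = np.array([[x, y, z] for x in range(0, 30, 10)
                                    for y in range(0, 30, 10)
                                    for z in range(0, 30, 10)], dtype=float)
      measured = nominal + np.random.normal(scale=0.05, size=nominal.shape)
      d, dr = vertex_deviations(measured, nominal)
      print("max dr =", round(dr.max(), 3), "mm")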

  5. A trichrome beam model for biological dose calculation in scanned carbon-ion radiotherapy treatment planning.

    PubMed

    Inaniwa, T; Kanematsu, N

    2015-01-07

    In scanned carbon-ion (C-ion) radiotherapy, some primary C-ions undergo nuclear reactions before reaching the target and the resulting particles deliver doses to regions at a significant distance from the central axis of the beam. The effects of these particles on physical dose distribution are accounted for in treatment planning by representing the transverse profile of the scanned C-ion beam as the superposition of three Gaussian distributions. In the calculation of biological dose distribution, however, the radiation quality of the scanned C-ion beam has been assumed to be uniform over its cross-section, taking the average value over the plane at a given depth (monochrome model). Since these particles, which have relatively low radiation quality, spread widely compared to the primary C-ions, the radiation quality of the beam should vary with radial distance from the central beam axis. To represent its transverse distribution, we propose a trichrome beam model in which primary C-ions, heavy fragments with atomic number Z ≥ 3, and light fragments with Z ≤ 2 are assigned to the first, second, and third Gaussian components, respectively. Assuming a realistic beam-delivery system, we performed computer simulations using Geant4 Monte Carlo code for analytical beam modeling of the monochrome and trichrome models. The analytical beam models were integrated into a treatment planning system for scanned C-ion radiotherapy. A target volume of 20 × 20 × 40 mm3 was defined within a water phantom. A uniform biological dose of 2.65 Gy (RBE) was planned for the target with the two beam models based on the microdosimetric kinetic model (MKM). The plans were recalculated with Geant4, and the recalculated biological dose distributions were compared with the planned distributions. The mean target dose of the recalculated distribution with the monochrome model was 2.72 Gy (RBE), while the dose with the trichrome model was 2.64 Gy (RBE). The monochrome model underestimated the RBE within the target due to the assumption of no radial variations in radiation quality. Conversely, the trichrome model accurately predicted the RBE even in a small target. Our results verify the applicability of the trichrome model for clinical use in C-ion radiotherapy treatment planning.

  6. A trichrome beam model for biological dose calculation in scanned carbon-ion radiotherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Inaniwa, T.; Kanematsu, N.

    2015-01-01

    In scanned carbon-ion (C-ion) radiotherapy, some primary C-ions undergo nuclear reactions before reaching the target and the resulting particles deliver doses to regions at a significant distance from the central axis of the beam. The effects of these particles on physical dose distribution are accounted for in treatment planning by representing the transverse profile of the scanned C-ion beam as the superposition of three Gaussian distributions. In the calculation of biological dose distribution, however, the radiation quality of the scanned C-ion beam has been assumed to be uniform over its cross-section, taking the average value over the plane at a given depth (monochrome model). Since these particles, which have relatively low radiation quality, spread widely compared to the primary C-ions, the radiation quality of the beam should vary with radial distance from the central beam axis. To represent its transverse distribution, we propose a trichrome beam model in which primary C-ions, heavy fragments with atomic number Z ≥ 3, and light fragments with Z ≤ 2 are assigned to the first, second, and third Gaussian components, respectively. Assuming a realistic beam-delivery system, we performed computer simulations using Geant4 Monte Carlo code for analytical beam modeling of the monochrome and trichrome models. The analytical beam models were integrated into a treatment planning system for scanned C-ion radiotherapy. A target volume of 20  ×  20  ×  40 mm3 was defined within a water phantom. A uniform biological dose of 2.65 Gy (RBE) was planned for the target with the two beam models based on the microdosimetric kinetic model (MKM). The plans were recalculated with Geant4, and the recalculated biological dose distributions were compared with the planned distributions. The mean target dose of the recalculated distribution with the monochrome model was 2.72 Gy (RBE), while the dose with the trichrome model was 2.64 Gy (RBE). The monochrome model underestimated the RBE within the target due to the assumption of no radial variations in radiation quality. Conversely, the trichrome model accurately predicted the RBE even in a small target. Our results verify the applicability of the trichrome model for clinical use in C-ion radiotherapy treatment planning.
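
    The trichrome construction itself is just a weighted sum of three radially symmetric Gaussians, one per particle class. The Python sketch below illustrates that construction; the weights and widths are placeholder values, not the fitted beam-model parameters used in the paper.

      # Transverse fluence profile as a superposition of three Gaussian
      # components: primary C-ions, heavy fragments (Z >= 3) and light
      # fragments (Z <= 2).  Weights and sigmas are illustrative only.
      import numpy as np

      def gaussian2d(r, sigma):
          return np.exp(-r**2 / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)

      components = {                       # name: (weight, sigma in mm)
          "primary C-ions":        (0.90, 3.0),
          "heavy fragments Z>=3":  (0.07, 8.0),
          "light fragments Z<=2":  (0.03, 20.0),
      }

      r = np.linspace(0.0, 50.0, 501)      # radial distance from beam axis, mm
      profile = sum(w * gaussian2d(r, s) for w, s in components.values())
      print("fluence at r = 0 mm:", profile[0], " at r = 20 mm:", profile[200])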

  7. Micro-optical design of a three-dimensional microlens scanner for vertically integrated micro-opto-electro-mechanical systems.

    PubMed

    Baranski, Maciej; Bargiel, Sylwester; Passilly, Nicolas; Gorecki, Christophe; Jia, Chenping; Frömel, Jörg; Wiemer, Maik

    2015-08-01

    This paper presents the optical design of a miniature 3D scanning system, which is fully compatible with the vertical integration technology of micro-opto-electro-mechanical systems (MOEMS). The constraints related to this integration strategy are considered, resulting in a simple three-element micro-optical setup based on an afocal scanning microlens doublet and a focusing microlens, which is tolerant to axial position inaccuracy. The 3D scanning is achieved by axial and lateral displacement of microlenses of the scanning doublet, realized by micro-electro-mechanical systems microactuators (the transmission scanning approach). Optical scanning performance of the system is determined analytically by use of the extended ray transfer matrix method, leading to two different optical configurations, relying either on a ball lens or plano-convex microlenses. The presented system is intended to be a core component of miniature MOEMS-based optical devices that require a 3D optical scanning function, e.g., miniature imaging systems (confocal or optical coherence microscopes) or optical tweezers.

  8. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The complete suite of computer codes for Rev. 1 of the Systems Assessment Capability performs many functions.

  9. Latent palmprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2009-06-01

    The evidential value of palmprints in forensic applications is clear as about 30 percent of the latents recovered from crime scenes are from palms. While biometric systems for palmprint-based personal authentication in access control type of applications have been developed, they mostly deal with low-resolution (about 100 ppi) palmprints and only perform full-to-full palmprint matching. We propose a latent-to-full palmprint matching system that is needed in forensic applications. Our system deals with palmprints captured at 500 ppi (the current standard in forensic applications) or higher resolution and uses minutiae as features to be compatible with the methodology used by latent experts. Latent palmprint matching is a challenging problem because latent prints lifted at crime scenes are of poor image quality, cover only a small area of the palm, and have a complex background. Other difficulties include a large number of minutiae in full prints (about 10 times as many as fingerprints), and the presence of many creases in latents and full prints. A robust algorithm to reliably estimate the local ridge direction and frequency in palmprints is developed. This facilitates the extraction of ridge and minutiae features even in poor quality palmprints. A fixed-length minutia descriptor, MinutiaCode, is utilized to capture distinctive information around each minutia and an alignment-based minutiae matching algorithm is used to match two palmprints. Two sets of partial palmprints (150 live-scan partial palmprints and 100 latent palmprints) are matched to a background database of 10,200 full palmprints to test the proposed system. Despite the inherent difficulty of latent-to-full palmprint matching, rank-1 recognition rates of 78.7 and 69 percent, respectively, were achieved in searching live-scan partial palmprints and latent palmprints against the background database.
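
    The final pairing step of an alignment-based minutiae matcher can be sketched very simply: once the latent has been aligned to the full print, count minutiae that agree in position and direction within tolerances. The Python sketch below shows only that step; it ignores the MinutiaCode descriptor and the alignment search itself, and the thresholds and data are made up.

      # Count minutiae correspondences between an aligned latent and a full
      # print; minutiae are (x, y, theta) tuples in a common coordinate frame.
      import math

      def count_matches(latent, full, d_tol=15.0, a_tol=math.radians(20)):
          matched, used = 0, set()
          for (x1, y1, t1) in latent:
              for j, (x2, y2, t2) in enumerate(full):
                  if j in used:
                      continue
                  d = math.hypot(x1 - x2, y1 - y2)
                  da = abs((t1 - t2 + math.pi) % (2 * math.pi) - math.pi)
                  if d <= d_tol and da <= a_tol:
                      matched += 1
                      used.add(j)
                      break
          return matched

      latent = [(100, 120, 0.30), (150, 90, 1.20)]
      full = [(102, 118, 0.25), (400, 300, 2.00), (151, 92, 1.25)]
      print(count_matches(latent, full))   # -> 2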

  10. Virtual Instrument Simulator for CERES

    NASA Technical Reports Server (NTRS)

    Chapman, John J.

    1997-01-01

    A benchtop virtual instrument simulator for CERES (Clouds and the Earth's Radiant Energy System) has been built at NASA Langley Research Center in Hampton, VA. The CERES instruments will fly on several earth-orbiting platforms, notably NASDA's Tropical Rainfall Measuring Mission (TRMM) and NASA's Earth Observing System (EOS) satellites. CERES measures top-of-the-atmosphere radiative fluxes using microprocessor-controlled scanning radiometers. The CERES Virtual Instrument Simulator consists of electronic circuitry identical to the flight unit's twin microprocessors and telemetry interface to the supporting spacecraft electronics, and two personal computers (PC) connected to the I/O ports that control the azimuth and elevation gimbals. Software consists of the unmodified TRW-developed Flight Code and Ground Support Software, which serves as the instrument monitor, and NASA/TRW-developed engineering models of the scanners. The CERES Instrument Simulator will serve as a testbed for testing custom instrument commands intended to solve in-flight anomalies of the instruments which could arise during the CERES mission. One of the supporting computers supports the telemetry display, which monitors the simulator microprocessors during the development and testing of custom instrument commands. The CERES engineering development software models have been modified to provide a virtual instrument running on a second supporting computer linked in real time to the instrument flight microprocessor control ports. The CERES Instrument Simulator will be used to verify memory uploads by the CERES Flight Operations Team at NASA. Plots of the virtual scanner models match the actual instrument scan plots. A high-speed logic analyzer has been used to track the performance of the flight microprocessor. The concept of using an identical but non-flight-qualified microprocessor and electronics ensemble linked to a virtual instrument with identical system software affords a relatively inexpensive simulation system capable of high fidelity.

  11. OCTGRAV: Sparse Octree Gravitational N-body Code on Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Gaburov, Evghenii; Bédorf, Jeroen; Portegies Zwart, Simon

    2010-10-01

    Octgrav is a very fast tree-code which runs on massively parallel Graphical Processing Units (GPU) with NVIDIA CUDA architecture. The algorithms are based on parallel-scan and sort methods. The tree construction and calculation of multipole moments are carried out on the host CPU, while the force calculation, which consists of tree walks and evaluation of the interaction list, is carried out on the GPU. In this way, a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s are achieved. It takes about a second to compute forces on a million particles with an opening angle of θ ≈ 0.5. To test the performance and feasibility, we implemented the algorithms in CUDA in the form of a gravitational tree-code which runs completely on the GPU. The tree construction and traverse algorithms are portable to many-core devices which support the CUDA or OpenCL programming languages. The gravitational tree-code outperforms tuned CPU code during the tree construction and shows a performance improvement of more than a factor of 20 overall, resulting in a processing rate of more than 2.8 million particles per second. The code has a convenient user interface and is freely available for use.

  12. Provider Distribution Changes in Dual-Energy X-Ray Absorptiometry in the Medicare Population Over the Past Decade.

    PubMed

    Intenzo, Charles M; Parker, Laurence; Levin, David C; Kim, Sung M; Rao, Vijay M

    2016-01-01

    Both radiologists and nonimaging physicians perform dual-energy X-ray absorptiometry (DXA) imaging in the United States. This study aims to compare provider distribution between these physician groups in the Medicare population, which is the predominant age group of patients evaluated with this imaging procedure. Using the 2 relevant Current Procedural Terminology, Fourth Edition codes for DXA scans, source data were obtained from the CMS Physician Supplier Procedure Summary Master Files from 2003 through 2013. DXA scan procedure volumes for radiologists and nonradiologists on Medicare patients were tabulated, and utilization rates were calculated. From 2003 to 2013, the total number of DXA scans performed on Medicare patients decreased by 2%. Over the same period, however, the number of scans performed by radiologists increased by 25%, while utilization by nonimaging specialists declined by approximately the same amount. From 2003 to 2013, the rate of utilization of DXA scans in the Medicare fee-for-service population declined somewhat. However, radiologists continue to gain market share from other specialists and now predominate in this type of imaging by a substantial margin.

  13. Comparative Geometrical Accuracy Investigations of Hand-Held 3d Scanning Systems - AN Update

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Lindstaedt, M.; Starosta, D.

    2018-05-01

    Hand-held 3D scanning systems are increasingly available on the market from several system manufacturers. These systems are deployed for the 3D recording of objects of different sizes in diverse applications, such as industrial reverse engineering and the documentation of museum exhibits. Typical measurement distances range from 0.5 m to 4.5 m. Although they are often easy to use, the geometric performance of these systems, especially their precision and accuracy, is not well known to many users. First geometrical investigations of a variety of hand-held 3D scanning systems were already carried out by the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg (HCU Hamburg) in cooperation with two other universities in 2016. To obtain more information about the accuracy behaviour of the latest generation of hand-held 3D scanning systems, HCU Hamburg conducted further comparative geometrical investigations using structured light systems with speckle pattern (Artec Spider, Mantis Vision PocketScan 3D, Mantis Vision F5-SR, Mantis Vision F5-B, and Mantis Vision F6) and photogrammetric systems (Creaform HandySCAN 700 and Shining FreeScan X7). In the framework of these comparative investigations, geometrically stable reference bodies were used. The corresponding reference data were acquired by measurements with two structured light projection systems (AICON smartSCAN and GOM ATOS I 2M). The comprehensive test results of the different test scenarios are presented and critically discussed in this contribution.

  14. Transient dynamics capability at Sandia National Laboratories

    NASA Technical Reports Server (NTRS)

    Attaway, Steven W.; Biffle, Johnny H.; Sjaardema, G. D.; Heinstein, M. W.; Schoof, L. A.

    1993-01-01

    A brief overview of the transient dynamics capabilities at Sandia National Laboratories, with an emphasis on recent developments and current research, is presented. In addition, the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS), which is a collection of structural and thermal codes and utilities used by analysts at SNL, is described. The SEACAS system includes pre- and post-processing codes, analysis codes, database translation codes, support libraries, Unix shell scripts for execution, and an installation system. SEACAS is used at SNL on a daily basis as a production, research, and development system for engineering analysts and code developers. Over the past year, approximately 190 days of CPU time were used by SEACAS codes on jobs running from a few seconds up to two and one-half days of CPU time. SEACAS runs on several different systems at SNL, including Cray UNICOS, Hewlett-Packard HP-UX, Digital Equipment Ultrix, and Sun SunOS. An overview of SEACAS, including a short description of the codes in the system, is presented. Abstracts and references for the codes are listed at the end of the report.

  15. The Effectiveness of Using Procedural Scaffoldings in a Paper-Plus-Smartphone Collaborative Learning Context

    ERIC Educational Resources Information Center

    Huang, Hui-Wen; Wu, Chih-Wei; Chen, Nian-Shing

    2012-01-01

    The purpose of this study was to evaluate the effectiveness of using procedural scaffoldings in fostering students' group discourse levels and learning outcomes in a paper-plus-smartphone collaborative learning context. All participants used built-in camera smartphones to learn new knowledge by scanning Quick Response (QR) codes, a type of…

  16. Preservation and Access to Manuscript Collections of the Czech National Library.

    ERIC Educational Resources Information Center

    Karen, Vladimir; Psohlavec, Stanislav

    In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. HTML coded description is added to the entire…

  17. Organ and effective doses in newborn patients during helical multislice computed tomography examination

    NASA Astrophysics Data System (ADS)

    Staton, Robert J.; Lee, Choonik; Lee, Choonsik; Williams, Matt D.; Hintenlang, David E.; Arreola, Manuel M.; Williams, Jonathon L.; Bolch, Wesley E.

    2006-10-01

    In this study, two computational phantoms of the newborn patient were used to assess individual organ doses and effective doses delivered during head, chest, abdomen, pelvis, and torso examinations using the Siemens SOMATOM Sensation 16 helical multi-slice computed tomography (MSCT) scanner. The stylized phantom used to model the patient anatomy was the revised ORNL newborn phantom by Han et al (2006 Health Phys.90 337). The tomographic phantom used in the study was that developed by Nipper et al (2002 Phys. Med. Biol. 47 3143) as recently revised by Staton et al (2006 Med. Phys. 33 3283). The stylized model was implemented within the MCNP5 radiation transport code, while the tomographic phantom was incorporated within the EGSnrc code. In both codes, the x-ray source was modelled as a fan beam originating from the focal spot at a fan angle of 52° and a focal-spot-to-axis distance of 57 cm. The helical path of the source was explicitly modelled based on variations in collimator setting (12 mm or 24 mm), detector pitch and scan length. Tube potentials of 80, 100 and 120 kVp were considered in this study. Beam profile data were acquired using radiological film measurements on a 16 cm PMMA phantom, which yielded effective beam widths of 14.7 mm and 26.8 mm for collimator settings of 12 mm and 24 mm, respectively. Values of absolute organ absorbed dose were determined via the use of normalization factors defined as the ratio of the CTDI100 measured in-phantom and that determined by Monte Carlo simulation of the PMMA phantom and ion chamber. Across various technique factors, effective dose differences between the stylized and tomographic phantoms ranged from +2% to +9% for head exams, -4% to -2% for chest exams, +8% to +24% for abdominal exams, -16% to -12% for pelvic exams and -7% to 0% for chest-abdomen-pelvis (CAP) exams. In many cases, however, relatively close agreement in effective dose was accomplished at the expense of compensating errors in individual organ dose. Per cent differences in organ dose between the stylized and tomographic phantoms at 120 kVp and 12 mm collimator setting ranged from -25% (skin) to +164% (muscle) for head exams, -92% (thyroid) to +98% (ovaries) for chest exams, -144% (uterus) to +112% (ovaries) for abdominal exams, -98% (SI wall) to +20% (thymus) for pelvic exams and -60% (extrathoracic airways) to +13% (ovaries) for CAP exams. Better agreement was seen between the two phantom types for organs entirely within the scan field. In these cases, corresponding per cent differences in organ absorbed dose did not vary more than 17%. For all scans, the effective dose was found to range approximately 1-13 mSv across the scan parameters and scan regions. The largest effective dose occurred for CAP scans at 120 kVp.
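
    The dose bookkeeping described above (scaling Monte Carlo organ doses by a measured-to-simulated CTDI100 ratio and forming a tissue-weighted sum) can be sketched in a few lines of Python. All numbers and tissue weights below are placeholders for illustration, not values from the study, and the effective-dose sum runs only over the organs listed.

      # Absolute organ doses from Monte Carlo results via a CTDI100-based
      # normalization factor, plus a (partial) tissue-weighted effective dose.
      ctdi100_measured = 10.0     # mGy, hypothetical in-phantom measurement
      ctdi100_mc = 9.2            # mGy, hypothetical Monte Carlo value
      norm = ctdi100_measured / ctdi100_mc

      mc_organ_dose = {"thyroid": 0.80, "lungs": 1.10, "skin": 0.60}   # mGy
      tissue_weight = {"thyroid": 0.04, "lungs": 0.12, "skin": 0.01}   # illustrative

      organ_dose = {o: d * norm for o, d in mc_organ_dose.items()}
      effective_partial = sum(tissue_weight[o] * organ_dose[o] for o in organ_dose)
      print(organ_dose, f"partial effective dose = {effective_partial:.3f} mSv")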

  18. Giving Students Control over Their Learning; from Self-guided Museum Visits and Field Trips to Using Scanning Technology to Link Content to Earth Samples

    NASA Astrophysics Data System (ADS)

    Kirkby, K. C.; Phipps, M.

    2011-12-01

    While it may seem counterintuitive, sometimes stepping back is one of the more effective pedagogical approaches instructors can make. On museum visits, an instructor's presence fundamentally alters students' experiences and can curtail student learning by limiting questions or discouraging students from exploring their own interests. Students often rely on the instructor and become passive observers, rather than engaged learners. As an alternative to instructor-led visits, self-guided student explorations of museum exhibits proved to be both popular and pedagogically effective. On pre-instruction and post-instruction surveys, these ungraded, self-guided explorations match or exceed the efficacy of traditional graded lab instruction and completely eclipse gains normally achieved by traditional lecture instruction. In addition, these explorations achieve the remarkable goal of integrating undergraduate earth science instruction into students' social lives. Based on the success of the self-guided museum explorations, this fall saw the debut of an attempt to expand this concept to field experiences. A self-guided student field exploration of Saint Anthony Falls focuses on the intersections of geological processes with human history. Students explore the waterfalls' evolution, its early interpretation by 18th and 19th century Dakota and Euro-America societies, and its subsequent social and economic impacts on Upper Midwest societies. Self-guided explorations allow students to explore field settings on their own or with friends and family in a more relaxed manner. At the same time, these explorations give students control over, and responsibility for, their own learning - a powerful pedagogical approach. Student control over their learning is also the goal of an initiative to use scanning technologies, such as linear bar codes, 2D barcodes and radio-frequency identification (RFID), to revolutionize sample identification and study. Scanning technology allows students to practice pattern recognition of earth materials even before they begin to check their properties. As importantly, scanning systems allow students to select a physical earth material sample and link that sample with web page content about its origin, geologic setting, economic uses, or its social and historical relevance. With scanning systems, students are not dependent on instructors for clarification or confirmation, so they can explore earth materials at their own pace and in ways that fit their individual learning style. Despite a greatly reduced emphasis on sample identification in laboratory activities, students who integrated scanning technology and web content with earth material samples did better on unannounced end-of-term identification quizzes than students taught traditional identification methods. Integrating scanning technologies into earth material study represents the first transformative change in how geoscientists have taught introductory sample identification since the 1800's.

  19. A low-noise wide-dynamic-range event-driven detector using SOI pixel technology for high-energy particle imaging

    NASA Astrophysics Data System (ADS)

    Shrestha, Sumeet; Kamehama, Hiroki; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Takeda, Ayaki; Tsuru, Takeshi Go; Arai, Yasuo

    2015-08-01

    This paper presents a low-noise wide-dynamic-range pixel design for a high-energy particle detector in astronomical applications. A silicon-on-insulator (SOI) based detector is used for the detection of a wide energy range of high-energy particles (mainly X-rays). The sensor has a thin layer of SOI CMOS readout circuitry and a thick layer of high-resistivity detector vertically stacked in a single chip. The pixel circuits are divided into two parts: a signal sensing circuit and an event detection circuit. The event detection circuit, consisting of a comparator and logic circuits that detect the incidence of a high-energy particle, categorizes the incident photon into two energy groups using an appropriate energy threshold and generates a two-bit code for the event and its energy level. The code for the energy level is then used to select the gain of the in-pixel amplifier for the detected signal, providing a function of high-dynamic-range signal measurement. The two-bit code for the event and energy level is scanned in the event scanning block, and only the signals from the hit pixels are read out. The variable-gain in-pixel amplifier uses a continuous integrator and integration-time control for the variable gain. The proposed design allows small-signal detection and a wide dynamic range thanks to the adaptive gain technique and the capability of correlated double sampling (CDS) for canceling the kTC noise of the charge detector.
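
    The two-bit event/energy logic lends itself to a very small sketch: an event bit from an event threshold, an energy bit from the group-splitting threshold, and a gain selected from the energy bit. The thresholds and gain values in the Python sketch below are placeholder assumptions, not the detector's actual settings.

      # Per-pixel event detection and gain selection: high gain for the
      # low-energy group, low gain for the high-energy group (values made up).
      EVENT_THRESHOLD = 0.05      # arbitrary units
      ENERGY_THRESHOLD = 1.00     # boundary between the two energy groups
      GAIN = {0: 8.0, 1: 1.0}     # energy bit -> in-pixel amplifier gain

      def pixel_code(signal):
          """Return (event_bit, energy_bit, selected_gain) for one sample."""
          event_bit = 1 if signal > EVENT_THRESHOLD else 0
          energy_bit = 1 if signal > ENERGY_THRESHOLD else 0
          gain = GAIN[energy_bit] if event_bit else None
          return event_bit, energy_bit, gain

      for s in (0.01, 0.4, 3.2):
          print(s, pixel_code(s))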

  20. Reproducibility of peripapillary retinal nerve fiber layer thickness with spectral domain cirrus high-definition optical coherence tomography in normal eyes.

    PubMed

    Hong, Samin; Kim, Chan Yun; Lee, Won Seok; Seong, Gong Je

    2010-01-01

    To assess the reproducibility of the new spectral domain Cirrus high-definition optical coherence tomography (HD-OCT; Carl Zeiss Meditec, Dublin, CA, USA) for analysis of peripapillary retinal nerve fiber layer (RNFL) thickness in healthy eyes. Thirty healthy Korean volunteers were enrolled. Three optic disc cube 200 x 200 Cirrus HD-OCT scans were taken on the same day in discontinuous sessions by the same operator without using the repeat scan function. The reproducibility of the calculated RNFL thickness and probability code were determined by the intraclass correlation coefficient (ICC), coefficient of variation (CV), test-retest variability, and Fleiss' generalized kappa (kappa). Thirty-six eyes were analyzed. For average RNFL thickness, the ICC was 0.970, CV was 2.38%, and test-retest variability was 4.5 microm. For all quadrants except the nasal, ICCs were 0.972 or higher and CVs were 4.26% or less. Overall test-retest variability ranged from 5.8 to 8.1 microm. The kappa value of probability codes for average RNFL thickness was 0.690. The kappa values of quadrants and clock-hour sectors were lower in the nasal areas than in other areas. The reproducibility of Cirrus HD-OCT to analyze peripapillary RNFL thickness in healthy eyes was excellent compared with the previous reports for time domain Stratus OCT. For the calculated RNFL thickness and probability code, variability was relatively higher in the nasal area, and more careful analyses are needed.
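
    Two of the reproducibility metrics used above, the coefficient of variation and test-retest variability, are straightforward to compute from the three repeated measurements per eye. The Python sketch below does so on made-up data and omits the ICC and kappa calculations.

      # Within-eye CV and test-retest variability from repeated RNFL averages.
      import statistics

      scans = {                   # eye id -> three repeated average RNFL values (um)
          "eye01": [96.0, 97.5, 95.5],
          "eye02": [104.0, 103.0, 105.5],
      }

      for eye, values in scans.items():
          mean = statistics.mean(values)
          cv = 100.0 * statistics.stdev(values) / mean      # percent
          test_retest = max(values) - min(values)           # micrometres
          print(f"{eye}: mean={mean:.1f} um, CV={cv:.2f}%, "
                f"test-retest={test_retest:.1f} um")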

  1. The application of coded excitation technology in medical ultrasonic Doppler imaging

    NASA Astrophysics Data System (ADS)

    Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin

    2008-03-01

    Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. The application of coded excitation technology in a medical ultrasonic Doppler imaging system has the potential for a higher SNR and a deeper penetration depth than a conventional pulse-echo imaging system; it also improves image quality and enhances the sensitivity to weak signals. Furthermore, a properly chosen excitation code benefits the received spectrum of the Doppler signal. First, this paper analyzes the application of coded excitation technology in a medical ultrasonic Doppler imaging system, showing the advantages and prospects of the technique, and then introduces the principle and theory of coded excitation. Second, we compare several code sequences (including chirp and pseudo-chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and the sensitivity of the Doppler signal, we chose Barker codes as the excitation sequence. Finally, we designed the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantage of applying coded excitation technology in the Digital Medical Ultrasonic Doppler Endoscope Imaging System.
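
    The property that motivates the choice of Barker codes is easy to reproduce: the aperiodic autocorrelation of the length-13 Barker code has a mainlobe of 13 and sidelobes no larger than 1 in magnitude, i.e. a large mainlobe-to-sidelobe ratio after pulse compression. A minimal Python check:

      # Autocorrelation of the length-13 Barker code.
      import numpy as np

      barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
      acf = np.correlate(barker13, barker13, mode="full")
      peak_index = len(barker13) - 1                  # zero-lag term

      print("mainlobe:", acf[peak_index])                                 # 13
      print("max |sidelobe|:", np.abs(np.delete(acf, peak_index)).max())  # 1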

  2. Accuracy and time requirements of a bar-code inventory system for medical supplies.

    PubMed

    Hanson, L B; Weinswig, M H; De Muth, J E

    1988-02-01

    The effects of implementing a bar-code system for issuing medical supplies to nursing units at a university teaching hospital were evaluated. Data on the time required to issue medical supplies to three nursing units at a 480-bed, tertiary-care teaching hospital were collected (1) before the bar-code system was implemented (i.e., when the manual system was in use), (2) one month after implementation, and (3) four months after implementation. At the same times, the accuracy of the central supply perpetual inventory was monitored using 15 selected items. One-way analysis of variance tests were done to determine any significant differences between the bar-code and manual systems. Using the bar-code system took longer than using the manual system because of a significant difference in the time required for order entry into the computer. Multiple-use requirements of the central supply computer system made entering bar-code data a much slower process. There was, however, a significant improvement in the accuracy of the perpetual inventory. Using the bar-code system for issuing medical supplies to the nursing units takes longer than using the manual system. However, the accuracy of the perpetual inventory was significantly improved with the implementation of the bar-code system.

  3. A new generation scanning system for the high-speed analysis of nuclear emulsions

    NASA Astrophysics Data System (ADS)

    Alexandrov, A.; Buonaura, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; Di Crescenzo, A.; Galati, G.; Lauria, A.; Montesi, M. C.; Tioukov, V.; Vladymyrov, M.

    2016-06-01

    The development of automatic scanning systems was a fundamental issue for large-scale neutrino detectors exploiting nuclear emulsions as particle trackers. Such systems significantly speed up event analysis in emulsion, making experiments with unprecedented statistics feasible. In the early 1990s, R&D programs were carried out by Japanese and European laboratories, leading to increasingly efficient automatic scanning systems. Recent progress in the technology of digital signal processing and image acquisition allows the realization of new systems with higher performance. In this paper we describe the design and performance of a new-generation scanning system able to operate at the record speed of 84 cm2/hour, based on the Large Angle Scanning System for OPERA (LASSO) software infrastructure developed by the Naples scanning group. Such an improvement reduces the scanning time by a factor of 4 with respect to previously available systems, allowing the readout of huge amounts of nuclear emulsion in a reasonable time. This opens new perspectives for the employment of such detectors in a wider variety of applications.

  4. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding System Code...

  5. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding System Code...

  6. Channel coding for underwater acoustic single-carrier CDMA communication system

    NASA Astrophysics Data System (ADS)

    Liu, Lanjun; Zhang, Yonglei; Zhang, Pengcheng; Zhou, Lin; Niu, Jiong

    2017-01-01

    CDMA is an effective multiple-access protocol for underwater acoustic networks, and channel coding can effectively reduce the bit error rate (BER) of an underwater acoustic communication system. To meet the requirements of underwater acoustic mobile networks based on CDMA, an underwater acoustic single-carrier CDMA communication system (UWA/SCCDMA) based on the direct-sequence spread spectrum is proposed, and its channel coding scheme is studied using convolutional, RA, turbo and LDPC codes, respectively. The implementation steps of the Viterbi algorithm for convolutional coding, the BP and minimum-sum algorithms for RA coding, the Log-MAP and SOVA algorithms for turbo coding, and the sum-product algorithm for LDPC coding are given. A UWA/SCCDMA simulation system based on Matlab is designed. Simulation results show that the UWA/SCCDMA systems based on RA, turbo and LDPC coding perform well: the communication BER is less than 10^-6 in an underwater acoustic channel with a low signal-to-noise ratio (SNR) from -12 dB to -10 dB, which is about 2 orders of magnitude lower than that of convolutional coding. The system based on turbo coding with the Log-MAP algorithm has the best performance.
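
    As a point of reference for the convolutional baseline mentioned above, the Python sketch below implements a textbook rate-1/2 convolutional encoder with constraint length 3 and generator polynomials (7, 5) in octal; the actual code parameters used in the UWA/SCCDMA study are not given in the abstract, so this is illustrative only.

      # Rate-1/2, K=3 convolutional encoder with generators (7, 5) octal.
      def conv_encode(bits, g1=0b111, g2=0b101):
          state = 0                        # two previous input bits
          out = []
          for b in bits:
              reg = (b << 2) | state       # [current, prev, prev-prev]
              out.append(bin(reg & g1).count("1") % 2)   # parity for generator 1
              out.append(bin(reg & g2).count("1") % 2)   # parity for generator 2
              state = (reg >> 1) & 0b11
          return out

      print(conv_encode([1, 0, 1, 1]))     # -> [1, 1, 1, 0, 0, 0, 0, 1]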

  7. Prenatal magnetic resonance imaging: towards optimized patient information.

    PubMed

    Leithner, K; Pörnbacher, S; Assem-Hilger, E; Krampl-Bettelheim, E; Prayer, D

    2009-08-01

    To investigate the perception of fetal magnetic resonance imaging (MRI) by women confronted with the necessity of a targeted prenatal examination because of suspicion of an abnormality, in order to develop a pre-scan information leaflet tailored to the information requirements of these women. Sixty-two women were assessed by qualitative interview immediately before and after scanning. Data were analyzed by means of a qualitative content analysis. The transcribed interviews were coded within established categories, including knowledge of the purpose of the exam, understanding of the procedure, expectation of the baby's reaction, satisfaction with pre-information, experience of fetal MRI, distressing conditions during scanning, anxiety and suggestions for improvement of the scanning procedure. Pre-scan interviews indicated 66% of our sample to be well-informed about the purpose of fetal MRI. A realistic, detailed description of the examination was given by 37%. Only 32% expected the scanning to be safe for their baby. Despite the overall good tolerance of fetal MRI (63%), post-scan interviews revealed that 58% of women had experienced anxiety during MRI, which was partly due to the fearful perception of intensified fetal body movements during scanning. The quality of the pre-information leaflet was rated as sufficiently informative by 68% of the women. Suggestions for improvement were centered on physical conditions, the presence of the partner during scanning, and the availability of pre-scan briefings. Based on women's needs, detailed information about the fetal MRI procedure should be provided, containing clear-cut explanations about the purpose, course, method and possible distressing conditions. A leaflet describing these details should be given to women by the referring physician well in advance of the examination, and the woman given the opportunity to discuss unclear points.

  8. MuSim, a Graphical User Interface for Multiple Simulation Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  9. Holographic Airborne Rotating Lidar Instrument Experiment (HARLIE)

    NASA Technical Reports Server (NTRS)

    Schwemmer, Geary K.

    1998-01-01

    Scanning holographic lidar receivers are currently in use in two operational lidar systems, PHASERS (Prototype Holographic Atmospheric Scanner for Environmental Remote Sensing) and now HARLIE (Holographic Airborne Rotating Lidar Instrument Experiment). These systems are based on volume phase holograms made in dichromated gelatin (DCG) sandwiched between two layers of high-quality float glass. They have demonstrated the practical application of this technology to compact scanning lidar systems at 532 and 1064 nm wavelengths, the ability to withstand moderately high laser power and energy loading, sufficient optical quality for most direct detection systems, overall efficiencies rivaling conventional receivers, and the stability to last several years under typical lidar system environments. Their size and weight are approximately half those of similar-performing scanning systems using reflective optics. The cost of holographic systems will eventually be lower than that of reflective optical systems, depending on their degree of commercialization. There are a number of applications that require, or can greatly benefit from, a scanning capability. Several of these are airborne systems, which either use focal plane scanning, as in the Laser Vegetation Imaging System, or use primary aperture scanning, as in the Airborne Oceanographic Lidar or the Large Aperture Scanning Airborne Lidar. The latter class requires a large clear-aperture opening or window in the aircraft. This type of system can greatly benefit from the use of scanning transmission holograms of the HARLIE type because the clear aperture required is only about 25% larger than the collecting aperture, as opposed to 200-300% larger for scan angles of 45 degrees off nadir.

  10. Two-photon imaging of spatially extended neuronal network dynamics with high temporal resolution.

    PubMed

    Lillis, Kyle P; Eng, Alfred; White, John A; Mertz, Jerome

    2008-07-30

    We describe a simple two-photon fluorescence imaging strategy, called targeted path scanning (TPS), to monitor the dynamics of spatially extended neuronal networks with high spatiotemporal resolution. Our strategy combines the advantages of mirror-based scanning, minimized dead time, ease of implementation, and compatibility with high-resolution low-magnification objectives. To demonstrate the performance of TPS, we monitor the calcium dynamics distributed across an entire juvenile rat hippocampus (>1.5mm), at scan rates of 100 Hz, with single cell resolution and single action potential sensitivity. Our strategy for fast, efficient two-photon microscopy over spatially extended regions provides a particularly attractive solution for monitoring neuronal population activity in thick tissue, without sacrificing the signal-to-noise ratio or high spatial resolution associated with standard two-photon microscopy. Finally, we provide the code to make our technique generally available.

  11. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    PubMed

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.

  12. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  13. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off-the-shelf system that analyzes programs written in C, C++, or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study is focused on large-scale software developed using formal processes. The systems studied are mission critical in nature, but some use commodity computer systems.

  14. Trellis coded multilevel DPSK system with doppler correction for mobile satellite channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Simon, Marvin K. (Inventor)

    1991-01-01

    A trellis coded multilevel differential phase shift keyed mobile communication system. The system of the present invention includes a trellis encoder for translating input signals into trellis codes; a differential encoder for differentially encoding the trellis coded signals; a transmitter for transmitting the differentially encoded trellis coded signals; a receiver for receiving the transmitted signals; a differential demodulator for demodulating the received differentially encoded trellis coded signals; and a trellis decoder for decoding the differentially demodulated signals.

  15. SU-F-BRD-15: Quality Correction Factors in Scanned Or Broad Proton Therapy Beams Are Indistinguishable

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorriaux, J; Lee, J; ICTEAM Institute, Universite catholique de Louvain, Louvain-la-Neuve

    2015-06-15

    Purpose: The IAEA TRS-398 code of practice details the reference conditions for reference dosimetry of proton beams using ionization chambers and the required beam quality correction factors (kQ). Pencil beam scanning (PBS) requires multiple spots to reproduce the reference conditions. The objective is to demonstrate, using Monte Carlo (MC) calculations, that kQ factors for broad beams can be used for scanned beams under the same reference conditions with no significant additional uncertainty. We consider hereafter the general Alfonso formalism (Alfonso et al, 2008) for non-standard beams. Methods: To approach the reference conditions and the associated dose distributions, PBS must combine many pencil beams with range modulation and shaping techniques different than those used in passive systems (broad beams). This might lead to a different energy spectrum at the measurement point. In order to evaluate the impact of these differences on kQ factors, ion chamber responses are computed with MC (Geant4 9.6) in a dedicated scanned pencil beam (Q-pcsr) producing a 10×10 cm2 composite field with a flat dose distribution from 10 to 16 cm depth. Ion chamber responses are also computed by MC in a broad beam with quality Q-ds (double scattering). The dose distribution of Q-pcsr matches the dose distribution of Q-ds. k(Q-pcsr,Q-ds) is computed for a 2×2×0.2 cm3 idealized air cavity and a realistic plane-parallel ion chamber (IC). Results: Under reference conditions, quality correction factors for a scanned composite field versus a broad beam are the same for air cavity dose response, k(Q-pcsr,Q-ds) = 1.001±0.001 and for a Roos IC, k(Q-pcsr,Q-ds) = 0.999±0.005. Conclusion: Quality correction factors for ion chamber response in scanned and broad proton therapy beams are identical under reference conditions within the calculation uncertainties. The results indicate that quality correction factors published in IAEA TRS-398 can be used for scanned beams in the SOBP of a high-energy proton beam. Jefferson Sorriaux is financed by the Walloon Region under the convention 1217662. Jefferson Sorriaux is sponsored by a public-private partnership IBA - Walloon Region.
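
    In the general Alfonso-style notation (with D_w the dose to water at the reference point and M the chamber reading), the quantity being compared above can be written as the ratio sketched below in LaTeX; the subscripts are adapted to the two beam qualities of the abstract and the notation is not taken verbatim from the paper.

      % Beam quality correction factor between the scanned (pcsr) and
      % double-scattering (ds) qualities, Alfonso-style notation (adapted).
      k_{Q_{\mathrm{pcsr}},Q_{\mathrm{ds}}}
        = \frac{D_{w,Q_{\mathrm{pcsr}}}/M_{Q_{\mathrm{pcsr}}}}
               {D_{w,Q_{\mathrm{ds}}}/M_{Q_{\mathrm{ds}}}}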

  16. Fundamental radiological and geometric performance of two types of proton beam modulated discrete scanning systems.

    PubMed

    Farr, J B; Dessy, F; De Wilde, O; Bietzer, O; Schönenberg, D

    2013-07-01

    The purpose of this investigation was to compare and contrast the measured fundamental properties of two new types of modulated proton scanning systems. This provides a basis for clinical expectations based on the scanned beam quality and a benchmark for computational models. Because the relatively small beam and fast scanning gave challenges to the characterization, a secondary purpose was to develop and apply new approaches where necessary to do so. The following performances of the proton scanning systems were investigated: beamlet alignment, static in-air beamlet size and shape, scanned in-air penumbra, scanned fluence map accuracy, geometric alignment of the scanning system to isocenter, maximum field size, lateral and longitudinal field uniformity of a 1 l cubic uniform field, output stability over time, gantry angle invariance, monitoring system linearity, and reproducibility. A range of detectors was used: film, ionization chambers, lateral multielement and longitudinal multilayer ionization chambers, and a scintillation screen combined with a digital video camera. Characterization of the scanned fluence maps was performed with a software analysis tool. The resulting measurements and analysis indicated that the two types of delivery system performed within specification for the aspects investigated. Significant differences were observed between the two types of scanning system: one type exhibits a smaller spot size and associated penumbra than the other. The difference is smallest at the maximum energy and increases as the energy decreases. Additionally, the large-spot system showed an increase in dose precision to a static target with layer rescanning, whereas the small-spot system did not. The measured results from the two types of modulated scanning system were consistent with their designs under the conditions tested. The most significant difference between the types of system was their proton spot size and associated resolution, which depend on the magnetic optics and vacuum length. The need for, and benefit of, multielement detectors and high-resolution sensors was also shown. The use of a fluence map analytical software tool was particularly effective in characterizing the dynamic proton energy-layer scanning.

  17. SSL/TLS Vulnerability Detection Using Black Box Approach

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Sitorus, E. H.; Rahmat, R. F.; Hizriadi, A.

    2018-03-01

    Secure Sockets Layer (SSL) and Transport Layer Security (TLS) are cryptographic protocols that provide data encryption to secure communication over a network. However, in some cases vulnerabilities are found in the implementation of SSL/TLS because of weak cipher keys, certificate validation errors or session handling errors. One of the most serious SSL/TLS vulnerabilities is Heartbleed. As security is essential in data communication, this research aims to build a scanner that detects SSL/TLS vulnerabilities by using a black box approach, focusing on the Heartbleed case. In addition, this research also gathers information about the existing SSL/TLS configuration of the server. The black box approach is used to test the output of a system without knowing the process inside the system itself. For testing purposes, this research scanned websites and found that some of them still have SSL/TLS vulnerabilities. Thus, the black box approach can be used to detect the vulnerability without considering the source code and the process inside the application.
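
    The information-gathering part of such a black-box scan can be sketched with Python's standard ssl module: connect to a server and report the negotiated protocol version, cipher suite and certificate subject. The Heartbleed probe itself (a crafted heartbeat request) is deliberately not reproduced here, and the host name is only an example.

      # Report the TLS parameters negotiated with a server (black-box view).
      import socket
      import ssl

      def tls_info(host, port=443, timeout=5.0):
          context = ssl.create_default_context()
          with socket.create_connection((host, port), timeout=timeout) as sock:
              with context.wrap_socket(sock, server_hostname=host) as tls:
                  return {
                      "protocol": tls.version(),     # e.g. 'TLSv1.3'
                      "cipher": tls.cipher(),        # (name, protocol, secret bits)
                      "subject": tls.getpeercert().get("subject"),
                  }

      if __name__ == "__main__":
          print(tls_info("example.com"))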

  18. Extreme ultraviolet patterned mask inspection performance of advanced projection electron microscope system for 11nm half-pitch generation

    NASA Astrophysics Data System (ADS)

    Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji

    2016-03-01

    Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for 11-nm hp generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been used for the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using a "defectivity" measure derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjusting the detection capability to cope with newly defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.

  19. WWER-1000 core and reflector parameters investigation in the LR-0 reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaritsky, S. M.; Alekseev, N. I.; Bolshagin, S. N.

    2006-07-01

    Measurements and calculations carried out in the core and reflector of the WWER-1000 mock-up are discussed: - the determination of the pin-to-pin power distribution in the core by means of gamma-scanning of fuel pins and pin-to-pin calculations with the Monte Carlo code MCU-REA and the diffusion codes MOBY-DICK (with WIMS-D4 cell constants preparation) and RADAR; - the fast neutron spectra measurements by the proton recoil method inside the experimental channel in the core and inside the channel in the baffle, and corresponding calculations in the P3S8 approximation of the discrete ordinates method with the code DORT and the BUGLE-96 library; - the neutron spectra evaluations (adjustment) in the same channels in the energy region 0.5 eV-18 MeV based on activation and solid-state track detector measurements. (authors)

  20. Musculoskeletal disorder costs and medical claim filing in the US retail trade sector.

    PubMed

    Bhattacharya, Anasua; Leigh, J Paul

    2011-01-01

    The average costs of musculoskeletal disorder (MSD) claims and the odds ratios for filing MSD-related medical claims were examined. The medical claims were identified by ICD-9 codes for four US Census regions within retail trade. Large private firms' medical claims data from the Thomson Reuters Inc. MarketScan databases for the years 2003 through 2006 were used. Average costs were highest for claims related to the lumbar region (ICD-9 code 724.02), and the number of claims was largest for low back syndrome (ICD-9 code 724.2). Whereas the odds of filing an MSD claim did not vary greatly over time, average costs declined over time. The odds of filing claims rose with age and were higher for females and southerners than for males and non-southerners. Total estimated national medical costs for MSDs within retail trade were $389 million (2007 USD).

  1. Molecular dynamics and dynamic Monte-Carlo simulation of irradiation damage with focused ion beams

    NASA Astrophysics Data System (ADS)

    Ohya, Kaoru

    2017-03-01

    The focused ion beam (FIB) has become an important tool for micro- and nanostructuring of samples through milling, deposition and imaging. However, these processes damage the surface on the nanometer scale through implanted projectile ions and recoiled material atoms, so it is important to investigate each kind of damage quantitatively. We present a dynamic Monte-Carlo (MC) simulation code to simulate the morphological and compositional changes of a multilayered sample under ion irradiation, and a molecular dynamics (MD) simulation code to simulate dose-dependent changes in the backscattered-ion (BSI)/secondary-electron (SE) yields of a crystalline sample. Recent progress with the codes is also presented, covering simulation of the surface morphology and Mo/Si layer intermixing in an EUV lithography mask irradiated with FIBs, and the effect of crystalline orientation on BSI and SE yields in relation to channeling contrast in scanning ion microscopes.

  2. Laser Scanner For Automatic Storage

    NASA Astrophysics Data System (ADS)

    Carvalho, Fernando D.; Correia, Bento A.; Rebordao, Jose M.; Rodrigues, F. Carvalho

    1989-01-01

    Automated magazines are being used in industry more and more. One of the problems related to the automation of a store house is the identification of the products involved. Already used for stock management, bar codes allow an easy way to identify a product. Applied to automated magazines, bar codes allow a great variety of items to be represented in a small code. In order to be used by the national producers of automated magazines, a dedicated laser scanner has been developed. The prototype uses an He-Ne laser whose beam scans a field angle of 75 degrees at 16 Hz. The scene reflectivity is transduced by a photodiode into an electrical signal, which is then binarized. This digital signal is the input to the decoding program. The machine is able to see bar codes and to decode the information. A parallel interface allows communication with the central unit, which is responsible for the management of the automated magazine.
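    A minimal sketch of the first stage of such a decoding program, assuming a 1-D array of photodiode samples from one laser sweep (this is not the prototype's actual firmware): the trace is binarized and converted into bar/space run lengths, which a bar-code symbology decoder would then interpret.

    import numpy as np

    def run_lengths(samples, threshold=None):
        """Binarize a sampled photodiode trace and return bar/space run lengths.

        `samples` is a 1-D array of reflectivity readings from one laser sweep;
        dark bars give low values, light spaces give high values.
        """
        samples = np.asarray(samples, dtype=float)
        if threshold is None:
            threshold = 0.5 * (samples.min() + samples.max())  # simple midpoint binarization
        bits = (samples < threshold).astype(int)               # 1 = bar, 0 = space
        edges = np.flatnonzero(np.diff(bits)) + 1              # indices where the signal flips
        segments = np.split(bits, edges)
        return [(int(seg[0]), len(seg)) for seg in segments]   # (level, width in samples)

    # Example: a synthetic sweep with bars of different widths.
    trace = np.array([1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1], dtype=float)
    print(run_lengths(trace))   # [(0, 2), (1, 3), (0, 3), (1, 1), (0, 2)]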

  3. Optical design for uniform scanning in MEMS-based 3D imaging lidar.

    PubMed

    Lee, Xiaobao; Wang, Chunhui

    2015-03-20

    This paper proposes a method for the optical system design of uniform scanning over a large scan field of view (FOV) in 3D imaging lidar. The theoretical formulas are derived for the design scheme. Using the optical design software ZEMAX, a foldaway uniform-scanning optical system based on MEMS has been designed, and the scanning uniformity and spot size of the system on the target plane, perpendicular to the optical axis, are analyzed and discussed. Results show that the designed system can scan uniformly within a FOV of 40°×40° with a small spot size for a target at a distance of about 100 m.

  4. A New Look at the Eclipse Timing Variation Diagram Analysis of Selected 3-body W UMa Systems

    NASA Astrophysics Data System (ADS)

    Christopoulou, P.-E.; Papageorgiou, A.

    2015-07-01

    The light travel effect produced by the presence of tertiary components can reveal much about the origin and evolution of over-contact binaries. Monitoring of W UMa systems over the last decade and/or the use of publicly available photometric surveys (NSVS, ASAS, etc.) has uncovered or suggested the presence of many unseen companions, which calls for an in-depth investigation of the parameters derived from cyclic period variations in order to confirm or reject the assumption of hidden companion(s). Progress in the analysis of eclipse timing variations is summarized here both from the empirical and the theoretical points of view, and a more extensive investigation of the proposed orbital parameters of third bodies is proposed. The code we have developed for this, implemented in Python, is set up to handle heuristic scanning with parameter perturbation in parameter space, and to establish realistic uncertainties from the least squares fitting. A computational example is given for TZ Boo, a W UMa system with a spectroscopically detected third component. Future options to be implemented include MCMC and bootstrapping.
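    The Python fitting code itself is not reproduced in the record; the following is a minimal sketch, under the simplifying assumption of a circular outer orbit, of how a light-travel-time delay model can be fitted to eclipse timing (O-C) residuals with least squares. The parameter names and the synthetic data are illustrative only.

    import numpy as np
    from scipy.optimize import least_squares

    C_AU_PER_DAY = 173.1446  # speed of light in AU/day

    def ltte_circular(t, amp_au, p3_days, t0_days):
        """Light-travel-time delay (days) for a circular outer orbit."""
        return (amp_au / C_AU_PER_DAY) * np.sin(2.0 * np.pi * (t - t0_days) / p3_days)

    def fit_ltte(epoch_times, o_minus_c, guess):
        """Least-squares fit of the O-C eclipse timing residuals."""
        def residuals(p):
            amp_au, p3, t0 = p
            return o_minus_c - ltte_circular(epoch_times, amp_au, p3, t0)
        return least_squares(residuals, guess)

    # Hypothetical usage with synthetic data (amplitude 2 AU, 4000-day period).
    t = np.linspace(0.0, 12000.0, 300)
    oc = ltte_circular(t, 2.0, 4000.0, 500.0) + np.random.normal(0, 1e-4, t.size)
    fit = fit_ltte(t, oc, guess=[1.0, 3500.0, 0.0])
    print(fit.x)   # recovered (projected semi-major axis [AU], period [d], reference epoch [d])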

  5. Optical System Design for Noncontact, Normal Incidence, THz Imaging of in vivo Human Cornea.

    PubMed

    Sung, Shijun; Dabironezare, Shahab; Llombart, Nuria; Selvin, Skyler; Bajwa, Neha; Chantra, Somporn; Nowroozi, Bryan; Garritano, James; Goell, Jacob; Li, Alex; Deng, Sophie X; Brown, Elliott; Grundfest, Warren S; Taylor, Zachary D

    2018-01-01

    Reflection-mode terahertz (THz) imaging of corneal tissue water content (CTWC) is a proposed method for early, accurate detection and study of corneal diseases. Despite promising results from ex vivo and in vivo cornea studies, interpretation of the reflectivity data is confounded by the contact between corneal tissue and the dielectric windows used to flatten the imaging field. Herein, we present an optical design for non-contact THz imaging of the cornea. A beam-scanning methodology performs angular, normal-incidence sweeps of a focused beam over the corneal surface while keeping the source, detector, and patient stationary. A quasioptical analysis method is developed to analyze the theoretical resolution and imaging field intensity profile. These results are compared to the electric field distribution computed with a physical optics analysis code. Imaging experiments validate the optical theories behind the design and suggest that quasioptical methods are sufficient for the design of THz corneal imaging systems. Successful imaging operations support the feasibility of non-contact in vivo imaging. We believe that this optical system design will enable the first clinically relevant in vivo exploration of CTWC using THz technology.

  6. SU-D-BRD-03: A Gateway for GPU Computing in Cancer Radiotherapy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, X; Folkerts, M; Shi, F

    Purpose: The Graphics Processing Unit (GPU) has become increasingly important in radiotherapy. However, it is still difficult for general clinical researchers to access GPU codes developed by other researchers, and for developers to objectively benchmark their codes. Moreover, repeated effort is often spent on developing low-quality GPU codes. The goal of this project is to establish an infrastructure for testing GPU codes, cross-comparing them, and facilitating code distribution in the radiotherapy community. Methods: We developed a system called Gateway for GPU Computing in Cancer Radiotherapy Research (GCR2). A number of GPU codes developed by our group and other developers can be accessed via a web interface. To use the services, researchers first upload their test data or use the standard data provided by our system. Then they can select the GPU device on which the code will be executed. Our system offers all mainstream GPU hardware for code benchmarking purposes. After the code run is complete, the system automatically summarizes and displays the computing results. We also released an SDK to allow developers to build their own algorithm implementations and submit their binary codes to the system. The submitted code is then systematically benchmarked using a variety of GPU hardware and representative data provided by our system. The developers can also compare their codes with others and generate benchmarking reports. Results: The developed system is fully functioning. Through a user-friendly web interface, researchers are able to test various GPU codes. Developers also benefit from this platform by comprehensively benchmarking their codes on various GPU platforms and representative clinical data sets. Conclusion: We have developed an open platform allowing clinical researchers and developers to access GPUs and GPU codes. This development will facilitate the utilization of GPUs in the radiation therapy field.

  7. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    NASA Astrophysics Data System (ADS)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are included in the analysis of the code performance. In comparison with existing two-dimensional (2D) codes, such as the 2D perfect difference (2D-PD), 2D extended enhanced double weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  8. Performance of data-compression codes in channels with errors. Final report, October 1986-January 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-10-01

    Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine whether these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
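    As a point of reference for the first of these candidates, the following is a minimal, self-contained Huffman coder sketch (not tied to the FVLF study): it builds a prefix code from symbol frequencies and reports the compressed length of a sample message.

    import heapq
    from collections import Counter

    def huffman_code(symbols):
        """Build a Huffman code (symbol -> bit string) from an iterable of symbols."""
        freq = Counter(symbols)
        # Each heap entry: (weight, tie-breaker, [(symbol, code), ...])
        heap = [(w, i, [(s, "")]) for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                       # degenerate single-symbol source
            return {heap[0][2][0][0]: "0"}
        counter = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)
            w2, _, right = heapq.heappop(heap)
            merged = [(s, "0" + c) for s, c in left] + [(s, "1" + c) for s, c in right]
            heapq.heappush(heap, (w1 + w2, counter, merged))
            counter += 1
        return dict(heap[0][2])

    message = "FIXED VERY LOW FREQUENCY"
    code = huffman_code(message)
    encoded = "".join(code[ch] for ch in message)
    print(len(encoded), "bits vs", 8 * len(message), "bits uncompressed")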

  9. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  10. Performance of the NIRS fast scanning system for heavy-ion radiotherapy.

    PubMed

    Furukawa, Takuji; Inaniwa, Taku; Sato, Shinji; Shirai, Toshiyuki; Takei, Yuka; Takeshita, Eri; Mizushima, Kota; Iwata, Yoshiyuki; Himukai, Takeshi; Mori, Shinichiro; Fukuda, Shigekazu; Minohara, Shinichi; Takada, Eiichi; Murakami, Takeshi; Noda, Koji

    2010-11-01

    A project to construct a new treatment facility, as an extension of the existing HIMAC facility, has been initiated for the further development of carbon-ion therapy at NIRS. This new treatment facility is equipped with a 3D irradiation system with pencil-beam scanning. The challenge of this project is to realize treatment of a moving target by scanning irradiation. To achieve fast rescanning within an acceptable irradiation time, the authors developed a fast scanning system. In order to verify the validity of the design and to demonstrate the performance of fast scanning prior to use in the new treatment facility, a new scanning-irradiation system was developed and installed in the existing HIMAC physics-experiment course. The authors made strong efforts to develop (1) the fast scanning magnet and its power supply, (2) the high-speed control system, and (3) the beam monitoring. The performance of the system, including 3D dose conformation, was tested using the carbon beam from the HIMAC accelerator. The performance of the fast scanning system was verified by beam tests. The precision of the scanned beam position was better than ±0.5 mm. In cooperation with the planning software, the authors verified the homogeneity of the delivered field to within ±3% for 3D delivery. The system took only 20 s to deliver a physical dose of 1 Gy to a spherical target 60 mm in diameter with eight rescans. In this test, the average spot-staying time was considerably reduced, to 154 μs, while the minimum staying time was 30 μs. As a result of this study, the authors verified that the new scanning delivery system can produce an accurate 3D dose distribution for the target volume in combination with the planning software.

  11. The application of LDPC code in MIMO-OFDM system

    NASA Astrophysics Data System (ADS)

    Liu, Ruian; Zeng, Beibei; Chen, Tingting; Liu, Nan; Yin, Ninghao

    2018-03-01

    The combination of MIMO and OFDM technology has become one of the key technologies of fourth-generation mobile communication: it can overcome the frequency-selective fading of the wireless channel, increase the system capacity and improve the frequency utilization. Error-correcting coding introduced into the system can further improve its performance. The LDPC (low density parity check) code is a kind of error-correcting code which can improve system reliability and anti-interference ability, and its decoding is simple and easy to implement. This paper mainly discusses the application of LDPC codes in the MIMO-OFDM system.

  12. Dopamine Modulates Adaptive Prediction Error Coding in the Human Midbrain and Striatum.

    PubMed

    Diederen, Kelly M J; Ziauddeen, Hisham; Vestergaard, Martin D; Spencer, Tom; Schultz, Wolfram; Fletcher, Paul C

    2017-02-15

    Learning to optimally predict rewards requires agents to account for fluctuations in reward value. Recent work suggests that individuals can efficiently learn about variable rewards through adaptation of the learning rate, and coding of prediction errors relative to reward variability. Such adaptive coding has been linked to midbrain dopamine neurons in nonhuman primates, and evidence in support of a similar role of the dopaminergic system in humans is emerging from fMRI data. Here, we sought to investigate the effect of dopaminergic perturbations on adaptive prediction error coding in humans, using a between-subject, placebo-controlled pharmacological fMRI study with a dopaminergic agonist (bromocriptine) and antagonist (sulpiride). Participants performed a previously validated task in which they predicted the magnitude of upcoming rewards drawn from distributions with varying SDs. After each prediction, participants received a reward, yielding trial-by-trial prediction errors. Under placebo, we replicated previous observations of adaptive coding in the midbrain and ventral striatum. Treatment with sulpiride attenuated adaptive coding in both midbrain and ventral striatum, and was associated with a decrease in performance, whereas bromocriptine did not have a significant impact. Although we observed no differential effect of SD on performance between the groups, computational modeling suggested decreased behavioral adaptation in the sulpiride group. These results suggest that normal dopaminergic function is critical for adaptive prediction error coding, a key property of the brain thought to facilitate efficient learning in variable environments. Crucially, these results also offer potential insights for understanding the impact of disrupted dopamine function in mental illness. SIGNIFICANCE STATEMENT To choose optimally, we have to learn what to expect. Humans dampen learning when there is a great deal of variability in reward outcome, and two brain regions that are modulated by the brain chemical dopamine are sensitive to reward variability. Here, we aimed to directly relate dopamine to learning about variable rewards, and the neural encoding of associated teaching signals. We perturbed dopamine in healthy individuals using dopaminergic medication and asked them to predict variable rewards while we acquired brain scans. Dopamine perturbations impaired learning and the neural encoding of reward variability, thus establishing a direct link between dopamine and adaptation to reward variability. These results aid our understanding of clinical conditions associated with dopaminergic dysfunction, such as psychosis. Copyright © 2017 Diederen et al.
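    As an illustration of the kind of adaptive-coding model referred to above (this is not the authors' actual computational model), the sketch below implements a simple reinforcement-learning update in which prediction errors are rescaled by a running estimate of reward variability, so that the scaled error signal stays on a similar scale across low- and high-variability blocks. All parameter values are hypothetical.

    import numpy as np

    def adaptive_rw(rewards, base_alpha=0.3, sd_alpha=0.1):
        """Rescorla-Wagner-style learner whose prediction errors are scaled by a
        running estimate of reward variability (a stand-in for adaptive coding)."""
        value, spread = 0.0, 1.0
        values, scaled_errors = [], []
        for r in rewards:
            error = r - value
            spread += sd_alpha * (abs(error) - spread)     # track reward variability
            scaled = error / max(spread, 1e-6)             # error expressed relative to spread
            value += base_alpha * scaled * spread          # ordinary value update
            values.append(value)
            scaled_errors.append(scaled)
        return np.array(values), np.array(scaled_errors)

    # Two hypothetical blocks: low-variability rewards, then high-variability rewards.
    rng = np.random.default_rng(0)
    rewards = np.concatenate([rng.normal(10, 1, 50), rng.normal(10, 5, 50)])
    v, pe = adaptive_rw(rewards)
    print(pe[:50].std(), pe[50:].std())   # scaled errors remain on a similar scale despite 5x reward SD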

  13. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  14. Thermal hydraulic-severe accident code interfaces for SCDAP/RELAP5/MOD3.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coryell, E.W.; Siefken, L.J.; Harvego, E.A.

    1997-07-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The code is the result of merging the RELAP5, SCDAP, and COUPLE codes. The RELAP5 portion of the code calculates the overall reactor coolant system thermal-hydraulics and associated reactor system responses. The SCDAP portion of the code describes the response of the core and associated vessel structures. The COUPLE portion of the code describes the response of lower plenum structures and debris and the failure of the lower head. The code uses a modular approach with the overall structure, input/output processing, and data structures following the pattern established for RELAP5. The code uses a building-block approach to allow the code user to easily represent a wide variety of systems and conditions through a powerful input processor. The user can represent a wide variety of experiments or reactor designs by selecting fuel rods and other assembly structures from a range of representative core component models, and arranging them in a variety of patterns within the thermal-hydraulic network. The COUPLE portion of the code uses two-dimensional representations of the lower plenum structures and debris beds. The flow of information between the different portions of the code occurs at each system-level time step advancement. The RELAP5 portion of the code describes the fluid transport around the system. These fluid conditions are used as thermal and mass transport boundary conditions for the SCDAP and COUPLE structures and debris beds.

  15. Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise

    NASA Astrophysics Data System (ADS)

    Ahmed, Israa Sh.; Aljunid, Syed A.; Nordin, Junita M.; Dulaimi, Layth A. Khalil Al; Matem, Rima

    2017-11-01

    In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are improved significantly by exploiting the code correlation properties. These codes can be used in synchronous optical CDMA systems to cancel multiple-access interference and maximally suppress the phase-induced intensity noise (PIIN). A low received power (Psr) requirement follows from the reduction of interference noise achieved by the 2-D MDW code PIIN suppression. A high data rate increases the BER, requires high effective power, and severely deteriorates system performance. The 2-D W/T MDW code has excellent system performance: the PIIN is suppressed as far as possible at the optimum Psr with a high data bit rate. The 2-D MDW code shows better tolerance to PIIN than other codes, with enhanced system performance. We show by numerical analysis that PIIN is maximally suppressed by the MDW code through its minimized cross-correlation property, in comparison to the 2-D PDC and 2-D MQC OCDMA code schemes.

  16. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
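    For background on the code family underlying this construction (the "transposed modified" steps of the paper are not reproduced here), the sketch below generates a Walsh-Hadamard matrix by the Sylvester construction, maps its rows to unipolar spreading sequences, and prints their correlation matrix.

    import numpy as np

    def walsh_hadamard(order):
        """Sylvester construction of a Walsh-Hadamard matrix of size 2**order."""
        h = np.array([[1]])
        for _ in range(order):
            h = np.block([[h, h], [h, -h]])
        return h

    def unipolar_codes(order):
        """Map the +/-1 Walsh rows to unipolar (0/1) spreading sequences,
        dropping the all-ones row as is usual for OCDMA code sets."""
        h = walsh_hadamard(order)
        return ((h + 1) // 2)[1:]

    codes = unipolar_codes(3)          # seven codes of length 8
    print(codes)
    print(codes @ codes.T)             # off-diagonal entries show the cross-correlations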

  17. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a) For...

  18. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a) For...

  19. 48 CFR 452.219-70 - Size Standard and NAICS Code Information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Size Standard and NAICS Code Information. 452.219-70 Section 452.219-70 Federal Acquisition Regulations System DEPARTMENT OF... System Code(s) and business size standard(s) describing the products and/or services to be acquired under...

  20. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to the bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance as compared to the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and downlink transmission.

  1. Differential Binary Encoding Method for Calibrating Image Sensors Based on IOFBs

    PubMed Central

    Fernández, Pedro R.; Lázaro-Galilea, José Luis; Gardel, Alfredo; Espinosa, Felipe; Bravo, Ignacio; Cano, Ángel

    2012-01-01

    Image transmission using incoherent optical fiber bundles (IOFBs) requires prior calibration to obtain the spatial in-out fiber correspondence necessary to reconstruct the image captured by the pseudo-sensor. This information is recorded in a Look-Up Table called the Reconstruction Table (RT), used later for reordering the fiber positions and reconstructing the original image. This paper presents a very fast method based on image-scanning using spaces encoded by a weighted binary code to obtain the in-out correspondence. The results demonstrate that this technique yields a remarkable reduction in processing time and the image reconstruction quality is very good compared to previous techniques based on spot or line scanning, for example. PMID:22666023
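    A minimal sketch of the decoding step implied above, assuming the calibration images have already been captured and thresholded (the array shapes and the 4-bit example are hypothetical): each pixel's stack of binary pattern values is converted MSB-first into a position index, which is what a reconstruction look-up table would store.

    import numpy as np

    def decode_binary_patterns(images, threshold=0.5):
        """Decode a stack of binary-coded calibration images into position indices.

        `images` has shape (n_bits, H, W); frame k shows bit k (MSB first) of the
        encoded input-side coordinate. The result is an (H, W) array of indices
        that can be stored in a reconstruction (look-up) table.
        """
        bits = (np.asarray(images, dtype=float) > threshold).astype(np.uint32)
        n_bits = bits.shape[0]
        weights = (1 << np.arange(n_bits - 1, -1, -1)).astype(np.uint32)   # MSB-first weights
        return np.tensordot(weights, bits, axes=1)

    # Hypothetical 4-bit example on a 2x2 pseudo-sensor.
    frames = np.random.default_rng(1).integers(0, 2, size=(4, 2, 2))
    print(decode_binary_patterns(frames))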

  2. AlaScan: A Graphical User Interface for Alanine Scanning Free-Energy Calculations.

    PubMed

    Ramadoss, Vijayaraj; Dehez, François; Chipot, Christophe

    2016-06-27

    Computation of the free-energy changes that underlie molecular recognition and association has gained significant importance due to its considerable potential in drug discovery. The massive increase of computational power in recent years substantiates the application of more accurate theoretical methods for the calculation of binding free energies. The impact of such advances is the application of parent approaches, like computational alanine scanning, to investigate in silico the effect of amino-acid replacement in protein-ligand and protein-protein complexes, or probe the thermostability of individual proteins. Because human effort represents a significant cost that precludes the routine use of this form of free-energy calculations, minimizing manual intervention constitutes a stringent prerequisite for any such systematic computation. With this objective in mind, we propose a new plug-in, referred to as AlaScan, developed within the popular visualization program VMD to automate the major steps in alanine-scanning calculations, employing free-energy perturbation as implemented in the widely used molecular dynamics code NAMD. The AlaScan plug-in can be utilized upstream, to prepare input files for selected alanine mutations. It can also be utilized downstream to perform the analysis of different alanine-scanning calculations and to report the free-energy estimates in a user-friendly graphical user interface, allowing favorable mutations to be identified at a glance. The plug-in also assists the end-user in assessing the reliability of the calculation through rapid visual inspection.

  3. Scan Line Difference Compression Algorithm Simulation Study.

    DTIC Science & Technology

    1985-08-01

    [Scanned-report residue; only fragments are recoverable: Figure A-1, "Overall Data Compression Process," a block diagram of the SLDC encoder and decoder chain (image source, conditioning, error-control encoder and decoder, reconstruction) with errors introduced during the signal transmission process, plus the sentence fragment "...of noise or an effective channel coding subsystem providing the necessary error control."]

  4. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR LIGHT PEN OPERATION AND VERIFICATION OF SCANNED BAR CODES (UA-D-33.0)

    EPA Science Inventory

    The purpose of this SOP is to define the steps needed to operate the light pens, and verify the values produced by light pens used in the Arizona NHEXAS project and the "Border" study. Keywords: data; equipment; light pens.

    The National Human Exposure Assessment Survey (NHEXAS)...

  5. Spectroscopic Imaging with an Uncooled Microbolometer Infrared Camera and Step-Scan FTIR

    DTIC Science & Technology

    2006-12-01

    [Scanned thesis front matter only: Naval Postgraduate School, Monterey, California; approved for public release, distribution unlimited; report documentation page (OMB No. 0704-0188). The abstract itself is truncated after its opening words, "The purpose of this".]

  6. Random oligonucleotide mutagenesis: application to a large protein coding sequence of a major histocompatibility complex class I gene, H-2DP.

    PubMed Central

    Murray, R; Pederson, K; Prosser, H; Muller, D; Hutchison, C A; Frelinger, J A

    1988-01-01

    We have used random oligonucleotide mutagenesis (or saturation mutagenesis) to create a library of point mutations in the alpha 1 protein domain of a Major Histocompatibility Complex (MHC) molecule. This protein domain is critical for T cell and B cell recognition. We altered the MHC class I H-2DP gene sequence such that synthetic mutant alpha 1 exons (270 bp of coding sequence), which contain mutations identified by sequence analysis, can replace the wild-type alpha 1 exon. The synthetic exons were constructed from twelve overlapping oligonucleotides which contained an average of 1.3 random point mutations per intact exon. DNA sequence analysis of mutant alpha 1 exons has shown a point mutation distribution that fits a Poisson distribution, which emphasizes the utility of this mutagenesis technique to "scan" a large protein sequence for important mutations. We report our use of saturation mutagenesis to scan an entire exon of the H-2DP gene, a cassette strategy to replace the wild-type alpha 1 exon with individual mutant alpha 1 exons, and analysis of mutant molecules expressed on the surface of transfected mouse L cells. PMID:2903482
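    A minimal sketch (with made-up counts) of the kind of check described above: comparing the observed number of point mutations per sequenced exon with the Poisson distribution having the same mean, using SciPy.

    import numpy as np
    from scipy.stats import poisson

    def poisson_fit_check(mutation_counts):
        """Compare the observed distribution of point mutations per clone with the
        Poisson distribution having the same mean."""
        counts = np.bincount(mutation_counts)
        n = len(mutation_counts)
        lam = np.mean(mutation_counts)
        expected = n * poisson.pmf(np.arange(len(counts)), lam)
        return lam, counts, expected

    # Hypothetical counts of mutations found per sequenced synthetic exon.
    observed = np.array([0, 1, 1, 2, 0, 1, 3, 1, 0, 2, 1, 1, 2, 0, 1])
    lam, obs, exp = poisson_fit_check(observed)
    print("mean mutations per exon:", lam)
    print("observed:", obs)
    print("expected under Poisson:", np.round(exp, 2))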

  7. High-frequency Pulse-compression Ultrasound Imaging with an Annular Array

    NASA Astrophysics Data System (ADS)

    Mamou, J.; Ketterling, J. A.; Silverman, R. H.

    High-frequency ultrasound (HFU) allows fine-resolution imaging at the expense of limited depth-of-field (DOF) and shallow acoustic penetration depth. Coded-excitation imaging permits a significant increase in the signal-to-noise ratio (SNR) and therefore, the acoustic penetration depth. A 17-MHz, five-element annular array with a focal length of 31 mm and a total aperture of 10 mm was fabricated using a 25-μm thick piezopolymer membrane. An optimized 8-μs linear chirp spanning 6.5-32 MHz was used to excite the transducer. After data acquisition, the received signals were linearly filtered by a compression filter and synthetically focused. To compare the chirp-array imaging method with conventional impulse imaging in terms of resolution, a 25-μm wire was scanned and the -6-dB axial and lateral resolutions were computed at depths ranging from 20.5 to 40.5 mm. A tissue-mimicking phantom containing 10-μm glass beads was scanned, and backscattered signals were analyzed to evaluate SNR and penetration depth. Finally, ex-vivo ophthalmic images were formed and chirp-coded images showed features that were not visible in conventional impulse images.
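    A minimal sketch of the pulse-compression principle described above, with parameters loosely following the abstract (a tapered 6.5-32 MHz linear chirp of 8 μs duration) but otherwise hypothetical: the received trace is compressed with a matched filter, recovering the reflector position with improved SNR. This is not the authors' beamforming or synthetic-focusing code.

    import numpy as np
    from scipy.signal import chirp, fftconvolve

    fs = 200e6                         # sample rate, Hz
    t = np.arange(0, 8e-6, 1 / fs)     # 8-microsecond excitation window
    tx = chirp(t, f0=6.5e6, t1=t[-1], f1=32e6) * np.hanning(t.size)   # tapered linear chirp

    # Simulated echo: the chirp returning from a reflector 15 microseconds away, in noise.
    delay = int(15e-6 * fs)
    rx = np.zeros(6000)
    rx[delay:delay + tx.size] += 0.2 * tx
    rx += 0.02 * np.random.default_rng(0).standard_normal(rx.size)

    compressed = fftconvolve(rx, tx[::-1], mode="same")   # matched (compression) filter
    print("peak sample:", np.argmax(np.abs(compressed)), "expected near:", delay + tx.size // 2)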

  8. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  9. BCH codes for large IC random-access memory systems

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.

    1983-01-01

    In this report some shortened BCH codes for possible applications to large IC random-access memory systems are presented. These codes are given by their parity-check matrices. Encoding and decoding of these codes are discussed.
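    For context (this example is not from the report itself), the sketch below shows syndrome decoding with the parity-check matrix of the (7,4) Hamming code, the simplest binary BCH code, which is the kind of single-error correction a memory system would apply to each stored word.

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code, the simplest binary BCH code.
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def syndrome_decode(word):
        """Correct a single bit error in a 7-bit word using the syndrome."""
        word = np.array(word) % 2
        syndrome = H @ word % 2
        position = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])  # columns of H read as binary
        if position:                      # nonzero syndrome -> flip the indicated bit
            word[position - 1] ^= 1
        return word

    received = [1, 0, 1, 1, 0, 1, 1]      # a stored 7-bit word read back with one flipped bit (position 7)
    print(syndrome_decode(received))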

  10. High performance and cost effective CO-OFDM system aided by polar code.

    PubMed

    Liu, Ling; Xiao, Shilin; Fang, Jiafei; Zhang, Lu; Zhang, Yunhao; Bi, Meihua; Hu, Weisheng

    2017-02-06

    A novel polar coded coherent optical orthogonal frequency division multiplexing (CO-OFDM) system is proposed and demonstrated through experiment for the first time. The principle of the polar coded CO-OFDM signal is illustrated theoretically and a suitable polar decoding method is discussed. Results show that the polar coded CO-OFDM signal achieves a net coding gain (NCG) of more than 10 dB at a bit error rate (BER) of 10⁻³ over 25-Gb/s 480-km transmission in comparison with conventional CO-OFDM. Also, compared to the 25-Gb/s low-density parity-check (LDPC) coded CO-OFDM 160-km system, the polar code provides an NCG of 0.88 dB at a BER of 10⁻³. Moreover, the polar code can greatly relax the laser linewidth requirement, yielding a more cost-effective CO-OFDM system.

  11. Design and construction of a cost-efficient Arduino-based mirror galvanometer system for scanning optical microscopy

    NASA Astrophysics Data System (ADS)

    Hsu, Jen-Feng; Dhingra, Shonali; D'Urso, Brian

    2017-01-01

    Mirror galvanometer systems (galvos) are commonly employed in research and commercial applications in areas involving laser imaging, laser machining, laser-light shows, and others. Here, we present a robust, moderate-speed, and cost-efficient home-built galvo system. The mechanical part of this design consists of one mirror, which is tilted around two axes with multiple surface transducers. We demonstrate the ability of this galvo by scanning the mirror using a computer, via a custom driver circuit. The performance of the galvo, including scan range, noise, linearity, and scan speed, is characterized. As an application, we show that this galvo system can be used in a confocal scanning microscopy system.

  12. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with the analysis of spectral amplitude coding optical code division multiple access (SAC-OCDMA) systems. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. The number of users and the type of codes used in the optical system directly determine the system performance. MAI can be restricted by efficient design of the optical codes and by implementing them with an architecture that accommodates more users. Hence, there is a need for techniques such as spectral direct detection (SDD) with a modified double weight code, which can provide better cardinality and good correlation properties.

  13. Adaptive variable-length coding for efficient compression of spacecraft television data.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
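    The sketch below is an illustrative reduction of the block-adaptive idea described above, not the flight algorithm: sample-to-sample differences are mapped to non-negative integers and, for each 21-pixel block, the encoder picks whichever of two candidate codes (a unary/fundamental-sequence code or fixed-length raw samples) costs fewer bits.

    def zigzag(d):
        """Map a signed prediction difference to a non-negative integer."""
        return 2 * d if d >= 0 else -2 * d - 1

    def unary_len(n):          # fundamental-sequence (unary) code length
        return n + 1

    def block_adaptive_encode(pixels, block=21, raw_bits=8):
        """For each block of differences, choose the cheaper of two candidate codes
        (unary vs. fixed-length raw) and report the chosen option and bit cost."""
        diffs = [b - a for a, b in zip(pixels, pixels[1:])]   # first pixel sent separately as a reference
        out = []
        for i in range(0, len(diffs), block):
            mapped = [zigzag(d) for d in diffs[i:i + block]]
            cost_unary = sum(unary_len(m) for m in mapped)
            cost_raw = raw_bits * len(mapped)
            out.append(("unary", cost_unary) if cost_unary <= cost_raw else ("raw", cost_raw))
        return out

    smooth = [100 + (i % 3) for i in range(64)]
    noisy = [(37 * i) % 256 for i in range(64)]
    print(block_adaptive_encode(smooth))   # every block picks the cheap unary option
    print(block_adaptive_encode(noisy))    # large differences force the raw fallback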

  14. Development of the micro-scanning optical system of yellow laser applied to the ophthalmologic area

    NASA Astrophysics Data System (ADS)

    Ortega, Tiago A.; Mota, Alessandro D.; Costal, Glauco Z.; Fontes, Yuri C.; Rossi, Giuliano; Yasuoka, Fatima M. M.; Stefani, Mario A.; de Castro N., Jarbas C.

    2012-10-01

    In this work, the development of a laser scanning system for ophthalmology with micrometric positioning precision is presented. It is a semi-automatic scanning system for retina photocoagulation and laser trabeculoplasty. The equipment is a solid state laser fully integrated to the slit lamp. An optical system is responsible for producing different laser spot sizes on the image plane and a pair of galvanometer mirrors generates the scanning patterns.

  15. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    Work on partial unit memory codes continued; it was shown that for a given virtual state complexity, the maximum free distance over the class of all convolutional codes is achieved within the class of unit memory codes. The effect of phase-lock loop (PLL) tracking error on coding system performance was studied using the channel cut-off rate as the measure of quality of a modulation system. The selection of optimum modulation signal sets for a non-white Gaussian channel was considered using a heuristic selection rule based on a water-filling argument. The use of error-correcting codes to perform data compression by the technique of syndrome source coding was researched, and a weight-and-error-locations scheme was developed that is closely related to LDSC coding.
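    To make the water-filling argument concrete, here is a minimal sketch (my own illustration, not from the report) that allocates a fixed total power across parallel Gaussian subchannels: channels are filled up to a common "water level," and the noisiest channels may receive no power at all.

    import numpy as np

    def water_filling(noise_powers, total_power):
        """Allocate power across parallel Gaussian subchannels by water-filling."""
        noise = np.sort(np.asarray(noise_powers, dtype=float))
        for k in range(len(noise), 0, -1):
            level = (total_power + noise[:k].sum()) / k      # candidate water level
            if level > noise[k - 1]:                         # all k channels stay above water
                alloc = np.maximum(level - np.asarray(noise_powers), 0.0)
                return level, alloc
        return 0.0, np.zeros_like(np.asarray(noise_powers, dtype=float))

    level, powers = water_filling([0.5, 1.0, 2.0, 4.0], total_power=3.0)
    print("water level:", level)
    print("allocated powers:", powers)   # weakest (noisiest) channels may get nothing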

  16. [Data coding in the Israeli healthcare system - do choices provide the answers to our system's needs?].

    PubMed

    Zelingher, Julian; Ash, Nachman

    2013-05-01

    The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors to the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding systems for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding systems. In summary, the adoption of ICD-10-CM is in line with the USA decision to abandon ICD-9-CM, and the Israeli healthcare system could benefit from USA healthcare efforts in this direction. The large content of SNOMED-CT and its sophisticated hierarchical data structure will enable advanced clinical decision support and quality improvement applications.

  17. Low-density parity-check codes for volume holographic memory systems.

    PubMed

    Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali

    2003-02-10

    We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate extensively. The prior knowledge of noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulation shows that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information theoretic capacity.

  18. Identification and Classification of Orthogonal Frequency Division Multiple Access (OFDMA) Signals Used in Next Generation Wireless Systems

    DTIC Science & Technology

    2012-03-01

    [Only glossary fragments and a partial sentence are recoverable from this record: AMC (adaptive modulation and coding), AWGN (additive white Gaussian noise), BPSK (binary phase shift keying), BS (base station), advanced antenna systems, BTC; "...QAM-16, and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding (BTC), zero-terminating..."]

  19. Design and analysis of a sub-aperture scanning machine for the transmittance measurements of large-aperture optical system

    NASA Astrophysics Data System (ADS)

    He, Yingwei; Li, Ping; Feng, Guojin; Cheng, Li; Wang, Yu; Wu, Houping; Liu, Zilong; Zheng, Chundi; Sha, Dingguo

    2010-11-01

    For measuring large-aperture optical system transmittance, a novel sub-aperture scanning machine with double-rotating arms (SSMDA) was designed to obtain a sub-aperture beam spot. Full-aperture transmittance measurements of the optical system can be achieved by applying sub-aperture beam-spot scanning technology. A mathematical model of the SSMDA based on homogeneous coordinate transformation matrices is established to develop a detailed methodology for analyzing the beam-spot scanning errors. The error analysis methodology considers two fundamental sources of scanning error, namely (1) length systematic errors and (2) rotational systematic errors. With the systematic errors of the parameters given beforehand, the computed scanning errors lie between -0.007 mm and 0.028 mm for scanning radii no larger than 400.000 mm. The results offer a theoretical and data basis for research on the transmission characteristics of large optical systems.
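    A minimal sketch of the homogeneous-coordinate modeling approach mentioned above (the arm lengths, angles, and perturbations are hypothetical, not the paper's values): the beam-spot position is obtained by chaining rotation and translation matrices for the two arms, and small parameter errors propagate directly to a spot-position error.

    import numpy as np

    def rot_z(theta):
        """Homogeneous rotation about the z-axis."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0, 0],
                         [s,  c, 0, 0],
                         [0,  0, 1, 0],
                         [0,  0, 0, 1]])

    def trans(x, y, z):
        """Homogeneous translation."""
        t = np.eye(4)
        t[:3, 3] = [x, y, z]
        return t

    def spot_position(theta1, theta2, arm1, arm2):
        """Beam-spot position of a double-rotating-arm scanner: rotate the first arm,
        translate along it, rotate the second arm, translate along it."""
        origin = np.array([0.0, 0.0, 0.0, 1.0])
        T = rot_z(theta1) @ trans(arm1, 0, 0) @ rot_z(theta2) @ trans(arm2, 0, 0)
        return (T @ origin)[:3]

    nominal = spot_position(np.radians(30), np.radians(45), 250.0, 150.0)
    perturbed = spot_position(np.radians(30) + 1e-4, np.radians(45), 250.0 + 0.005, 150.0)
    print("nominal spot (mm):", nominal)
    print("scanning error (mm):", perturbed - nominal)   # effect of small length/angle errors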

  20. Security printing of covert quick response codes using upconverting nanoparticle inks

    NASA Astrophysics Data System (ADS)

    Meruga, Jeevan M.; Cross, William M.; May, P. Stanley; Luu, QuocAnh; Crawford, Grant A.; Kellar, Jon J.

    2012-10-01

    Counterfeiting costs governments and private industries billions of dollars annually due to loss of value in currency and other printed items. This research involves using lanthanide doped β-NaYF4 nanoparticles for security printing applications. Inks comprised of Yb3+/Er3+ and Yb3+/Tm3+ doped β-NaYF4 nanoparticles with oleic acid as the capping agent in toluene and methyl benzoate with poly(methyl methacrylate) (PMMA) as the binding agent were used to print quick response (QR) codes. The QR codes were made using an AutoCAD file and printed with Optomec direct-write aerosol jetting®. The printed QR codes are invisible under ambient lighting conditions, but are readable using a near-IR laser, and were successfully scanned using a smart phone. This research demonstrates that QR codes, which have been used primarily for information sharing applications, can also be used for security purposes. Higher levels of security were achieved by printing both green and blue upconverting inks, based on combinations of Er3+/Yb3+ and Tm3+/Yb3+, respectively, in a single QR code. The near-infrared (NIR)-to-visible upconversion luminescence properties of the two-ink QR codes were analyzed, including the influence of NIR excitation power density on perceived color, in terms of the CIE 1931 chromaticity index. It was also shown that this security ink can be optimized for line width, thickness and stability on different substrates.
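    A minimal sketch of generating such a QR symbol in software (this is not the authors' AutoCAD/aerosol-jet workflow, and it assumes the third-party Python "qrcode" package is installed): a high-error-correction QR code is generated and its module matrix exposed, which could in principle be converted into a print path for an upconverting ink.

    import qrcode

    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H, box_size=10, border=4)
    qr.add_data("https://example.org/covert-tag-0001")   # placeholder payload
    qr.make(fit=True)

    matrix = qr.get_matrix()          # nested list of booleans: True = dark module
    print(len(matrix), "x", len(matrix[0]), "modules")

    img = qr.make_image(fill_color="black", back_color="white")
    img.save("covert_qr.png")         # visible reference copy of the covert pattern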

  1. Security printing of covert quick response codes using upconverting nanoparticle inks.

    PubMed

    Meruga, Jeevan M; Cross, William M; Stanley May, P; Luu, QuocAnh; Crawford, Grant A; Kellar, Jon J

    2012-10-05

    Counterfeiting costs governments and private industries billions of dollars annually due to loss of value in currency and other printed items. This research involves using lanthanide doped β-NaYF(4) nanoparticles for security printing applications. Inks comprised of Yb(3+)/Er(3+) and Yb(3+)/Tm(3+) doped β-NaYF(4) nanoparticles with oleic acid as the capping agent in toluene and methyl benzoate with poly(methyl methacrylate) (PMMA) as the binding agent were used to print quick response (QR) codes. The QR codes were made using an AutoCAD file and printed with Optomec direct-write aerosol jetting(®). The printed QR codes are invisible under ambient lighting conditions, but are readable using a near-IR laser, and were successfully scanned using a smart phone. This research demonstrates that QR codes, which have been used primarily for information sharing applications, can also be used for security purposes. Higher levels of security were achieved by printing both green and blue upconverting inks, based on combinations of Er(3+)/Yb(3+) and Tm(3+)/Yb(3+), respectively, in a single QR code. The near-infrared (NIR)-to-visible upconversion luminescence properties of the two-ink QR codes were analyzed, including the influence of NIR excitation power density on perceived color, in terms of the CIE 1931 chromaticity index. It was also shown that this security ink can be optimized for line width, thickness and stability on different substrates.

  2. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  3. Automatic detection system of shaft part surface defect based on machine vision

    NASA Astrophysics Data System (ADS)

    Jiang, Lixing; Sun, Kuoyuan; Zhao, Fulai; Hao, Xiangyang

    2015-05-01

    Surface physical damage detection is an important part of shaft part quality inspection, and the traditional detection methods rely mostly on human visual identification, which suffers from low efficiency and poor reliability. In order to improve the automation level of shaft part quality inspection and to help establish a relevant industry quality standard, a machine vision inspection system connected to an MCU was designed to detect the surface of shaft parts. The system adopts a monochrome line-scan digital camera and uses dark-field, forward illumination to acquire images with high contrast. After image filtering and enhancement, the images are segmented into binary images by the maximum between-cluster variance (Otsu) method; the main contours are then extracted based on aspect-ratio and area criteria, and the centroid coordinates of each defect area (the locating points) are calculated. Finally, the defect areas are marked by a coding pen that communicates with the MCU. Experiments showed that no defects were missed and the false-alarm rate was lower than 5%, demonstrating that the designed system meets the demands of on-line, real-time inspection of shaft parts.
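    A minimal sketch of the defect-screening steps described above (Otsu thresholding, contour filtering by area and aspect ratio, centroid computation), assuming OpenCV is available and a grayscale line-scan image is stored in a hypothetical file "shaft.png"; the thresholds are illustrative, not the paper's values.

    import cv2

    def find_defect_centroids(gray, min_area=50, max_aspect=5.0):
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)                          # noise suppression
        _, binary = cv2.threshold(blurred, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)   # max between-cluster variance
        contours = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]             # works for OpenCV 3 and 4
        centroids = []
        for c in contours:
            area = cv2.contourArea(c)
            x, y, w, h = cv2.boundingRect(c)
            aspect = max(w, h) / max(min(w, h), 1)
            if area >= min_area and aspect <= max_aspect:                    # keep plausible defect blobs
                m = cv2.moments(c)
                centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return centroids

    gray = cv2.imread("shaft.png", cv2.IMREAD_GRAYSCALE)
    print(find_defect_centroids(gray))      # locating-point coordinates to send to the MCU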

  4. An RFID-Based Smart Structure for the Supply Chain: Resilient Scanning Proofs and Ownership Transfer with Positive Secrecy Capacity Channels.

    PubMed

    Burmester, Mike; Munilla, Jorge; Ortiz, Andrés; Caballero-Gil, Pino

    2017-07-04

    The National Strategy for Global Supply Chain Security published in 2012 by the White House identifies two primary goals for strengthening global supply chains: first, to promote the efficient and secure movement of goods, and second to foster a resilient supply chain. The Internet of Things (IoT), and in particular Radio Frequency Identification (RFID) technology, can be used to realize these goals. For product identification, tracking and real-time awareness, RFID tags are attached to goods. As tagged goods move along the supply chain from the suppliers to the manufacturers, and then on to the retailers until eventually they reach the customers, two major security challenges can be identified: (I) to protect the shipment of goods that are controlled by potentially untrusted carriers; and (II) to secure the transfer of ownership at each stage of the chain. For the former, grouping proofs in which the tags of the scanned goods generate a proof of "simultaneous" presence can be employed, while for the latter, ownership transfer protocols (OTP) are used. This paper describes enhanced security solutions for both challenges. We first extend earlier work on grouping proofs and group codes to capture resilient group scanning with untrusted readers; then, we describe a modified version of a recently published OTP based on channels with positive secrecy capacity adapted to be implemented on common RFID systems in the supply chain. The proposed solutions take into account the limitations of low cost tags employed in the supply chain, which are only required to generate pseudorandom numbers and compute one-way hash functions.
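    The sketch below is a toy illustration of the grouping-proof idea discussed above, not the protocol proposed in the paper: each tag only needs to compute a keyed one-way hash (modeled here with HMAC-SHA256) over the running proof value, and a verifier who knows the tag keys can recompute the chain to confirm that all tags were scanned in the same session.

    import hmac, hashlib, os

    def tag_response(tag_key, message):
        """Each low-cost tag only needs a one-way keyed hash over the running proof."""
        return hmac.new(tag_key, message, hashlib.sha256).digest()

    def grouping_proof(reader_nonce, tag_keys):
        """Chain the tags' responses so the final value attests that all tags were
        scanned in the same session (verifiable by a party knowing the tag keys)."""
        proof = reader_nonce
        transcript = []
        for key in tag_keys:
            proof = tag_response(key, proof)
            transcript.append(proof)
        return proof, transcript

    # Hypothetical keys for three tagged items in one shipment.
    keys = [os.urandom(16) for _ in range(3)]
    nonce = os.urandom(16)
    proof, _ = grouping_proof(nonce, keys)

    # The verifier recomputes the chain from the shared keys and the reader nonce.
    assert proof == grouping_proof(nonce, keys)[0]
    print("grouping proof:", proof.hex())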

  5. An Automated Medical Information Management System (OpScan-MIMS) in a Clinical Setting

    PubMed Central

    Margolis, S.; Baker, T.G.; Ritchey, M.G.; Alterescu, S.; Friedman, C.

    1981-01-01

    This paper describes an automated medical information management system within a clinic setting. The system includes an optically scanned data entry system (OpScan), a generalized, interactive retrieval and storage software system (Medical Information Management System, MIMS) and the use of time-sharing. The system has the advantages of minimal hardware purchase and maintenance, rapid data entry and retrieval, and user-created programs, with no need for user knowledge of computer languages or technology, and it is cost effective. The OpScan-MIMS system has been operational for approximately 16 months in a sexually transmitted disease clinic. The system's applications to medical audit, quality assurance, clinic management and clinical training are demonstrated.

  6. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER and LWR reactors is presented. After describing the basic features of the 3D neutronic codes BIPR-8 from the Kurchatov Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of the coupled codes for different transient and accident scenarios are presented. The need for further investigations is discussed.

  7. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  8. Safe Active Scanning for Energy Delivery Systems Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helms, J.; Salazar, B.; Scheibel, P.

    The Department of Energy’s Cybersecurity for Energy Delivery Systems Program has funded Safe(r) Active Scanning for Energy Delivery Systems, led by Lawrence Livermore National Laboratory, to investigate and analyze the impacts of active scanning in the operational environment of energy delivery systems. In collaboration with Pacific Northwest National Laboratory and Idaho National Laboratory, active scans across three testbeds including 38 devices were performed. This report gives a summary of the initial literature survey performed on the SASEDS project as well as industry partner interview summaries and main findings from Phase 1 of the project. Additionally, the report goes into the details of scanning techniques, methodologies for testing, testbed descriptions, and scanning results, with appendices to elaborate on the specific scans that were performed. As a result of testing, a single device out of 38 exhibited problems when actively scanned, and a reboot was required to fix it. This single failure indicates that active scanning is not likely to have a detrimental effect on the safety and resilience of energy delivery systems. We provide a path forward for future research that could enable wide adoption of active scanning and lead utilities to incorporate active scanning as part of their default network security plans to discover and rectify rogue devices, adversaries, and services that may be on the network. This increased network visibility will allow operational technology cybersecurity practitioners to improve their situational awareness of networks and their vulnerabilities.

  9. TU-CD-207-05: A Novel Digital Tomosynthesis System Using Orthogonal Scanning Technique: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J; Park, C; Kauweloa, K

    2015-06-15

    Purpose: As an alternative to full tomographic imaging techniques such as cone-beam computed tomography (CBCT), there is growing interest in adopting digital tomosynthesis (DTS) for diagnostic as well as therapeutic applications. The aim of this study is to propose a new DTS system using a novel orthogonal scanning technique, which can provide DTS images of superior quality compared with the conventional DTS scanning system. Methods: Unlike the conventional DTS scanning system, the proposed DTS is reconstructed from two sets of orthogonal patient scans: 1) X-ray projections acquired along a transverse trajectory, and 2) an additional set of X-ray projections acquired along the vertical direction at the mid angle of the transverse scan. To reconstruct the DTS, we used a modified filtered backprojection technique to account for the different scanning directions of each projection set. We evaluated the performance of our method using numerical planning CT data of a liver cancer patient and a physical pelvis phantom experiment. The results were compared with conventional DTS techniques with single transverse and vertical scanning. Results: Both the numerical simulation and the physical experiment showed that the resolution and contrast of anatomical structures were much clearer using our method. Specifically, when compared with the transversely scanned DTS, the edge and contrast of anatomical structures along the left-right (LR) direction were comparable; however, considerable differences and enhancement could be observed along the superior-inferior (SI) direction with our method. The opposite was observed when the vertically scanned DTS was compared. Conclusion: In this study, we propose a novel DTS system using an orthogonal scanning technique. The results indicate that the image quality of our novel DTS system is superior to that of the conventional DTS system. This makes our DTS system potentially useful in various on-line clinical applications.

  10. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.

    1986-01-01

    High-rate concatenated coding systems with trellis inner codes and Reed-Solomon (RS) outer codes for application in satellite communication systems are considered. Two types of inner codes are studied: high-rate punctured binary convolutional codes, which result in overall effective information rates between 1/2 and 1 bit per channel use; and bandwidth-efficient signal space trellis codes, which can achieve overall effective information rates greater than 1 bit per channel use. Channel capacity calculations with and without side information were performed for the concatenated coding system. Two concatenated coding schemes are investigated. In Scheme 1, the inner code is decoded with the Viterbi algorithm and the outer RS code performs error correction only (decoding without side information). In Scheme 2, the inner code is decoded with a modified Viterbi algorithm which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, while branch metrics are used to provide the reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. These two schemes are proposed for use on NASA satellite channels. Results indicate that high system reliability can be achieved with little or no bandwidth expansion.
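
    To make the errors-and-erasures handoff of Scheme 2 concrete, the sketch below marks low-reliability symbols from a (hypothetical) soft-output inner decoder as erasures and checks the classical Reed-Solomon decodability condition 2e + f <= n - k. The threshold and interfaces are assumptions for illustration, not the decoder studied in the report.

    # Illustration of the errors-and-erasures idea (assumed interfaces):
    # erase outer-code symbols whose inner-decoder reliability is low, then
    # note that an (n, k) RS code corrects e errors and f erasures whenever
    # 2*e + f <= n - k.
    def mark_erasures(reliabilities, threshold=0.2):
        """Return indices of outer-code symbols to erase."""
        return [i for i, r in enumerate(reliabilities) if r < threshold]

    def rs_decodable(n, k, num_errors, num_erasures):
        """Classical errors-and-erasures bound for an (n, k) RS code."""
        return 2 * num_errors + num_erasures <= n - k

    print(mark_erasures([0.9, 0.1, 0.8, 0.05, 0.95]))              # [1, 3]
    print(rs_decodable(255, 223, num_errors=10, num_erasures=12))  # True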

  11. High dynamic range coding imaging system

    NASA Astrophysics Data System (ADS)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme can help us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. We then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images and display an HDR image. We build an optical simulation model and obtain simulation images to verify the novel system.
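
    The multi-exposure reconstruction step can be illustrated with a generic weighted radiance-map merge over LDR frames. This is a standard hat-weighted fusion under an assumed linear camera response, not the authors' coded-aperture pipeline.

    # Generic multi-exposure HDR merge (assumes linearized LDR frames in [0, 1]):
    # weight mid-range pixels most and normalize by exposure time.
    import numpy as np

    def merge_hdr(ldr_frames, exposure_times):
        """ldr_frames: list of float arrays in [0, 1]; exposure_times: seconds."""
        num = np.zeros_like(ldr_frames[0])
        den = np.zeros_like(ldr_frames[0])
        for img, t in zip(ldr_frames, exposure_times):
            w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weighting favors mid-tones
            num += w * img / t                  # back-project to relative radiance
            den += w
        return num / np.maximum(den, 1e-6)      # fused radiance map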

  12. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  13. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  14. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.

  15. Performance analysis of optical wireless communication system based on two-fold turbo code

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Huang, Dexiu; Yuan, Xiuhua

    2005-11-01

    Optical wireless communication (OWC) is beginning to emerge in the telecommunications market as a strategy to meet last-mile demand owing to its unique combination of features. Turbo codes have an impressive near-Shannon-limit error-correcting performance. Twofold turbo codes have recently been introduced as the least complex member of the multifold turbo code family. In this paper, we first present the mathematical model of the signal and the optical wireless channel with fading, together with a bit error rate model with scintillation; we then provide a new turbo code method for use in the OWC system, and show that a better BER curve is obtained for the OWC system with the twofold turbo code than with a common turbo code.

  16. Integration of QR codes into an anesthesia information management system for resident case log management.

    PubMed

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction amongst residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
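
    A case-log QR payload of this kind can be produced with a few lines of Python. The sketch below assumes the third-party qrcode package (pip install qrcode[pil]) and an invented payload layout; the actual AIMS export format is not described in the record.

    # Hedged sketch: encode an end-of-case summary as a QR image.
    # The payload fields below are illustrative, not the AIMS export schema.
    import qrcode

    case_summary = "case_id=12345|date=2015-03-01|procedure=lap_chole|anesthesia=GA"
    img = qrcode.make(case_summary)     # returns a PIL image of the QR code
    img.save("case_12345_qr.png")       # residents scan this with a phone/tablet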

  17. Ultrasonic scanning system for in-place inspection of brazed tube joints

    NASA Technical Reports Server (NTRS)

    Haynes, J. L.; Wages, C. G.; Haralson, H. S. (Inventor)

    1973-01-01

    A miniaturized ultrasonic scanning system for nondestructive in-place, non-immersion testing of brazed joints in stainless-steel tubing is described. The system is capable of scanning brazed tube joints, with limited clearance access, in 1/4 through 5/8 inch union, tee, elbow and cross configurations. The system has the capability to detect defective conditions not associated with material density changes in addition to those which are dependent upon density variations. The system includes a miniaturized scanning head assembly that fits around a tube joint and rotates the transducer around and down the joint in a continuous spiral motion. The C-scan recorder is similar in principle to conventional models except that it was specially designed to track the continuous spiral scan of the tube joint. The scanner and recorder can be operated with most commercially available ultrasonic flaw detectors.

  18. Interferometry-based free space communication and information processing

    NASA Astrophysics Data System (ADS)

    Arain, Muzammil Arshad

    This dissertation studies, analyzes, and experimentally demonstrates the innovative use of the interference phenomenon in the field of opto-electronic information processing and optical communications. A number of optical systems using interferometric techniques both in the optical and the electronic domains have been demonstrated in the fields of signal transmission and processing, optical metrology, defense, and physical sensors. Specifically, it has been shown that the interference of waves in the form of holography can be exploited to realize a novel optical scanner called the Code Multiplexed Optical Scanner (C-MOS). The C-MOS features a large aperture, wide scan angles, 3-D beam control, no moving parts, and high beam scanning resolution. A C-MOS-based free space optical transceiver for bi-directional communication has also been experimentally demonstrated. For high speed, large bandwidth, and high frequency operation, an optically implemented reconfigurable RF transversal filter design is presented that implements a wide range of filtering algorithms. A number of techniques using heterodyne interferometry via acousto-optic devices for optical path length measurements are described. Finally, a whole new class of interferometric sensors for optical metrology and sensing applications is presented. A non-traditional interferometric output signal processing scheme has been developed. Applications include, for example, temperature sensors for harsh environments over a wide temperature range from room temperature to 1000°C.

  19. Ion recombination and polarity correction factors for a plane-parallel ionization chamber in a proton scanning beam.

    PubMed

    Liszka, Małgorzata; Stolarczyk, Liliana; Kłodowska, Magdalena; Kozera, Anna; Krzempek, Dawid; Mojżeszek, Natalia; Pędracka, Anna; Waligórski, Michael Patrick Russell; Olko, Paweł

    2018-01-01

    To evaluate the effect on charge collection in the ionization chamber (IC) in proton pencil beam scanning (PBS), where the local dose rate may exceed the dose rates encountered in conventional MV therapy by up to three orders of magnitude. We measured values of the ion recombination (k_s) and polarity (k_pol) correction factors in water, for a plane-parallel Markus TM23343 IC, using the cyclotron-based Proteus-235 therapy system with an active proton PBS of energies 30-230 MeV. Values of k_s were determined from extrapolation of the saturation curve and the Two-Voltage Method (TVM), for planar fields. We compared our experimental results with those obtained from theoretical calculations. The PBS dose rates were estimated by combining direct IC measurements with results of simulations performed using the FLUKA MC code. Values of k_s were also determined by the TVM for uniformly irradiated volumes over different ranges and modulation depths of the proton PBS, with or without range shifter. By measuring charge collection efficiency versus applied IC voltage, we confirmed that, with respect to ion recombination, our proton PBS represents a continuous beam. For a given chamber parameter, e.g., nominal voltage, the value of k_s depends on the energy and the dose rate of the proton PBS, reaching c. 0.5% for the TVM, at the dose rate of 13.4 Gy/s. For uniformly irradiated regular volumes, the k_s value was significantly smaller, within 0.2% or 0.3% for irradiations with or without range shifter, respectively. Within measurement uncertainty, the average value of k_pol, for the Markus TM23343 IC, was close to unity over the whole investigated range of clinical proton beam energies. While no polarity effect was observed for the Markus TM23343 IC in our pencil scanning proton beam system, the effect of volume recombination cannot be ignored. © 2017 American Association of Physicists in Medicine.
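
    For reference, the two-voltage estimate of k_s for a continuous beam can be computed as in the sketch below. The formula follows the standard IAEA TRS-398 continuous-beam expression; voltages and readings are placeholders, not values from this study.

    # Two-Voltage Method for a continuous beam (standard TRS-398 form):
    #   k_s = ((V1/V2)**2 - 1) / ((V1/V2)**2 - M1/M2)
    # with V1 the normal (higher) polarizing voltage and M1, M2 the readings.
    def ks_two_voltage_continuous(V1, V2, M1, M2):
        ratio = V1 / V2
        return (ratio ** 2 - 1.0) / (ratio ** 2 - M1 / M2)

    # Placeholder readings for a plane-parallel chamber at V1 = 300 V, V2 = 100 V.
    print(ks_two_voltage_continuous(300.0, 100.0, M1=1.0000, M2=0.9956))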

  20. TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics

    DOE PAGES

    Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...

    2015-04-16

    Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software-hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators—parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software-hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
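
    The core idea of an application simulator, replacing time-intensive compute stages by the passage of simulated time, can be sketched with a minimal event loop. Stage names and durations below are invented placeholders, not TADSim's actual model.

    # Minimal discrete-event sketch of "replace compute by elapsed time":
    # each abstracted stage only advances a simulated clock.
    import heapq

    def simulate(stages, repeats=3):
        """stages: list of (name, duration_s); prints completion events and
        returns total simulated time."""
        clock, events = 0.0, []
        for r in range(repeats):
            for name, dt in stages:
                clock += dt                          # stage runs, no real work
                heapq.heappush(events, (clock, f"{name}#{r}"))
        while events:                                # replay in time order
            t, label = heapq.heappop(events)
            print(f"t={t:8.2f}s  finished {label}")
        return clock

    # Invented durations standing in for the time-intensive TAD elements.
    simulate([("md_block", 12.0), ("transition_check", 1.5), ("neb_saddle", 30.0)])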

  1. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    reliability, e.g., Turbo Codes [2] and Low Density Parity Check (LDPC) codes [3]. The challenge to apply both MIMO and ECC into wireless systems is on... The report illustrates the performance of coded LR-aided detectors.

  2. Expert system for maintenance management of a boiling water reactor power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong Shen; Liou, L.W.; Levine, S.

    1992-01-01

    An expert system code has been developed for the maintenance of two boiling water reactor units in Berwick, Pennsylvania, that are operated by the Pennsylvania Power and Light Company (PP and L). The objective of this expert system code, where the knowledge of experienced operators and engineers is captured and implemented, is to support the decisions regarding which components can be safely and reliably removed from service for maintenance. It can also serve as a query-answering facility for checking the plant system status and for training purposes. The operating and maintenance information of a large number of support systems, which must be available for emergencies and/or in the event of an accident, is stored in the data base of the code. It identifies the relevant technical specifications and management rules for shutting down any one of the systems or removing a component from service to support maintenance. Because of the complexity and time needed to incorporate a large number of systems and their components, the first phase of the expert system develops a prototype code, which includes only the reactor core isolation coolant system, the high-pressure core injection system, the instrument air system, the service water system, and the plant electrical system. The next phase is scheduled to expand the code to include all other systems. This paper summarizes the prototype code and the design concept of the complete expert system code for maintenance management of all plant systems and components.

  3. A ring transducer system for medical ultrasound research.

    PubMed

    Waag, Robert C; Fedewa, Russell J

    2006-10-01

    An ultrasonic ring transducer system has been developed for experimental studies of scattering and imaging. The transducer consists of 2048 rectangular elements with a 2.5-MHz center frequency, a 67% -6 dB bandwidth, and a 0.23-mm pitch arranged in a 150-mm-diameter ring with a 25-mm elevation. At the center frequency, the element size is 0.30λ × 42λ and the pitch is 0.38λ. The system has 128 parallel transmit channels, 16 parallel receive channels, a 2048:128 transmit multiplexer, a 2048:16 receive multiplexer, independently programmable transmit waveforms with 8-bit resolution, and receive amplifiers with time variable gain independently programmable over a 40-dB range. Receive signals are sampled at 20 MHz with 12-bit resolution. Arbitrary transmit and receive apertures can be synthesized. Calibration software minimizes system nonidealities caused by noncircularity of the ring and element-to-element response differences. Application software enables the system to be used by specification of high-level parameters in control files from which low-level hardware-dependent parameters are derived by specialized code. Use of the system is illustrated by producing focused and steered beams, synthesizing a spatially limited plane wave, measuring angular scattering, and forming B-scan images.

  4. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.

  5. Architecture and implementation considerations of a high-speed Viterbi decoder for a Reed-Muller subcode

    NASA Technical Reports Server (NTRS)

    Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.

    1996-01-01

    The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the integrated circuit prototype sub-trellis IC, particularly focusing on the design methodology, is addressed.

  6. Nurses' attitudes toward the use of the bar-coding medication administration system.

    PubMed

    Marini, Sana Daya; Hasman, Arie; Huijer, Huda Abu-Saad; Dimassi, Hani

    2010-01-01

    This study determines nurses' attitudes toward bar-coding medication administration system use. Some of the factors underlying the successful use of bar-coding medication administration systems, viewed as a connotative indicator of users' attitudes, were used to gather data that describe the attitudinal basis for system adoption and use decisions in terms of subjective satisfaction. Only 67 nurses in the United States had the chance to respond to the e-questionnaire posted on the CARING list server for the months of June and July 2007. Participants rated their satisfaction with bar-coding medication administration system use based on system functionality, usability, and its positive/negative impact on the nursing practice. Results showed a somewhat positive attitude, but the image profile draws attention to nurses' concerns for improving certain system characteristics. Nurses with high bar-coding medication administration system skills showed a more negative perception of the system. The reasons underlying dissatisfaction with bar-coding medication administration use by skillful users are an important source of knowledge that can be helpful for system development as well as system deployment. As a result, strengthening bar-coding medication administration system usability by magnifying its ability to eliminate medication errors and the contributing factors, maximizing system functionality by ascertaining its power as an extra eye in the medication administration process, and impacting the clinical nursing practice positively by being helpful to nurses, speeding up the medication administration process, and being user-friendly can offer a congenial setting for establishing a positive attitude toward system use, which in turn leads to successful bar-coding medication administration system use.

  7. Chirp-coded excitation imaging with a high-frequency ultrasound annular array.

    PubMed

    Mamou, Jonathan; Ketterling, Jeffrey A; Silverman, Ronald H

    2008-02-01

    High-frequency ultrasound (HFU, > 15 MHz) is an effective means of obtaining fine-resolution images of biological tissues for applications such as ophthalmologic, dermatologic, and small animal imaging. HFU has two inherent drawbacks. First, HFU images have a limited depth of field (DOF) because of the short wavelength and the low fixed F-number of conventional HFU transducers. Second, HFU can be used to image only a few millimeters deep into a tissue because attenuation increases with frequency. In this study, a five-element annular array was used in conjunction with a synthetic-focusing algorithm to extend the DOF. The annular array had an aperture of 10 mm, a focal length of 31 mm, and a center frequency of 17 MHz. To increase penetration depth, 8-μs chirp-coded signals were designed, input into an arbitrary waveform generator, and used to excite each array element. After data acquisition, the received signals were linearly filtered to restore axial resolution and increase the SNR. To compare the chirp-coded imaging method with conventional impulse imaging in terms of resolution, a 25-μm diameter wire was scanned and the -6-dB axial and lateral resolutions were computed at depths ranging from 20.5 to 40.5 mm. The results demonstrated that chirp-coded excitation did not degrade axial or lateral resolution. A tissue-mimicking phantom containing 10-μm glass beads was scanned, and backscattered signals were analyzed to evaluate SNR and penetration depth. Finally, ex vivo ophthalmic images were formed and chirp-coded images showed features that were not visible in conventional impulse images.

  8. Theory and Performance of AIMS for Active Interrogation

    NASA Astrophysics Data System (ADS)

    Walters, William J.; Royston, Katherine E. K.; Haghighat, Alireza

    2014-06-01

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) determination of the neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of the gamma source distribution from (n, γ) interactions; iii) determination of the gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water. In the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. Finally, in the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma flux at a detector window. A code, AIMS (Active Interrogation for Monitoring Special-Nuclear-materials), has been written to output the gamma current for a source-detector assembly scanning across the cargo using the pre-calculated values, and it takes significantly less time than a reference MCNP5 calculation.

  9. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    PubMed

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  10. Influence of Different CAM Strategies on the Fit of Partial Crown Restorations: A Digital Three-dimensional Evaluation.

    PubMed

    Zimmermann, M; Valcanaia, A; Neiva, G; Mehl, A; Fasbinder, D

    2018-04-09

    CAM fabrication is an important step within the CAD/CAM process. The internal fit of restorations is influenced by the accuracy of the subtractive CAM procedure. Little is known about how CAM strategies might influence the fit of CAD/CAM fabricated restorations. The aim of this study was to three-dimensionally evaluate the fit of CAD/CAM fabricated zirconia-reinforced lithium silicate ceramic partial crowns fabricated with three different CAM strategies. The null hypothesis was that different CAM strategies did not influence the fitting accuracy of CAD/CAM fabricated zirconia-reinforced lithium silicate ceramic partial crowns. Preparation for a partial crown was performed on a maxillary right first molar on a typodont. A chairside CAD/CAM system with the intraoral scanning device CEREC Omnicam (Dentsply Sirona, York, PA, USA) and the 3+1 axis milling unit CEREC MCXL was used. There were three groups with different CAM strategies: step bur 12 (12), step bur 12S (12S), and two-step mode (12TWO). The zirconia-reinforced lithium silicate ceramic Celtra Duo (Dentsply Sirona) was used as the CAD/CAM material. A new 3D method for evaluating the fit was applied, consisting of a quadrant scan with the intraoral scanning device CEREC Omnicam. The scan of the PVS material adherent to the preparation and the preparation scan were matched, and the difference analysis was performed with the special software OraCheck (Cyfex AG, Zurich, Switzerland). Three areas were selected for analysis: margin (MA), axial (AX), and occlusal (OC). Statistical analysis was performed using the 80% percentile, one-way ANOVA, and the post hoc Scheffé test with α=0.05. Statistically significant differences were found both within and between the test groups. The axial fit results varied from 90.5 ± 20.1 μm for the two-step milling mode (12TWO_AX) to 122.8 ± 12.2 μm for milling with step bur 12S (12S_AX). The worst result in all groups was found for the occlusal fit, with the highest value, 222.8 ± 35.6 μm, in group 12S. The two-step milling mode group (12TWO) performed statistically significantly better than groups 12 and 12S for the occlusal fit (p<0.05). Deviation patterns were visually analyzed with a color-coded scheme for each restoration. CAM strategy influenced the internal adaptation of zirconia-reinforced lithium silicate partial crowns fabricated with a chairside CAD/CAM system. Sensible selection of specific areas of internal adaptation and fit is an important factor for evaluating the CAM accuracy of CAD/CAM systems.

  11. SU-F-I-53: Coded Aperture Coherent Scatter Spectral Imaging of the Breast: A Monte Carlo Evaluation of Absorbed Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, R; Lakshmanan, M; Fong, G

    Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography; however, the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy-sensitive, photon-counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. These input spectra are cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to a minimum while still maintaining clinically viable image quality.

  12. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks.

    PubMed

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-09-20

    Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks are rampant, as malicious URLs are spread covertly through quick response (QR) codes to control compromised accounts in MSNs and propagate malicious messages. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one type is to analyze the potential threats on wireless access points and the potential threats on handheld devices' operating systems so as to stop compromised accounts from spreading malicious messages; the other type is to apply the method of detecting compromised accounts in online social networks to MSNs. These types of methods focus neither on the problems of MSNs themselves nor on the interaction of sensors' messages, which leads to the restrictiveness of platforms and the simplification of methods. In order to stop the spreading of compromised accounts in MSNs effectively, the attacks have to be traced to their sources first. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify the compromised accounts in MSNs. This paper analyzes the diversity of information sending modes of compromised accounts and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy so as to construct an entropy-based model based on machine learning strategies. To achieve the goal, about 500,000 accounts of Sina Weibo and about 100 million corresponding messages were collected. Through the validation, the accuracy rate of the model is shown to be as high as 87.6%, and the false positive rate is only 3.7%. Meanwhile, the comparative experiments of the feature sets prove that sensor-based location information can be applied to detect the compromised accounts in MSNs.
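
    The entropy-based features mentioned above can be illustrated with a small calculation of the Shannon entropy of an account's message-sending modes and its conditional entropy given a (coarse) GPS grid cell. The toy encoding below is an assumption for illustration, not the paper's exact feature set.

    # Illustrative Shannon entropy H(X) and conditional entropy H(X | Y)
    # of sending behavior (toy data, not the paper's features).
    from collections import Counter
    from math import log2

    def entropy(xs):
        n = len(xs)
        return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

    def conditional_entropy(xs, given):
        """H(X | Y) for paired observations xs, given."""
        n = len(xs)
        total = 0.0
        for y, cy in Counter(given).items():
            sub = [x for x, g in zip(xs, given) if g == y]
            total += (cy / n) * entropy(sub)
        return total

    modes = ["web", "web", "api", "api", "api", "qr", "api", "api"]  # sending modes
    cells = ["c1", "c1", "c2", "c2", "c2", "c1", "c2", "c2"]         # GPS grid cells
    print(entropy(modes), conditional_entropy(modes, cells))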

  13. Combating QR-Code-Based Compromised Accounts in Mobile Social Networks

    PubMed Central

    Guo, Dong; Cao, Jian; Wang, Xiaoqi; Fu, Qiang; Li, Qiang

    2016-01-01

    Cyber Physical Social Sensing makes mobile social networks (MSNs) popular with users. However, attacks are rampant, as malicious URLs are spread covertly through quick response (QR) codes to control compromised accounts in MSNs and propagate malicious messages. Currently, there are generally two types of methods to identify compromised accounts in MSNs: one type is to analyze the potential threats on wireless access points and the potential threats on handheld devices’ operating systems so as to stop compromised accounts from spreading malicious messages; the other type is to apply the method of detecting compromised accounts in online social networks to MSNs. These types of methods focus neither on the problems of MSNs themselves nor on the interaction of sensors’ messages, which leads to the restrictiveness of platforms and the simplification of methods. In order to stop the spreading of compromised accounts in MSNs effectively, the attacks have to be traced to their sources first. Through sensors, users exchange information in MSNs and acquire information by scanning QR codes. Therefore, analyzing the traces of sensor-related information helps to identify the compromised accounts in MSNs. This paper analyzes the diversity of information sending modes of compromised accounts and normal accounts, analyzes the regularity of GPS (Global Positioning System)-based location information, and introduces the concepts of entropy and conditional entropy so as to construct an entropy-based model based on machine learning strategies. To achieve the goal, about 500,000 accounts of Sina Weibo and about 100 million corresponding messages were collected. Through the validation, the accuracy rate of the model is shown to be as high as 87.6%, and the false positive rate is only 3.7%. Meanwhile, the comparative experiments of the feature sets prove that sensor-based location information can be applied to detect the compromised accounts in MSNs. PMID:27657071

  14. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 48, Federal Acquisition Regulations System, revised as of 2011-10-01: Section 219.303, Determining North American Industry Classification System (NAICS) codes and size standards.

  15. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 48, Federal Acquisition Regulations System, revised as of 2012-10-01: Section 219.303, Determining North American Industry Classification System (NAICS) codes and size standards.

  16. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 48, Federal Acquisition Regulations System, revised as of 2014-10-01: Section 219.303, Determining North American Industry Classification System (NAICS) codes and size standards.

  17. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 48, Federal Acquisition Regulations System, revised as of 2013-10-01: Section 219.303, Determining North American Industry Classification System (NAICS) codes and size standards.

  18. 75 FR 78707 - Medicare Program; First Semi-Annual Meeting of the Advisory Panel on Ambulatory Payment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... hospital payment systems; hospital medical care delivery systems; provider billing and accounting systems; APC groups; Current Procedural Terminology codes; Health Care Common Procedure Coding System (HCPCS) codes; the use of, and payment for, drugs, medical devices, and other services in the outpatient setting...

  19. IT Security Support for Spaceport Command and Control System

    NASA Technical Reports Server (NTRS)

    McLain, Jeffrey

    2013-01-01

    During the fall 2013 semester, I worked at the Kennedy Space Center as an IT Security Intern in support of the Spaceport Command and Control System under the guidance of the IT Security Lead Engineer. Some of my responsibilities included assisting with security plan documentation collection, system hardware and software inventory, and malicious code and malware scanning. Throughout the semester, I had the opportunity to work on a wide range of security-related projects. However, there are three projects in particular that stand out. The first project I completed was updating a large interactive spreadsheet that details the SANS Institute's Top 20 Critical Security Controls. My task was to add in all of the new commercial off-the-shelf (COTS) software listed on the SANS website that can be used to meet their Top 20 controls. In total, there are 153 unique security tools listed by SANS that meet one or more of their 20 controls. My second project was the creation of a database that will allow my mentor to keep track of the work done by the contractors that report to him in a more efficient manner by recording events as they occur throughout the quarter. Lastly, I expanded upon a security assessment of the Linux machines being used on center that I began last semester. To do this, I used a vulnerability and configuration tool that scans hosts remotely through the network and presents the user with an abundance of information detailing each machine's configuration. The experience I gained from working on each of these projects has been invaluable, and I look forward to returning in the spring semester to continue working with the IT Security team.

  20. Survey of adaptive image coding techniques

    NASA Technical Reports Server (NTRS)

    Habibi, A.

    1977-01-01

    The general problem of image data compression is discussed briefly with attention given to the use of Karhunen-Loeve transforms, suboptimal systems, and block quantization. A survey is then conducted encompassing the four categories of adaptive systems: (1) adaptive transform coding (adaptive sampling, adaptive quantization, etc.), (2) adaptive predictive coding (adaptive delta modulation, adaptive DPCM encoding, etc.), (3) adaptive cluster coding (blob algorithms and the multispectral cluster coding technique), and (4) adaptive entropy coding.

  1. The analysis of convolutional codes via the extended Smith algorithm

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Onyszchuk, I.

    1993-01-01

    Convolutional codes have been the central part of most error-control systems in deep-space communication for many years. Almost all such applications, however, have used the restricted class of (n,1), also known as 'rate 1/n,' convolutional codes. The more general class of (n,k) convolutional codes contains many potentially useful codes, but their algebraic theory is difficult and has proved to be a stumbling block in the evolution of convolutional coding systems. In this article, the situation is improved by describing a set of practical algorithms for computing certain basic things about a convolutional code (among them the degree, the Forney indices, a minimal generator matrix, and a parity-check matrix), which are usually needed before a system using the code can be built. The approach is based on the classic Forney theory for convolutional codes, together with the extended Smith algorithm for polynomial matrices, which is introduced in this article.

  2. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  3. Ultra Safe And Secure Blasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M M

    2009-07-27

    The Ultra is a blasting system that is designed for special applications where the risk and consequences of unauthorized demolition or blasting are so great that the use of an extraordinarily safe and secure blasting system is justified. Such a blasting system would be connected and logically welded together through digital code-linking as part of the blasting system set-up and initialization process. The Ultra's security is so robust that it will defeat even the people who designed and built the components in any attempt at unauthorized detonation. Anyone attempting to gain unauthorized control of the system by substituting components or tapping into communications lines will be thwarted by their inability to provide encrypted authentication. Authentication occurs through the use of codes that are generated by the system during initialization code-linking, and the codes remain unknown to anyone, including the authorized operator. Once code-linked, a closed system has been created. The system requires all components to be connected as they were during initialization, as well as a unique code entered by the operator, for function and blasting.

  4. A survey to identify the clinical coding and classification systems currently in use across Europe.

    PubMed

    de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J

    2001-01-01

    This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership of it. We sought to identify what systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record, and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems. This transfer can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. Methods comprised a literature and Internet search, and requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe. There are relatively few coding systems in existence across Europe; ICD-9 and ICD-10, ICPC and Read were the most established. However, the local adaptation of these classification systems, either on a by-country or by-computer-software-manufacturer basis, significantly reduces the ability for the meaning coded within patients' computer records to be easily transferred from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of different classification systems should be encouraged. Countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.

  5. Practical guide to bar coding for patient medication safety.

    PubMed

    Neuenschwander, Mark; Cohen, Michael R; Vaida, Allen J; Patchett, Jeffrey A; Kelly, Jamie; Trohimovich, Barbara

    2003-04-15

    Bar coding for the medication administration step of the drug-use process is discussed. FDA will propose a rule in 2003 that would require bar-code labels on all human drugs and biologicals. Even with an FDA mandate, manufacturer procrastination and possible shifts in product availability are likely to slow progress. Such delays should not preclude health systems from adopting bar-code-enabled point-of-care (BPOC) systems to achieve gains in patient safety. Bar-code technology is a replacement for traditional keyboard data entry. The elements of bar coding are content, which determines the meaning; data format, which refers to the embedded data; and symbology, which describes the "font" in which the machine-readable code is written. For a BPOC system to deliver an acceptable level of patient protection, the hospital must first establish reliable processes for a patient identification band, caregiver badge, and medication bar coding. Medications can have either drug-specific or patient-specific bar codes. Both varieties result in the desired code that supports the patient's five rights of drug administration. When medications are not available from the manufacturer in immediate-container bar-coded packaging, other means of applying the bar code must be devised, including the use of repackaging equipment, overwrapping, manual bar coding, and outsourcing. Virtually all medications should be bar coded, the bar code on the label should be easily readable, and appropriate policies, procedures, and checks should be in place. Bar coding has the potential not only to be cost-effective but also to produce a return on investment. By bar coding patient identification tags, caregiver badges, and immediate-container medications, health systems can substantially increase patient safety during medication administration.
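
    The bedside check a BPOC system performs can be sketched as a simple match of the three scanned bar codes (patient band, caregiver badge, medication) against an electronic medication administration record entry. The data model below is an invented placeholder, not a specific vendor's schema.

    # Sketch of a BPOC "five rights" check against an eMAR entry
    # (made-up record layout for illustration only).
    from datetime import datetime, timedelta

    emar_order = {
        "patient_id": "PT-0042",
        "ndc": "12345-6789-01",          # drug-specific bar code content (fake NDC)
        "dose": "500 mg",
        "route": "PO",
        "due": datetime(2003, 4, 15, 9, 0),
    }

    def five_rights_ok(scanned_patient, scanned_ndc, dose, route, now, window_min=60):
        """Right patient, drug, dose, route, and time (within a tolerance window)."""
        return (scanned_patient == emar_order["patient_id"]
                and scanned_ndc == emar_order["ndc"]
                and dose == emar_order["dose"]
                and route == emar_order["route"]
                and abs(now - emar_order["due"]) <= timedelta(minutes=window_min))

    print(five_rights_ok("PT-0042", "12345-6789-01", "500 mg", "PO",
                         datetime(2003, 4, 15, 9, 20)))   # True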

  6. Mapping educational opportunities for healthcare workers on antimicrobial resistance and stewardship around the world.

    PubMed

    Rogers Van Katwyk, Susan; Jones, Sara L; Hoffman, Steven J

    2018-02-05

    Antimicrobial resistance is an important global issue facing society. Healthcare workers need to be engaged in solving this problem, as advocates for rational antimicrobial use, stewards of sustainable effectiveness, and educators of their patients. To fulfill this role, healthcare workers need access to training and educational resources on antimicrobial resistance. To better understand the resources available to healthcare workers, we undertook a global environmental scan of educational programs and resources targeting healthcare workers on the topic of antimicrobial resistance and antimicrobial stewardship. Programs were identified through contact with key experts, web searching, and academic literature searching. We summarized programs in tabular form, including participating organizations, region, and intended audience. We developed a coding system to classify programs by program type and participating organization type, assigning multiple codes as necessary and creating summary charts for program types, organization types, and intended audience to illustrate the breadth of available resources. We identified 94 educational initiatives related to antimicrobial resistance and antimicrobial stewardship, which represent a diverse array of programs including courses, workshops, conferences, guidelines, public outreach materials, and online-resource websites. These resources were developed by a combination of government bodies, professional societies, universities, non-profit and community organizations, hospitals and healthcare centers, and insurance companies and industry. Most programs either targeted healthcare workers collectively or specifically targeted physicians. A smaller number of programs were aimed at other healthcare worker groups including pharmacists, nurses, midwives, and healthcare students. Our environmental scan shows that there are many organizations working to develop and share educational resources for healthcare workers on antimicrobial resistance and antimicrobial stewardship. Governments, hospitals, and professional societies appear to be driving action on this front, sometimes working with other types of organizations. A broad range of resources have been made freely available; however, we have noted several opportunities for action, including increased engagement with students, improvements to pre-service education, recognition of antimicrobial resistance courses as continuing medical education, and better platforms for resource-sharing online.

  7. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology.

    PubMed

    Chen, Shuo; Luo, Chenggao; Deng, Bin; Wang, Hongqiang; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-01-19

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporally independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single-input multiple-output (SIMO) technology, which can sharply reduce the coding and sampling times. The coded aperture applied in the proposed TCAI architecture loads either a purposive or a random phase modulation factor. In the transmitting process, the purposive phase modulation factor steers the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase modulation factor modulates the terahertz wave to be spatiotemporally independent, for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and are then synthesized to obtain the complete high-resolution image. For each imaging cell, the multi-resolution imaging method helps to reduce the computational burden of a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging of 3D targets in much less time and has great potential in applications such as security screening, nondestructive detection, and medical diagnosis.

  8. Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.

    2007-03-01

    In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.
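
    As a rough illustration of the benchmarking step described above, the following Python sketch computes the percent deviation between simulated and measured radial dose values; the arrays are hypothetical placeholders, not data from the study.

    ```python
    import numpy as np

    # Hypothetical radial dose profiles (mGy) from phantom measurements and
    # a Monte Carlo simulation; the values are illustrative only.
    measured = np.array([12.1, 11.4, 10.2, 9.0, 8.1])   # center -> periphery
    simulated = np.array([12.4, 11.2, 10.5, 8.8, 8.3])

    # Percent deviation of the simulation relative to the measurement.
    deviation_pct = 100.0 * (simulated - measured) / measured
    print(deviation_pct)

    # Benchmark criterion analogous to the one reported (deviations within ~5%).
    print("within 5%:", np.all(np.abs(deviation_pct) <= 5.0))
    ```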

  9. A microprocessor controlled pressure scanning system

    NASA Technical Reports Server (NTRS)

    Anderson, R. C.

    1976-01-01

    A microprocessor-based controller and data logger for pressure scanning systems is described. The microcomputer positions and manages data from as many as four 48-port electro-mechanical pressure scanners. The maximum scanning rate is 80 pressure measurements per second (20 ports per second on each of four scanners). The system features on-line calibration, position-directed data storage, and once-per-scan display in engineering units of data from a selected port. The system is designed to be interfaced to a facility computer through a shared memory. System hardware and software are described. Factors affecting measurement error in this type of system are also discussed.

  10. System Design for FEC in Aeronautical Telemetry

    DTIC Science & Technology

    2012-03-12

    rate punctured convolutional codes for soft decision Viterbi... below follows that given in [8]. The final coding rate of exactly 2/3 is achieved by puncturing the rate-1/2 code as follows. We begin with the buffer c1... concatenated convolutional code (SCCC). The contributions of this paper are on the system-design level. One major contribution is to design an SCCC code

  11. Biomass Economy

    DTIC Science & Technology

    1985-11-01

    Boiler and Pressure Vessel Code; HEI Heat Exchanger Institute; Heat and Material Balance... c. System Description (1) Condenser... Boiler and Pressure Vessel Code; ANSI B31.1 Power Piping... d. System Description (1) Deaerator The deaerator will be a direct contact feedwater heater, and... vent, and drain piping. b. Applicable Codes: ASME Boiler and Pressure Vessel Code; ANSI B31.1 - Power Piping Code

  12. Augmented burst-error correction for UNICON laser memory. [digital memory

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1974-01-01

    A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long Fire code with code length n greater than 16,768 bits is used as an outer code to augment an existing, shorter inner Fire code for burst-error correction. The inner Fire code is an (80,64) code shortened from the (630,614) code, and it is used to correct a single burst error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single burst error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented in hardware. A minicomputer, currently used as the UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to achieve a very low error rate in spite of flaws affecting the recorded data.

  13. Control electronics for a multi-laser/multi-detector scanning system

    NASA Technical Reports Server (NTRS)

    Kennedy, W.

    1980-01-01

    The Mars Rover Laser Scanning system uses a precision laser pointing mechanism, a photodetector array, and the concept of triangulation to perform three-dimensional scene analysis. The system is used for real-time terrain sensing and vision. The Multi-Laser/Multi-Detector laser scanning system is controlled by a digital device called the ML/MD controller. A next-generation laser scanning system, based on the Level 2 controller, is microprocessor-based; its capabilities far exceed those of the ML/MD device. The first-draft circuit details and general software structure are presented.
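
    The triangulation principle mentioned above can be illustrated with a minimal sketch; the geometry, function name and example numbers below are assumptions for illustration, not the ML/MD controller's actual implementation.

    ```python
    import math

    def triangulate_range(baseline_m: float, laser_angle_rad: float, detector_angle_rad: float) -> float:
        """Planar triangulation: a laser and a detector separated by a known
        baseline observe the same surface point; the range follows from the
        law of sines."""
        # Angle subtended at the target point.
        apex = math.pi - laser_angle_rad - detector_angle_rad
        # Distance from the laser to the target (law of sines).
        return baseline_m * math.sin(detector_angle_rad) / math.sin(apex)

    # Example: 0.5 m baseline, laser at 70 deg and detector at 80 deg from the baseline.
    print(triangulate_range(0.5, math.radians(70), math.radians(80)))
    ```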

  14. The Effects of Computerized Auditory Feedback on Electronic Article Surveillance Tag Placement in an Auto-Parts Distribution Center

    ERIC Educational Resources Information Center

    Goomas, David T.

    2008-01-01

    In this report from the field, computerized auditory feedback was used to inform order selectors and order selector auditors in a distribution center to add an electronic article surveillance (EAS) adhesive tag. This was done by programming handheld computers to emit a loud beep for high-priced items upon scanning the item's bar-coded Universal…

  15. The Emergence of Atomic-Level Structural Information for Ordered Metal-Solution Interfaces: Some Recent Contributions from In-Situ Infrared Spectroscopy and Scanning Tunneling Microscopy

    DTIC Science & Technology

    1992-02-28

    ... reconstruction at gold-aqueous interfaces. All three low-index gold surfaces are... uncharged (vide supra). This difference may well be due to the influence of the interfacial water, or conceivably to adsorbed perchlorate anions. Both Au

  16. Understanding effects in reviews of implementation interventions using the Theoretical Domains Framework.

    PubMed

    Little, Elizabeth A; Presseau, Justin; Eccles, Martin P

    2015-06-17

    Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and both the total number of times domains were targeted and the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review included "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between the effect size for BMD scanning and both the number of times and the number of different domains coded, but not for bisphosphonate prescription, suggesting that the more domains an intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to retrospectively identify the likely targeted factors using theoretical frameworks such as the TDF. In osteoporosis management, this suggested that several likely determinants of healthcare professional behaviour appear not yet to have been considered in implementation interventions. This approach may serve as a useful basis for using theory-based frameworks such as the TDF to retrospectively identify targeted factors within systematic reviews of implementation interventions in other implementation contexts.

  17. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  18. A family of chaotic pure analog coding schemes based on baker's map function

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Jing; Lu, Xuanxuan; Yuen, Chau; Wu, Jun

    2015-12-01

    This paper considers a family of pure analog coding schemes constructed from dynamic systems governed by chaotic functions: the baker's map function and its variants. Various decoding methods, including maximum likelihood (ML), minimum mean square error (MMSE), and mixed ML-MMSE decoding algorithms, have been developed for these novel encoding schemes. The proposed mirrored baker's and single-input baker's analog codes provide balanced protection against fold errors (large distortion) and weak distortion, and outperform the classical chaotic analog coding and analog joint source-channel coding schemes in the literature. Compared to a conventional digital communication system, where quantization and digital error correction codes are used, the proposed analog coding system has graceful performance evolution, low decoding latency, and no quantization noise. Numerical results show that under the same bandwidth expansion, the proposed analog system outperforms the digital ones over a wide signal-to-noise ratio (SNR) range.
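
    A minimal sketch of the classical baker's map, iterated as a toy analog encoder, may help make the construction concrete; the mirrored and single-input variants proposed in the paper are not reproduced here, and the seed choice is arbitrary.

    ```python
    import numpy as np

    def bakers_map(x: float, y: float):
        """One iteration of the classical baker's map on the unit square."""
        if x < 0.5:
            return 2.0 * x, 0.5 * y
        return 2.0 * x - 1.0, 0.5 * y + 0.5

    def analog_encode(u: float, n_iter: int = 8) -> np.ndarray:
        """Toy analog encoder: iterate the baker's map from the source value and
        transmit the successive x-coordinates as unquantized channel symbols."""
        x, y = u, 0.5  # the fixed y seed is an arbitrary choice for this sketch
        symbols = []
        for _ in range(n_iter):
            x, y = bakers_map(x, y)
            symbols.append(x)
        return np.array(symbols)

    print(analog_encode(0.3141592))
    ```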

  19. Integral force feedback control with input shaping: Application to piezo-based scanning systems in ECDLs.

    PubMed

    Zhang, Meng; Liu, Zhigang; Zhu, Yu; Bu, Mingfan; Hong, Jun

    2017-07-01

    In this paper, a hybrid control system is developed by integrating the closed-loop force feedback and input shaping method to overcome the problem of the hysteresis and dynamic behavior in piezo-based scanning systems and increase the scanning speed of tunable external cavity diode lasers. The flexible hinge and piezoelectric actuators are analyzed, and a dynamic model of the scanning systems is established. A force sensor and an integral controller are utilized in integral force feedback (IFF) to directly augment the damping of the piezoelectric scanning systems. Hysteresis has been effectively eliminated, but the mechanical resonance is still evident. Noticeable residual vibration occurred after the inflection points and then gradually disappeared. For the further control of mechanical resonance, based on the theory of minimum-acceleration trajectory planning, the time-domain input shaping method was developed. The turning sections of a scanning trajectory are replaced by smooth curves, while the linear sections are retained. The IFF method is combined with the input shaping method to control the non-linearity and mechanical resonance in high-speed piezo-based scanning systems. Experiments are conducted, and the results demonstrate the effectiveness of the proposed control approach.

  20. Integral force feedback control with input shaping: Application to piezo-based scanning systems in ECDLs

    NASA Astrophysics Data System (ADS)

    Zhang, Meng; Liu, Zhigang; Zhu, Yu; Bu, Mingfan; Hong, Jun

    2017-07-01

    In this paper, a hybrid control system is developed by integrating the closed-loop force feedback and input shaping method to overcome the problem of the hysteresis and dynamic behavior in piezo-based scanning systems and increase the scanning speed of tunable external cavity diode lasers. The flexible hinge and piezoelectric actuators are analyzed, and a dynamic model of the scanning systems is established. A force sensor and an integral controller are utilized in integral force feedback (IFF) to directly augment the damping of the piezoelectric scanning systems. Hysteresis has been effectively eliminated, but the mechanical resonance is still evident. Noticeable residual vibration occurred after the inflection points and then gradually disappeared. For the further control of mechanical resonance, based on the theory of minimum-acceleration trajectory planning, the time-domain input shaping method was developed. The turning sections of a scanning trajectory are replaced by smooth curves, while the linear sections are retained. The IFF method is combined with the input shaping method to control the non-linearity and mechanical resonance in high-speed piezo-based scanning systems. Experiments are conducted, and the results demonstrate the effectiveness of the proposed control approach.
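
    The trajectory-shaping idea described above (smooth the turnarounds, keep the linear sections) can be sketched as follows; the Hann-window smoothing is an illustrative assumption and is not the minimum-acceleration blend used in the paper.

    ```python
    import numpy as np

    def smoothed_scan_trajectory(n_samples=2000, period=500, amplitude=1.0, kernel_len=41):
        """Triangle scan trajectory with its sharp turning points smoothed.

        Convolving with a short, normalized symmetric window leaves the strictly
        linear sections unchanged (a symmetric average of a line is the line
        itself) and only rounds the corners within half a kernel length."""
        t = np.arange(n_samples)
        # Ideal triangle wave in [-amplitude, +amplitude].
        tri = amplitude * (2.0 * np.abs(2.0 * (t / period - np.floor(t / period + 0.5))) - 1.0)
        # Normalized Hann window as the smoothing kernel (an illustrative choice).
        kernel = np.hanning(kernel_len)
        kernel /= kernel.sum()
        return np.convolve(tri, kernel, mode="same")

    trajectory = smoothed_scan_trajectory()
    ```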

  1. Error Control Coding Techniques for Space and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    2000-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit-error and frame-error performance. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
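
    The decoder interaction described above can be sketched as a control loop; `turbo_iteration` and `rs_decode` below are hypothetical stubs standing in for the actual soft-in/soft-out turbo decoder and the reliability-based Reed-Solomon decoder.

    ```python
    import random

    def turbo_iteration(rx_block, soft_info):
        """Stub for one soft-in/soft-out inner turbo decoding iteration."""
        return list(rx_block)            # pretend the soft output equals the input

    def rs_decode(soft_info):
        """Stub for the outer reliability-based Reed-Solomon decoder."""
        success = random.random() < 0.5  # randomly succeed, just to exercise the loop
        return success, (soft_info if success else None)

    def decode_concatenated(rx_block, max_iterations=8):
        """Control flow of the outer/inner decoder interaction: run one inner
        iteration at a time, try the outer decoder, and stop early on success
        or when the preset maximum number of iterations is reached."""
        soft_info = None
        for iteration in range(1, max_iterations + 1):
            soft_info = turbo_iteration(rx_block, soft_info)
            ok, decoded = rs_decode(soft_info)
            if ok:
                return decoded, iteration        # early termination
        return None, max_iterations              # outer decoding never succeeded

    print(decode_concatenated([0.9, -1.1, 0.8]))
    ```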

  2. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0-method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal to epithermal neutron flux ratio (f). The calculations involved in determining the concentration used the net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal to epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing the sample concentrations between the code-system and the experiment. The concentration values obtained with the ECC-UKM database code-system agreed with the experimental values with good accuracy.

  3. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

    A lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4 288, 4 020) code with a high code rate of 0.937 is constructed by this construction scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4 288, 4 020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code at the bit error rate (BER) of 10(-6). The irregular QC-LDPC(4 288, 4 020) code has lower encoding/decoding complexity compared with the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code. The proposed QC-LDPC(4 288, 4 020) code is thus more suitable for the increasing requirements of high-speed optical transmission systems.
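
    For readers unfamiliar with net coding gain, the sketch below shows one common way of computing it from a target output BER, the tolerable input BER and the code rate via the Q-factor relation; the numbers are illustrative and not taken from the paper.

    ```python
    import math
    from scipy.special import erfcinv

    def q_factor_db(ber: float) -> float:
        """Q factor (in dB) corresponding to a given BER for binary signalling,
        using BER = 0.5 * erfc(Q / sqrt(2))."""
        q = math.sqrt(2.0) * erfcinv(2.0 * ber)
        return 20.0 * math.log10(q)

    def net_coding_gain_db(target_ber: float, input_ber: float, code_rate: float) -> float:
        """One common net-coding-gain definition: the Q gain between the target
        output BER and the BER tolerated at the decoder input, corrected by the
        code rate."""
        return q_factor_db(target_ber) - q_factor_db(input_ber) + 10.0 * math.log10(code_rate)

    # Illustrative numbers only (not taken from the paper).
    print(net_coding_gain_db(target_ber=1e-6, input_ber=3e-3, code_rate=0.937))
    ```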

  4. Nondestructive Evaluation of Concrete Bridge Decks with Automated Acoustic Scanning System and Ground Penetrating Radar.

    PubMed

    Sun, Hongbin; Pashoutani, Sepehr; Zhu, Jinying

    2018-06-16

    Delaminations and reinforcement corrosion are two common problems in concrete bridge decks. No single nondestructive testing (NDT) method is able to provide comprehensive characterization of these defects. In this work, two NDT methods, acoustic scanning and Ground Penetrating Radar (GPR), were used to image a straight concrete bridge deck and a curved intersection ramp bridge. An acoustic scanning system has been developed for rapid delamination mapping. The system consists of metal-ball excitation sources, air-coupled sensors, and a GPS positioning system. The acoustic scanning results are presented as a two-dimensional image that is based on the energy map in the frequency range of 0.5-5 kHz. The GPR scanning results are expressed as the GPR signal attenuation map to characterize concrete deterioration and reinforcement corrosion. Signal processing algorithms for both methods are discussed. Delamination maps from the acoustic scanning are compared with deterioration maps from the GPR scanning on both bridges. The results demonstrate that combining the acoustic and GPR scanning results provides a complementary and comprehensive evaluation of concrete bridge decks.
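
    A minimal sketch of the band-limited energy computation underlying such an acoustic energy map is given below; the sampling rate, window and synthetic record are assumptions, not the authors' processing chain.

    ```python
    import numpy as np

    def band_energy(signal: np.ndarray, fs: float, f_lo: float = 500.0, f_hi: float = 5000.0) -> float:
        """Energy of a single acoustic record restricted to a frequency band,
        computed from the one-sided FFT magnitude spectrum."""
        spectrum = np.fft.rfft(signal * np.hanning(len(signal)))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return float(np.sum(np.abs(spectrum[band]) ** 2))

    # Example with a synthetic 5 ms record sampled at 50 kHz (assumed values).
    fs = 50_000.0
    t = np.arange(0, 0.005, 1.0 / fs)
    record = np.sin(2 * np.pi * 2000 * t) + 0.1 * np.random.randn(t.size)
    print(band_energy(record, fs))
    ```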

  5. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.

  6. Viewing-zone scanning holographic display using a MEMS spatial light modulator.

    PubMed

    Takaki, Yasuhiro; Fujii, Keisuke

    2014-10-06

    Horizontally scanning holography using a spatial light modulator based on a microelectromechanical system, which we previously proposed for enlarging both the screen size and the viewing zone, utilized a screen scanning system in which elementary holograms are scanned horizontally on the screen. In this study, to enlarge the screen size and the viewing zone, we propose a viewing-zone scanning system with an enlarged hologram screen and a horizontally scanned, reduced viewing zone. The reduced viewing zone is localized using converging light emitted from the screen, and the entire screen can be viewed from the localized viewing zone. An experimental system was constructed, and we demonstrated the generation of reconstructed images with a screen size of 2.0 in, a viewing-zone width of 437 mm at a distance of 600 mm from the screen, and a frame rate of 60 Hz.

  7. Boresight alignment method for mobile laser scanning systems

    NASA Astrophysics Data System (ADS)

    Rieger, P.; Studnicka, N.; Pfennigbauer, M.; Zach, G.

    2010-06-01

    Mobile laser scanning (MLS) is the latest approach towards fast and cost-efficient acquisition of 3-dimensional spatial data. Accurately evaluating the boresight alignment in MLS systems is an obvious necessity. However, recent systems available on the market may lack suitable and efficient practical workflows for performing this calibration. This paper discusses an innovative method for accurately determining the boresight alignment of MLS systems by employing 3D laser scanners. Scanning objects with a 3D laser scanner operating in a 2D line-scan mode from various runs and scan directions provides valuable scan data for determining the angular alignment between the inertial measurement unit and the laser scanner. Field data are presented demonstrating the final accuracy of the calibration and the high quality of the point cloud acquired during an MLS campaign.
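
    A sketch of how estimated boresight angles are typically applied, rotating scanner-frame points into the IMU body frame, is given below; the rotation convention and the numbers are assumptions, not the calibration procedure of the paper.

    ```python
    import numpy as np

    def boresight_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
        """Rotation from the scanner frame to the IMU body frame, composed as
        R = Rz(yaw) @ Ry(pitch) @ Rx(roll); the convention is an assumption."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return rz @ ry @ rx

    # Apply an estimated boresight alignment (radians) to scanner-frame points.
    R_bs = boresight_matrix(0.0012, -0.0008, 0.0030)
    points_scanner = np.array([[10.0, 2.0, -1.5], [8.5, -0.4, 0.2]])
    points_body = points_scanner @ R_bs.T
    print(points_body)
    ```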

  8. Evaluation of acute ischemic stroke using quantitative EEG: a comparison with conventional EEG and CT scan.

    PubMed

    Murri, L; Gori, S; Massetani, R; Bonanni, E; Marcella, F; Milani, S

    1998-06-01

    The sensitivity of quantitative electroencephalography (EEG) was compared with that of conventional EEG in patients with acute ischaemic stroke. In addition, quantitative EEG data were correlated with computerized tomography (CT) scan findings for all areas of lesion in order to reassess the actual role of EEG in the evaluation of stroke. Sixty-five patients were tested with conventional and quantitative EEG within 24 h from the onset of neurological symptoms, whereas CT scan was performed within 4 days from the onset of stroke. EEG was recorded from 19 electrodes placed upon the scalp according to the International 10-20 System. Spectral analysis was carried out on 30 artefact-free 4-sec epochs. For each channel, absolute and relative power were calculated for the delta, theta, alpha and beta frequency bands, and these data were then represented in colour-coded maps. Ten patients with extensive lesions documented by CT scan were excluded. The results indicated that conventional EEG revealed abnormalities in 40 of 55 cases, while EEG mapping showed abnormalities in 46 of 55 cases: it showed focal abnormalities in five cases and nonfocal abnormalities in one of six cases which had appeared to be normal according to visual inspection of the EEG. In a further 11 cases, where the conventional EEG revealed abnormalities in one hemisphere, the quantitative EEG and maps allowed abnormal activity to be localized more precisely. The sensitivity of both methods was higher for frontocentral, temporal and parieto-occipital cortical-subcortical infarctions than for basal ganglia and internal capsule lesions; however, quantitative EEG was more efficient for all areas of lesion in detecting cases that had appeared normal by visual inspection and was clearly superior in revealing focal abnormalities. When we considered the electrode at which the maximum power of the delta frequency band was recorded, a fairly close correlation was found between the localization of the maximum delta power and the position of lesions documented by CT scan for all areas of lesion except those located in the striatocapsular area.
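
    As a rough sketch of the quantitative step described above, the code below estimates per-channel delta-band power with Welch's method and reports the electrode with the maximum delta power; the band edges, sampling rate and synthetic data are assumptions.

    ```python
    import numpy as np
    from scipy.signal import welch

    def max_delta_channel(eeg: np.ndarray, fs: float, channel_names, band=(1.0, 4.0)):
        """Return the electrode with the highest absolute delta-band power.
        `eeg` has shape (n_channels, n_samples); the 1-4 Hz band is an assumption."""
        freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs), axis=-1)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        delta_power = np.trapz(psd[:, mask], freqs[mask], axis=-1)
        return channel_names[int(np.argmax(delta_power))], delta_power

    # Synthetic example: 19 channels of the 10-20 system, 30 s at 256 Hz.
    names = ["Fp1", "Fp2", "F7", "F3", "Fz", "F4", "F8", "T3", "C3", "Cz",
             "C4", "T4", "T5", "P3", "Pz", "P4", "T6", "O1", "O2"]
    eeg = np.random.randn(19, 30 * 256)
    print(max_delta_channel(eeg, fs=256.0, channel_names=names)[0])
    ```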

  9. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  10. Establishing an Environmental Scanning/Forecasting System to Augment College and University Planning.

    ERIC Educational Resources Information Center

    Morrison, James L.

    1987-01-01

    The major benefit of an environmental scanning/forecasting system is in providing critical information for strategic planning. Such a system allows the institution to detect social, technological, economic, and political trends and potential events. The environmental scanning database developed by United Way of America is described. (MLW)

  11. Fast ultra-wideband microwave spectral scanning utilizing photonic wavelength- and time-division multiplexing.

    PubMed

    Li, Yihan; Kuse, Naoya; Fermann, Martin

    2017-08-07

    A high-speed ultra-wideband microwave spectral scanning system is proposed and experimentally demonstrated. Utilizing coherent dual electro-optical frequency combs and a recirculating optical frequency shifter, the proposed system realizes wavelength- and time-division multiplexing at the same time, offering flexibility between scan speed and size, weight and power requirements (SWaP). High-speed spectral scanning spanning from ~1 to 8 GHz with ~1.2 MHz spectral resolution is achieved experimentally within 14 µs. The system can be easily scaled to higher bandwidth coverage, faster scanning speed or finer spectral resolution with suitable hardware.

  12. Short non-coding RNAs as bacteria species identifiers detected by surface plasmon resonance enhanced common path interferometry

    NASA Astrophysics Data System (ADS)

    Greef, Charles; Petropavlovskikh, Viatcheslav; Nilsen, Oyvind; Khattatov, Boris; Plam, Mikhail; Gardner, Patrick; Hall, John

    2008-04-01

    Small non-coding RNA sequences have recently been discovered as unique identifiers of certain bacterial species, raising the possibility that they can be used as highly specific Biowarfare Agent detection markers in automated field deployable integrated detection systems. Because they are present in high abundance they could allow genomic based bacterial species identification without the need for pre-assay amplification. Further, a direct detection method would obviate the need for chemical labeling, enabling a rapid, efficient, high sensitivity mechanism for bacterial detection. Surface Plasmon Resonance enhanced Common Path Interferometry (SPR-CPI) is a potentially market disruptive, high sensitivity dual technology that allows real-time direct multiplex measurement of biomolecule interactions, including small molecules, nucleic acids, proteins, and microbes. SPR-CPI measures differences in phase shift of reflected S and P polarized light under Total Internal Reflection (TIR) conditions at a surface, caused by changes in refractive index induced by biomolecular interactions within the evanescent field at the TIR interface. The measurement is performed on a microarray of discrete 2-dimensional areas functionalized with biomolecule capture reagents, allowing simultaneous measurement of up to 100 separate analytes. The optical beam encompasses the entire microarray, allowing a solid state detector system with no scanning requirement. Output consists of simultaneous voltage measurements proportional to the phase differences resulting from the refractive index changes from each microarray feature, and is automatically processed and displayed graphically or delivered to a decision making algorithm, enabling a fully automatic detection system capable of rapid detection and quantification of small nucleic acids at extremely sensitive levels. Proof-of-concept experiments on model systems and cell culture samples have demonstrated utility of the system, and efforts are in progress for full development and deployment of the device. The technology has broad applicability as a universal detection platform for BWA detection, medical diagnostics, and drug discovery research, and represents a new class of instrumentation as a rapid, high sensitivity, label-free methodology.

  13. 48 CFR 19.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 19.303 Section 19.303 Federal Acquisition... Classification System (NAICS) codes and size standards. (a) The contracting officer shall determine the...

  14. 75 FR 51465 - Medicare Program; Announcement of Five New Members to the Advisory Panel on Ambulatory Payment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    ... Panel. This expertise encompasses hospital payment systems; hospital medical-care delivery systems; provider billing systems; APC groups, Current Procedural Terminology codes, and alpha-numeric Healthcare Common Procedure Coding System codes; and the use of, and payment for, drugs and medical devices in the...

  15. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be dynamically chosen, block to block, to optimize throughput.

  16. A simple way to higher speed atomic force microscopy by retrofitting with a novel high-speed flexure-guided scanner

    NASA Astrophysics Data System (ADS)

    Ouma Alunda, Bernard; Lee, Yong Joong; Park, Soyeun

    2018-06-01

    A typical line-scan rate for a commercial atomic force microscope (AFM) is about 1 Hz. At such a rate, more than four minutes of scanning time is required to obtain an image of 256 × 256 pixels. Although the control electronics of most commercial AFMs permit faster scan rates, the default piezoelectric X–Y scanners limit the overall speed of the system. This is a direct consequence of manufacturers choosing a large scan range over maximum operating speed for an X–Y scanner. Although some AFM manufacturers offer reduced-scan-area scanners as an option, the speed improvement is not significant because such scanners do not have a large enough reduction in scan range and are mainly targeted at reducing the overall cost of the AFM systems. In this article, we present a simple parallel-kinematic substitute scanner for a commercial atomic force microscope to afford a higher scanning speed with no other hardware or software upgrade to the original system. Although the scan-area reduction is unavoidable, our modified commercial XE-70 AFM from Park Systems has achieved a line-scan rate of over 50 Hz, more than 10 times faster than the original, unmodified system. Our flexure-guided X–Y scanner can be a simple drop-in replacement option for enhancing the speed of various aging atomic force microscopes.

  17. Developing and refining the methods for a 'one-stop shop' for research evidence about health systems.

    PubMed

    Lavis, John N; Wilson, Michael G; Moat, Kaelan A; Hammill, Amanda C; Boyko, Jennifer A; Grimshaw, Jeremy M; Flottorp, Signe

    2015-02-25

    Policymakers, stakeholders and researchers have not been able to find research evidence about health systems using an easily understood taxonomy of topics, know when they have conducted a comprehensive search of the many types of research evidence relevant to them, or rapidly identify decision-relevant information in their search results. To address these gaps, we developed an approach to building a 'one-stop shop' for research evidence about health systems. We developed a taxonomy of health system topics and iteratively refined it by drawing on existing categorization schemes and by using it to categorize progressively larger bundles of research evidence. We identified systematic reviews, systematic review protocols, and review-derived products through searches of Medline, hand searches of several databases indexing systematic reviews, hand searches of journals, and continuous scanning of listservs and websites. We developed an approach to providing 'added value' to existing content (e.g., coding systematic reviews according to the countries in which included studies were conducted) and to expanding the types of evidence eligible for inclusion (e.g., economic evaluations and health system descriptions). Lastly, we developed an approach to continuously updating the online one-stop shop in seven supported languages. The taxonomy is organized by governance, financial, and delivery arrangements and by implementation strategies. The 'one-stop shop', called Health Systems Evidence, contains a comprehensive inventory of evidence briefs, overviews of systematic reviews, systematic reviews, systematic review protocols, registered systematic review titles, economic evaluations and costing studies, health reform descriptions and health system descriptions, and many types of added-value coding. It is continuously updated and new content is regularly translated into Arabic, Chinese, English, French, Portuguese, Russian, and Spanish. Policymakers and stakeholders can now easily access and use a wide variety of types of research evidence about health systems to inform decision-making and advocacy. Researchers and research funding agencies can use Health Systems Evidence to identify gaps in the current stock of research evidence and domains that could benefit from primary research, systematic reviews, and review overviews.

  18. Rocketdyne/Westinghouse nuclear thermal rocket engine modeling

    NASA Technical Reports Server (NTRS)

    Glass, James F.

    1993-01-01

    The topics are presented in viewgraph form and include the following: systems approach needed for nuclear thermal rocket (NTR) design optimization; generic NTR engine power balance codes; rocketdyne nuclear thermal system code; software capabilities; steady state model; NTR engine optimizer code-logic; reactor power calculation logic; sample multi-component configuration; NTR design code output; generic NTR code at Rocketdyne; Rocketdyne NTR model; and nuclear thermal rocket modeling directions.

  19. Properties of a certain stochastic dynamical system, channel polarization, and polar codes

    NASA Astrophysics Data System (ADS)

    Tanaka, Toshiyuki

    2010-06-01

    A new family of codes, called polar codes, has recently been proposed by Arikan. Polar codes are of theoretical importance because they are provably capacity achieving with low-complexity encoding and decoding. We first discuss basic properties of a certain stochastic dynamical system, on the basis of which properties of channel polarization and polar codes are reviewed, with emphasis on our recent results.
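
    The basic polar transform behind channel polarization can be sketched in a few lines; the bit-reversal permutation and the choice of frozen bits are omitted, so this is only the raw transform, not a complete polar encoder or decoder.

    ```python
    import numpy as np

    def polar_transform_matrix(n: int) -> np.ndarray:
        """G_N = F^(kron n) over GF(2), with the 2x2 Arikan kernel F = [[1, 0], [1, 1]].
        (The bit-reversal permutation is omitted in this sketch.)"""
        kernel = np.array([[1, 0], [1, 1]], dtype=np.uint8)
        g = np.array([[1]], dtype=np.uint8)
        for _ in range(n):
            g = np.kron(g, kernel) % 2
        return g

    def polar_encode(u: np.ndarray) -> np.ndarray:
        """Encode a length-2^n bit vector u as x = u @ G_N (mod 2)."""
        n = int(np.log2(len(u)))
        return (u @ polar_transform_matrix(n)) % 2

    print(polar_encode(np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)))
    ```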

  20. Ultra-narrow bandwidth voice coding

    DOEpatents

    Holzrichter, John F [Berkeley, CA; Ng, Lawrence C [Danville, CA

    2007-01-09

    A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.

  1. Spatial transform coding of color images.

    NASA Technical Reports Server (NTRS)

    Pratt, W. K.

    1971-01-01

    The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
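
    The two ingredients of the scheme, a luminance/chrominance representation and a spatial block transform, can be sketched as follows; the BT.601 conversion and the DCT are modern stand-ins chosen for illustration, not necessarily the representation and transform used in the original paper.

    ```python
    import numpy as np
    from scipy.fft import dctn

    def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
        """ITU-R BT.601 luminance/chrominance conversion (full-range, illustrative)."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
        cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
        return np.stack([y, cb, cr], axis=-1)

    def block_transform(plane: np.ndarray, block: int = 8) -> np.ndarray:
        """2D DCT of non-overlapping blocks of one image plane."""
        h, w = plane.shape
        out = np.zeros_like(plane, dtype=float)
        for i in range(0, h - h % block, block):
            for j in range(0, w - w % block, block):
                out[i:i + block, j:j + block] = dctn(plane[i:i + block, j:j + block], norm="ortho")
        return out

    rgb = np.random.rand(16, 16, 3) * 255.0
    ycc = rgb_to_ycbcr(rgb)
    chroma_coeffs = block_transform(ycc[..., 1])   # coefficients that can be coded coarsely
    ```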

  2. Development of online line-scan imaging system for chicken inspection and differentiation

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Chieh; Chan, Diane E.; Chao, Kuanglin; Chen, Yud-Ren; Kim, Moon S.

    2006-10-01

    An online line-scan imaging system was developed for differentiation of wholesome and systemically diseased chickens. The hyperspectral imaging system used in this research can be directly converted to multispectral operation and would provide the ideal implementation of essential features for data-efficient high-speed multispectral classification algorithms. The imaging system consisted of an electron-multiplying charge-coupled-device (EMCCD) camera and an imaging spectrograph for line-scan images. The system scanned the surfaces of chicken carcasses on an eviscerating line at a poultry processing plant in December 2005. A method was created to recognize birds entering and exiting the field of view, and to locate a Region of Interest on the chicken images from which useful spectra were extracted for analysis. From analysis of the difference spectra between wholesome and systemically diseased chickens, four wavelengths of 468 nm, 501 nm, 582 nm and 629 nm were selected as key wavelengths for differentiation. The method of locating the Region of Interest will also have practical application in multispectral operation of the line-scan imaging system for online chicken inspection. This line-scan imaging system makes possible the implementation of multispectral inspection using the key wavelengths determined in this study with minimal software adaptations and without the need for cross-system calibration.

  3. Bar-Code System for a Microbiological Laboratory

    NASA Technical Reports Server (NTRS)

    Law, Jennifer; Kirschner, Larry

    2007-01-01

    A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.

  4. Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)

    DOT National Transportation Integrated Search

    2001-04-01

    This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...

  5. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    PubMed

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

    The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72, P < .01). Child Facial Coding System scores were also significantly higher during the passive joint stretch than the baseline and recovery segments (P < .001). Facial activity was not significantly correlated with the developmental measures. These findings suggest that the Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy. © The Author(s) 2015.

  6. [Clinical effect of three dimensional human body scanning system BurnCalc in the evaluation of burn wound area].

    PubMed

    Lu, J; Wang, L; Zhang, Y C; Tang, H T; Xia, Z F

    2017-10-20

    Objective: To validate the clinical effect of the three dimensional human body scanning system BurnCalc, developed by our research team, in the evaluation of burn wound area. Methods: A total of 48 burn patients treated in the outpatient department of our unit from January to June 2015, conforming to the study criteria, were enrolled. For the first 12 patients, one wound on the limbs or torso was selected from each patient. The stability of the system was tested by having 3 attending physicians individually measure the area of the wounds using three dimensional human body scanning system BurnCalc. For the following 36 patients, one wound was selected from each patient, including 12 wounds on limbs, front torso, and side torso, respectively. The area of wounds was measured by the same attending physician using transparency tracing method, National Institutes of Health (NIH) Image J method, and three dimensional human body scanning system BurnCalc, respectively. The time for getting information of the 36 wounds by the three methods was recorded by stopwatch. The stability among the testers was evaluated by the intra-class correlation coefficient (ICC). Data were processed with randomized blocks analysis of variance and Bonferroni test. Results: (1) Wound area of patients measured by the three physicians using three dimensional human body scanning system BurnCalc was (122±95), (121±95), and (123±96) cm(2), respectively, and there was no statistically significant difference among them (F=1.55, P>0.05). The ICC among the 3 physicians was 0.999. (2) The wound area of limbs of patients measured by transparency tracing method, NIH Image J method, and three dimensional human body scanning system BurnCalc was (84±50), (76±46), and (84±49) cm(2), respectively. There was no statistically significant difference in the wound area of limbs of patients measured by transparency tracing method and three dimensional human body scanning system BurnCalc (P>0.05). The wound area of limbs of patients measured by NIH Image J method was smaller than that measured by transparency tracing method and three dimensional human body scanning system BurnCalc (with P values below 0.05). There was no statistically significant difference in the wound area of front torso of patients measured by transparency tracing method, NIH Image J method, and three dimensional human body scanning system BurnCalc (F=0.33, P>0.05). The wound area of side torso of patients measured by transparency tracing method, NIH Image J method, and three dimensional human body scanning system BurnCalc was (169±88), (150±80), and (169±86) cm(2), respectively. There was no statistically significant difference in the wound area of side torso of patients measured by transparency tracing method and three dimensional human body scanning system BurnCalc (P>0.05). The wound area of side torso of patients measured by NIH Image J method was smaller than that measured by transparency tracing method and three dimensional human body scanning system BurnCalc (with P values below 0.05). (3) The time for getting information of wounds of patients by transparency tracing method, NIH Image J method, and three dimensional human body scanning system BurnCalc was (77±14), (10±3), and (9±3) s, respectively. The time for getting information of wounds of patients by transparency tracing method was longer than that by NIH Image J method and three dimensional human body scanning system BurnCalc (with P values below 0.05). The time for getting information of wounds of patients by three dimensional human body scanning system BurnCalc was close to that by NIH Image J method (P>0.05). Conclusions: The three dimensional human body scanning system BurnCalc is stable and can accurately evaluate the wound area on limbs and torso of burn patients.

  7. From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations

    NASA Astrophysics Data System (ADS)

    Chun, K.; Kemeny, J.

    2017-12-01

    Light detection and ranging (LIDAR) has become a prevalent remote-sensing technology in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for the generation of a three-dimensional numerical model that can be used in FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into finite element models. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed LIDAR-based workflow is effective and provides a reference for similar engineering projects in practice.

  8. Real-Time Noise Removal for Line-Scanning Hyperspectral Devices Using a Minimum Noise Fraction-Based Approach

    PubMed Central

    Bjorgan, Asgeir; Randeberg, Lise Lyngsnes

    2015-01-01

    Processing line-by-line and in real-time can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally-updated statistics enables the algorithm to denoise the image line-by-line. The denoising performance has been compared to conventional MNF and found to be equal. With a satisfying denoising performance and real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real-time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
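
    A schematic of the incrementally-updated statistics idea is sketched below: running signal and noise covariance estimates are maintained line by line (noise taken from differences between consecutive lines), so the MNF transform can be refreshed without the full image. This is a simplified illustration, not the published implementation; see the linked source code for the latter.

    ```python
    import numpy as np

    class RunningLineStats:
        """Running estimates of the image and noise covariance matrices,
        updated one scan line at a time (noise approximated by differences
        between consecutive lines, a common simple noise estimator)."""

        def __init__(self, n_bands: int):
            self.n = 0
            self.mean = np.zeros(n_bands)
            self.cov = np.zeros((n_bands, n_bands))
            self.noise_cov = np.zeros((n_bands, n_bands))
            self._prev_line = None

        def update(self, line: np.ndarray) -> None:
            """`line` has shape (n_pixels, n_bands)."""
            for x in line:
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.cov += np.outer(delta, x - self.mean)   # Welford-style update
            if self._prev_line is not None:
                diff = (line - self._prev_line) / np.sqrt(2.0)
                self.noise_cov += diff.T @ diff
            self._prev_line = line

        def covariances(self):
            # Approximate normalization, adequate for this sketch.
            return self.cov / max(self.n - 1, 1), self.noise_cov / max(self.n - 1, 1)

    stats = RunningLineStats(n_bands=5)
    for _ in range(10):                               # simulate ten incoming lines
        stats.update(np.random.randn(64, 5))
    signal_cov, noise_cov = stats.covariances()
    ```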

  9. Suitability of holographic beam scanning in high resolution applications

    NASA Astrophysics Data System (ADS)

    Kalita, Ranjan; Goutam Buddha, S. S.; Boruah, Bosanta R.

    2018-02-01

    The high-resolution applications of a laser scanning imaging system demand very accurate positioning of the illumination beam. Galvanometer-scanner-based beam scanning imaging systems, on the other hand, suffer from both short-term and long-term beam instability issues. Fortunately, computer-generated holography based beam scanning offers extremely accurate beam steering, which can be very useful for imaging in high-resolution applications such as confocal microscopy. Holographic beam scanning can be achieved by writing a sequence of holograms onto a spatial light modulator and utilizing one of the diffracted orders as the illumination beam. This paper highlights the relative advantages of such a holographic beam scanning based confocal system and presents some preliminary experimental results.
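
    A minimal sketch of the kind of hologram involved, a blazed (linear) phase grating that steers the first diffracted order, is given below; the SLM resolution, pixel pitch, wavelength and angles are assumed values.

    ```python
    import numpy as np

    def blazed_grating_phase(shape, pixel_pitch, wavelength, theta_x, theta_y):
        """Phase pattern (radians, wrapped to [0, 2*pi)) for an SLM that steers
        the first diffraction order by angles (theta_x, theta_y)."""
        ny, nx = shape
        x = np.arange(nx) * pixel_pitch
        y = np.arange(ny) * pixel_pitch
        xx, yy = np.meshgrid(x, y)
        k = 2.0 * np.pi / wavelength
        phase = k * (np.sin(theta_x) * xx + np.sin(theta_y) * yy)
        return np.mod(phase, 2.0 * np.pi)

    # Assumed SLM parameters: 512x512 pixels, 8 um pitch, 633 nm illumination.
    hologram = blazed_grating_phase((512, 512), 8e-6, 633e-9, np.radians(0.5), np.radians(0.2))
    ```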

  10. LDPC coded OFDM over the atmospheric turbulence channel.

    PubMed

    Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A

    2007-05-14

    Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10(-5), the coding gain improvement of the LDPC coded single-side band unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).

  11. Human abdomen recognition using camera and force sensor in medical robot system for automatic ultrasound scan.

    PubMed

    Bin Mustafa, Ammar Safwan; Ishii, Takashi; Matsunaga, Yoshiki; Nakadate, Ryu; Ishii, Hiroyuki; Ogawa, Kouji; Saito, Akiko; Sugawara, Motoaki; Niki, Kiyomi; Takanishi, Atsuo

    2013-01-01

    Physicians use ultrasound scans to obtain real-time images of internal organs, because such scans are safe and inexpensive. However, people in remote areas can find it difficult to get scanned because of the aging society and a shortage of physicians. It is therefore important to develop an autonomous robotic system that can perform remote ultrasound scans. Previously, we developed a robotic system for automatic ultrasound scanning of the human liver. To make it a completely autonomous system, this paper presents a way to autonomously localize the epigastric region as the starting position for the automatic ultrasound scan. An image processing algorithm marks the umbilicus and mammary papillae on a digital photograph of the patient's abdomen, and the location of the epigastric region is then estimated from the distances between these landmarks. A supporting algorithm distinguishes the rib positions from the epigastrium using the relationship between force and displacement. We implemented these algorithms, together with the automatic scanning system, on a Mitsubishi Electric MELFA RV-1 six-axis manipulator. Tests on 14 healthy male subjects showed that the apparatus located the epigastric region with a success rate of 94%. The results suggest that image recognition is effective for localizing this part of the human body.
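
    The geometric part of the localization can be illustrated with a toy sketch: given the detected pixel coordinates of the umbilicus and the two mammary papillae, an epigastric starting point is estimated along the midline between them. The 0.6 fraction and the sample coordinates below are illustrative assumptions, not the authors' calibrated values.

      # Hypothetical sketch of the landmark-based estimate; the landmark detector
      # itself (umbilicus and papillae segmentation) is not shown.
      import numpy as np

      def estimate_epigastrium(umbilicus, papilla_left, papilla_right, fraction=0.6):
          """All inputs are (x, y) pixel coordinates from the abdominal photograph."""
          chest_mid = (np.asarray(papilla_left) + np.asarray(papilla_right)) / 2.0
          umb = np.asarray(umbilicus, dtype=float)
          return umb + fraction * (chest_mid - umb)        # point on the midline

      print(estimate_epigastrium((320, 400), (250, 150), (390, 150)))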

  12. A virtual tour of geological heritage: Valourising geodiversity using Google Earth and QR code

    NASA Astrophysics Data System (ADS)

    Martínez-Graña, A. M.; Goy, J. L.; Cimarra, C. A.

    2013-12-01

    When making land-use plans, it is necessary to inventory and catalogue the geological heritage and geodiversity of a site to establish an apolitical conservation protection plan to meet the educational and social needs of society. New technologies make it possible to create virtual databases using virtual globes - e.g., Google Earth - and other personal-use geomatics applications (smartphones, tablets, PDAs) for accessing geological heritage information in “real time” for scientific, educational, and cultural purposes via a virtual geological itinerary. Seventeen mapped and georeferenced geosites have been created in Keyhole Markup Language (KML) for use as map layers at the geological itinerary stops in different applications. A virtual tour has been developed for Las Quilamas Natural Park, which is located in the Spanish Central System, using geological layers and topographic and digital terrain models that can be overlaid in a 3D model. The Google Earth application was used to import the geosite placemarks. For each geosite, a tab has been developed that shows a description of the geology with photographs and diagrams and that evaluates the scientific, educational, and tourism quality. Augmented reality allows the user to access these georeferenced thematic layers and overlay data, images, and graphics in real time on their mobile devices. These virtual tours can be incorporated into subject guides designed by the public. Seven educational and interpretive panels describing some of the geosites were designed and tagged with a QR code that can be printed at each stop or in the printed itinerary. These QR codes can be scanned with the camera found on most mobile devices, and video virtual tours can be viewed on these devices. The virtual tour of the geological heritage can be used to show tourists the geological history of the Las Quilamas Natural Park using new geomatics technologies (virtual globes, augmented reality, and QR codes).
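
    As a small illustration of the data formats involved, a geosite can be written as a KML placemark and its virtual-tour URL encoded as a QR code. The names, coordinates, URL, and the use of the Python qrcode package are assumptions for this sketch, not details taken from the study.

      # Illustrative only: one geosite as a KML placemark plus a QR code image
      # linking to its (made-up) virtual-tour page.
      import qrcode

      KML = """<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <Placemark>
          <name>{name}</name>
          <description>{desc}</description>
          <Point><coordinates>{lon},{lat},0</coordinates></Point>
        </Placemark>
      </kml>"""

      with open("geosite.kml", "w") as f:
          f.write(KML.format(name="Geosite 01", desc="Example outcrop",
                             lon=-6.05, lat=40.55))

      qrcode.make("https://example.org/tour/geosite-01").save("geosite01_qr.png")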

  13. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
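
    As a reminder of the fundamentals the report covers, a textbook rate-1/2 convolutional encoder (constraint length 3, octal generators 7 and 5) can be written in a few lines; this generic example is not code from the report itself.

      # Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal.
      import numpy as np

      def conv_encode(bits, g1=0b111, g2=0b101):
          state = 0
          out = []
          for b in bits:
              state = ((state << 1) | b) & 0b111           # 3-bit shift register
              out.append(bin(state & g1).count("1") % 2)   # parity for generator 1
              out.append(bin(state & g2).count("1") % 2)   # parity for generator 2
          return np.array(out)

      print(conv_encode([1, 0, 1, 1]))                     # two coded bits per input bit

    Each input bit produces two coded bits; this redundancy is what a Viterbi or sequential decoder later exploits for error correction.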

  14. An all-digital receiver for satellite audio broadcasting signals using trellis coded quasi-orthogonal code-division multiplexing

    NASA Astrophysics Data System (ADS)

    Braun, Walter; Eglin, Peter; Abello, Ricard

    1993-02-01

    Spread-spectrum code-division multiplexing is an attractive scheme for transmitting multiple signals over a satellite transponder. By using orthogonal or quasi-orthogonal spreading codes, the interference between users can be virtually eliminated. However, the acquisition and tracking of the spreading-code phase cannot take advantage of code orthogonality, since sequential acquisition and delay-locked-loop tracking depend on correlation with code phases other than the optimal despreading phase. Hence, synchronization is a critical issue in such a system. Demonstration hardware for verifying the orthogonal CDM synchronization and data transmission concept is being designed and implemented. The system concept, the synchronization scheme, and the implementation are described, and the performance of the system is discussed on the basis of computer simulations.
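
    The benefit of orthogonal spreading referred to above can be seen in a toy example: when code phases are perfectly aligned, correlating the composite signal with each Walsh-Hadamard code recovers every user's symbol without crosstalk. The eight-user BPSK setup below is illustrative and ignores the trellis coding and the synchronization problem that the paper actually addresses.

      # Orthogonal spreading/despreading with Walsh-Hadamard codes.
      import numpy as np
      from scipy.linalg import hadamard

      codes = hadamard(8)                                  # 8 orthogonal spreading codes
      symbols = np.array([+1, -1, +1, +1, -1, +1, -1, -1]) # one BPSK symbol per user
      composite = symbols @ codes                          # superposed chip stream

      recovered = composite @ codes.T / codes.shape[1]     # correlate with each code
      print(recovered)                                     # equals 'symbols': no crosstalk

    A misaligned code phase breaks this orthogonality, which is exactly why acquisition and tracking are the critical issues the paper discusses.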

  15. System of radiographic control or an imaging system for personal radiographic inspection

    NASA Astrophysics Data System (ADS)

    Babichev, E. A.; Baru, S. E.; Neustroev, V. A.; Leonov, V. V.; Porosev, V. V.; Savinov, G. A.; Ukraintsev, Yu. G.

    2004-06-01

    A security system for personal radiographic inspection, intended to detect explosive materials and plastic weapons, was recently developed at BINP. Basic system parameters are: maximum scanning height 2000 mm, image width 800 mm, number of detector channels 768, channel size 1.05×1 mm, charge collection time per line 2.5 ms, scanning speed 40 cm/s, maximum scanning time 5 s, and radiation dose per inspection <5 μSv. The detector is a multichannel xenon ionization chamber. The image of the inspected person appears on the display immediately after scanning. A pilot system was put into operation in March 2003.

  16. Bioreactor Cultivation of Anatomically Shaped Human Bone Grafts

    PubMed Central

    Temple, Joshua P.; Yeager, Keith; Bhumiratana, Sarindr; Vunjak-Novakovic, Gordana; Grayson, Warren L.

    2015-01-01

    In this chapter, we describe a method for engineering bone grafts in vitro with the specific geometry of the temporomandibular joint (TMJ) condyle. The anatomical geometry of the bone grafts was segmented from computed tomography (CT) scans, converted to G-code, and used to machine decellularized trabecular bone scaffolds into the identical shape of the condyle. These scaffolds were seeded with human bone marrow-derived mesenchymal stem cells (MSCs) using spinner flasks and cultivated for up to 5 weeks in vitro using a custom-designed perfusion bioreactor system. The flow patterns through the complex geometry were modeled using the FloWorks module of SolidWorks to optimize bioreactor design. The perfused scaffolds exhibited significantly higher cellular content, better matrix production, and increased bone mineral deposition relative to non-perfused (static) controls after 5 weeks of in vitro cultivation. This technology is broadly applicable for creating patient-specific bone grafts of varying shapes and sizes. PMID:24014312
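
    The CT-to-machining step can be caricatured with a very small sketch that converts one closed contour from a segmented slice into linear G-code moves; the fixed cutting depth and feed rate are placeholders, and the authors' actual toolpath generation is considerably more involved.

      # Highly simplified sketch: one 2D contour (mm) to linear G-code moves.
      def contour_to_gcode(contour, z=-2.0, feed=300):
          """contour: list of (x, y) points tracing the condyle outline."""
          lines = ["G21 ; millimetres", "G90 ; absolute coordinates",
                   "G0 Z5.0", f"G0 X{contour[0][0]:.3f} Y{contour[0][1]:.3f}",
                   f"G1 Z{z:.3f} F{feed}"]
          for x, y in contour[1:]:
              lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed}")
          lines.append("G0 Z5.0 ; retract")
          return "\n".join(lines)

      print(contour_to_gcode([(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]))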

  17. Thermal Transfer Compared To The Fourteen Other Imaging Technologies

    NASA Astrophysics Data System (ADS)

    O'Leary, John W.

    1989-07-01

    A quiet revolution in the world of imaging has been underway for the past few years. The older technologies of dot matrix, daisy wheel, thermal paper and pen plotters have been increasingly displaced by laser, ink jet and thermal transfer. The net result of this revolution is improved technologies that afford superior imaging, quiet operation, plain paper usage, instant operation, and solid state components. Thermal transfer is one of the processes that incorporates these benefits. Among the imaging applications for thermal transfer are: 1. Bar code labeling and scanning. 2. New systems for airline ticketing, boarding passes, reservations, etc. 3. Color computer graphics and imaging. 4. Copying machines that copy in color. 5. Fast growing communications media such as facsimile. 6. Low cost word processors and computer printers. 7. New devices that print pictures from video cameras or television sets. 8. Cameras utilizing computer chips in place of film.

  18. 'Wakey wakey baby': narrating four-dimensional (4D) bonding scans.

    PubMed

    Roberts, Julie

    2012-02-01

    Commercial companies market 4D ultrasound scans to expectant parents for the stated purpose of reassurance, to promote bonding, and to get 'baby's first picture'. This article describes in detail the process of commercial 4D scanning in the UK, paying particular attention to the discursive exchanges in the scan room. It is argued that sonographers and clients engage in a process of 'collaborative coding' that, despite the realism of 4D, is essential to making the imagery on the screen personally and socially meaningful. While sonographers first help clients to get their bearings, expectant parents and others often engage in a complex process of narrating the images on the screen as they are created. The capacities of 4D ultrasound to image facial features and movements inform stories about fetal experience and family resemblances as well as enabling playfully imagined interactions with the fetus. While these stories are primarily based in experiences of the visual, there is also evidence that pregnant women seek to map the image onto their bodies and to reintroduce some elements of their embodied experiences into the narratives.

  19. Virtual Planning of a Complex Three-Part Bimaxillary Osteotomy

    PubMed Central

    Anghinoni, Marilena Laura

    2017-01-01

    In maxillofacial surgery, every patient presents special problems requiring careful evaluation. Conventional methods for studying the deformities are still reliable, but the advent of three-dimensional (3D) imaging, especially computed tomography (CT) scans and laser scanning of casts, has created the opportunity to better understand the skeletal support and the soft tissue structures. Nowadays, virtual technologies are increasingly employed in maxillofacial surgery and have demonstrated precision and reliability. In complex surgical procedures, however, these new technologies are still controversial. Especially in the less frequent cases of three-part maxillary surgery, experience is limited and the scientific literature does not give clear support. This paper presents the case of a young patient affected by a complex long-face dentofacial deformity treated by bimaxillary surgery with three-part segmentation of the maxilla. The surgical planning was performed entirely with a virtual workflow. Pre- and postoperative CT scans and optical scans of plaster models were collected and compared. Each postoperative maxillary segment was superimposed on the corresponding presurgical one, and the differences were examined on a color-coded map. Only mild differences were found near the osteotomy lines, while the bony surfaces and the teeth showed excellent agreement. PMID:29318057
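
    The comparison step described above can be approximated with a short sketch: after superimposition, nearest-neighbour distances from each postoperative vertex to the preoperative vertices provide the values behind a colour-coded deviation map. The random arrays stand in for real mesh data, and the vertex-to-vertex metric is a simplifying assumption rather than the authors' software.

      # Sketch of a surface-deviation measurement between superimposed scans.
      import numpy as np
      from scipy.spatial import cKDTree

      def deviation_map(post_vertices, pre_vertices):
          """Both inputs are (N, 3) vertex arrays from the superimposed scans."""
          distances, _ = cKDTree(pre_vertices).query(post_vertices)
          return distances                                  # one deviation per vertex

      post = np.random.rand(1000, 3)                        # placeholders for real meshes
      pre = post + np.random.normal(scale=0.01, size=post.shape)
      print(deviation_map(post, pre).max())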

  20. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss the hardware systems on which the codes run and present code performance comparisons for various machines.
