Sample records for improving design codes

  1. National Combustion Code Parallel Performance Enhancements

    NASA Technical Reports Server (NTRS)

    Quealy, Angela; Benyo, Theresa (Technical Monitor)

    2002-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two-hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.

  2. Optimizing a liquid propellant rocket engine with an automated combustor design code (AUTOCOM)

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Reichel, R. H.; Jones, R. T.; Glatt, C. R.

    1972-01-01

    A procedure for automatically designing a liquid propellant rocket engine combustion chamber in an optimal fashion is outlined. The procedure is contained in a digital computer code, AUTOCOM. The code is applied to an existing engine, and design modifications are generated which provide a substantial potential payload improvement over the existing design. Computer time requirements for this payload improvement were small, approximately four minutes on the CDC 6600 computer.

  3. Speech coding at low to medium bit rates

    NASA Astrophysics Data System (ADS)

    Leblanc, Wilfred Paul

    1992-09-01

    Improved search techniques coupled with improved codebook design methodologies are proposed to improve the performance of conventional code-excited linear predictive coders for speech. Improved methods for quantizing the short term filter are developed by applying a tree search algorithm and joint codebook design to multistage vector quantization. Joint codebook design procedures are developed to design locally optimal multistage codebooks. Weighting during centroid computation is introduced to improve the outlier performance of the multistage vector quantizer. Multistage vector quantization is shown to be robust both to varying input characteristics and to channel errors. Spectral distortions of about 1 dB are obtained at rates of 22-28 bits/frame. Structured codebook design procedures for excitation in code-excited linear predictive coders are compared to general codebook design procedures. Little is lost by imposing significant structure on the excitation codebooks, while the search complexity is greatly reduced. Sparse multistage configurations are proposed for reducing computational complexity and memory size. Improved search procedures are applied to code-excited linear prediction that attempt joint optimization of the short term filter, the adaptive codebook, and the excitation. Improvements in signal to noise ratio of 1-2 dB are realized in practice.
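
    A minimal sketch of the sequential (greedy) multistage vector quantization idea discussed above, in NumPy; the random codebooks are illustrative stand-ins, not the trained, jointly designed codebooks of the thesis:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 10                                          # parameter vector dimension
    stage1 = rng.normal(size=(64, DIM))               # stage-1 codebook (64 entries)
    stage2 = rng.normal(scale=0.3, size=(64, DIM))    # stage-2 residual codebook

    def msvq_encode(x):
        """Greedy multistage VQ: quantize, then quantize the residual."""
        i1 = int(np.argmin(((stage1 - x) ** 2).sum(axis=1)))
        r = x - stage1[i1]
        i2 = int(np.argmin(((stage2 - r) ** 2).sum(axis=1)))
        return i1, i2

    def msvq_decode(i1, i2):
        return stage1[i1] + stage2[i2]

    x = rng.normal(size=DIM)
    i1, i2 = msvq_encode(x)
    err = np.sum((x - msvq_decode(i1, i2)) ** 2)
    print(f"indices=({i1},{i2}), squared error={err:.3f}")  # 12 bits total
    ```

    A tree (M-best) search, as proposed above, would carry several stage-1 candidates forward instead of committing to the single greedy choice.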

  4. Evaluation of three coding schemes designed for improved data communication

    NASA Technical Reports Server (NTRS)

    Snelsire, R. W.

    1974-01-01

    Three coding schemes designed for improved data communication are evaluated. Four block codes are evaluated relative to a quality function that depends on both the amount of data rejected and the error rate. The Viterbi maximum-likelihood decoding algorithm is reviewed as a decoding procedure. This evaluation is obtained by simulating the system on a digital computer. Short constraint length, rate 1/2 quick-look codes are studied, and their performance is compared to general nonsystematic codes.

  5. Incorporation of Dynamic SSI Effects in the Design Response Spectra

    NASA Astrophysics Data System (ADS)

    Manjula, N. K.; Pillai, T. M. Madhavan; Nagarajan, Praveen; Reshma, K. K.

    2018-05-01

    Many studies in the past on dynamic soil-structure interaction have revealed both detrimental and advantageous effects of soil flexibility. Based on such studies, the design response spectra of international seismic codes are being improved worldwide. The improvements required for the short-period range of the design response spectra in the Indian seismic code (IS 1893:2002) are presented in this paper. As the recent code revision has not incorporated the short-period amplifications, the proposals given in this paper apply equally to the latest code (IS 1893:2016). Analyses of single degree of freedom systems are performed to predict the required improvements. The proposed modifications to the constant acceleration portion of the spectra are evaluated against the current design spectra in Eurocode 8.
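
    For readers unfamiliar with how such spectra are computed, the sketch below evaluates an elastic pseudo-acceleration response spectrum for a single degree of freedom oscillator by direct time stepping (central-difference method); the input accelerogram is synthetic noise used purely as a placeholder for a real ground-motion record:

    ```python
    import numpy as np

    def response_spectrum(ag, dt, periods, zeta=0.05):
        """Pseudo-acceleration spectrum of a unit-mass SDOF oscillator,
        integrated with the central-difference method (stable for dt < T/pi)."""
        Sa = []
        for T in periods:
            wn = 2.0 * np.pi / T
            k, c = wn * wn, 2.0 * zeta * wn
            u_prev = u = 0.0
            umax = 0.0
            for a_g in ag:
                u_next = (-a_g - (k - 2.0 / dt**2) * u
                          - (1.0 / dt**2 - c / (2.0 * dt)) * u_prev) \
                         / (1.0 / dt**2 + c / (2.0 * dt))
                u_prev, u = u, u_next
                umax = max(umax, abs(u))
            Sa.append(wn * wn * umax)   # Sa = wn^2 * max|u|
        return np.array(Sa)

    # Placeholder accelerogram: white noise, NOT a real record (units arbitrary).
    rng = np.random.default_rng(1)
    dt, ag = 0.005, rng.normal(scale=0.5, size=4000)
    periods = np.linspace(0.05, 2.0, 40)
    print(response_spectrum(ag, dt, periods).round(2))
    ```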

  6. Side information in coded aperture compressive spectral imaging

    NASA Astrophysics Data System (ADS)

    Galvis, Laura; Arguello, Henry; Lau, Daniel; Arce, Gonzalo R.

    2017-02-01

    Coded aperture compressive spectral imagers sense a three-dimensional cube by using two-dimensional projections of the coded and spectrally dispersed source. These imaging systems often rely on focal plane array (FPA) detectors, spatial light modulators (SLMs), digital micromirror devices (DMDs), and dispersive elements. The use of DMDs to implement the coded apertures facilitates the capture of multiple projections, each admitting a different coded aperture pattern. The DMD makes it possible not only to collect a sufficient number of measurements for spectrally rich or spatially detailed scenes, but also to design the spatial structure of the coded apertures to maximize the information content of the compressive measurements. Although sparsity is the only signal characteristic usually assumed for reconstruction in compressive sensing, other forms of prior information, such as side information, have been included as a way to improve the quality of the reconstructions. This paper presents the coded aperture design in a compressive spectral imager with side information in the form of RGB images of the scene. The use of RGB images as side information in the compressive sensing architecture has two main advantages: the RGB is used not only to improve the reconstruction quality but also to optimally design the coded apertures for the sensing process. The coded aperture design is based on the RGB scene, so the coded aperture structure exploits key features such as scene edges. Real reconstructions of noisy compressed measurements demonstrate the benefit of the designed coded apertures in addition to the improvement in reconstruction quality obtained by the use of side information.
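
    A simplified single-disperser forward model conveys the sensing geometry: the coded aperture multiplies each spectral band, and the disperser shears the bands by one pixel per channel before detector integration. All sizes and patterns below are illustrative; the RGB-driven aperture design of the paper is not reproduced:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    H, W, L = 8, 8, 4                     # spatial size and number of bands
    cube = rng.random((H, W, L))          # spectral scene (the 3-D data cube)
    code = rng.integers(0, 2, (H, W))     # binary coded aperture (DMD pattern)

    # Code the cube, then shear each band by one pixel per spectral channel
    # and integrate on the 2-D detector.
    det = np.zeros((H, W + L - 1))
    for band in range(L):
        det[:, band:band + W] += code * cube[:, :, band]
    print(det.shape)   # one 2-D projection of the 3-D cube
    ```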

  7. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The updated code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and a validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  8. Recent improvements of reactor physics codes in MHI

    NASA Astrophysics Data System (ADS)

    Kosaka, Shinya; Yamaji, Kazuya; Kirimura, Kazuki; Kamiyama, Yohei; Matsumoto, Hideki

    2015-12-01

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross-section representation model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident states, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable, with good accuracy, to a very wide range of core conditions as long as the fuel is not damaged. In this paper, the GCS code system is outlined briefly and recent relevant development activities are presented.

  9. Recent improvements of reactor physics codes in MHI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosaka, Shinya, E-mail: shinya-kosaka@mhi.co.jp; Yamaji, Kazuya; Kirimura, Kazuki

    2015-12-31

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it became necessary to consider design extension conditions that had not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made several improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross-section representation model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident states, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system is applicable, with good accuracy, to a very wide range of core conditions as long as the fuel is not damaged. In this paper, the GCS code system is outlined briefly and recent relevant development activities are presented.

  10. Protograph LDPC Codes Over Burst Erasure Channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Jones, Christopher

    2006-01-01

    In this paper we design high-rate protograph-based LDPC codes suitable for binary erasure channels. To simplify the encoder and decoder implementation for high data rate transmission, the structure of the codes is based on protographs and circulants. These LDPC codes can improve data link and network layer protocols in support of communication networks. Two classes of codes were designed. One class is designed for large block sizes, with an iterative decoding threshold that approaches the capacity of the binary erasure channel. The other class is designed for short block sizes by maximizing the minimum stopping set size. For high code rates and short blocks, the second class outperforms the first.
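
    On the binary erasure channel, LDPC decoding reduces to iteratively solving parity checks that contain exactly one erased bit. The sketch below implements that peeling decoder on a toy parity-check matrix (not one of the protograph designs of the paper); decoding stalls exactly when the erasures cover a stopping set, which is why the short-block design maximizes the minimum stopping set size:

    ```python
    import numpy as np

    # Toy parity-check matrix H (rows = checks, cols = code bits), over GF(2).
    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1, 1]], dtype=int)

    def peel(H, bits):
        """bits: list with 0/1 for known values, None for erasures."""
        bits = list(bits)
        progress = True
        while progress and None in bits:
            progress = False
            for row in H:
                unknown = [j for j in np.flatnonzero(row) if bits[j] is None]
                if len(unknown) == 1:                 # check with one erasure
                    parity = sum(bits[j] for j in np.flatnonzero(row)
                                 if bits[j] is not None) % 2
                    bits[unknown[0]] = parity         # overall parity must be even
                    progress = True
        return bits

    # All-zero codeword with two erasures: recoverable by peeling.
    print(peel(H, [0, None, 0, 0, None, 0]))   # [0, 0, 0, 0, 0, 0]
    ```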

  11. 76 FR 70807 - Notice of Passenger Facility Charge (PFC) Approvals and Disapprovals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... (design and construction). Enplane road structural improvements (design and construction). Landside signage improvements (design and construction). Taxiway B-2 extension and taxiway B-1 rehabilitation (design and construction). Elevator and escalator safety code compliance improvements (design and...

  12. Current and anticipated use of thermal-hydraulic codes for BWR transient and accident analyses in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arai, Kenji; Ebata, Shigeo

    1997-07-01

    This paper summarizes the current and anticipated use of thermal-hydraulic and neutronic codes for BWR transient and accident analyses in Japan. The codes may be categorized into licensing codes and best estimate codes. Most of the licensing codes were originally developed by General Electric. Some codes have been updated based on technical knowledge obtained from thermal-hydraulic studies in Japan and in response to BWR design changes. The best estimate codes have been used to support the licensing calculations and to obtain a phenomenological understanding of the thermal-hydraulic phenomena during a BWR transient or accident. The best estimate codes can also be applied to design studies for a next generation BWR, to which the current licensing models may not be directly applicable. In order to rationalize the margin included in the current BWR design and develop a next generation reactor with appropriate design margin, the accuracy of the thermal-hydraulic and neutronic models will need to be improved. In addition, regarding the current best estimate codes, improvements in the user interface and the numerics will be needed.

  13. The use of Tcl and Tk to improve design and code reutilization

    NASA Technical Reports Server (NTRS)

    Rodriguez, Lisbet; Reinholtz, Kirk

    1995-01-01

    Tcl and Tk facilitate design and code reuse in the ZIPSIM series of high-performance, high-fidelity spacecraft simulators. Tcl and Tk provide a framework for the construction of the Graphical User Interfaces for the simulators. The interfaces are architected such that a large proportion of the design and code is used for several applications, which has reduced design time and life-cycle costs.

  14. National Combustion Code: Parallel Implementation and Performance

    NASA Technical Reports Server (NTRS)

    Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.

    2000-01-01

    The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.

  15. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or by code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an autonomous code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-autonomous code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  16. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of releases, beginning with 2005 HZETRN, will be issued, correcting prior limitations, improving control of propagated errors, and establishing code verification processes. Code validation will use new and improved low Earth orbit (LEO) environmental models together with a recently improved International Space Station (ISS) shield model to validate computational models and procedures against measured data aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  17. Users manual and modeling improvements for axial turbine design and performance computer code TD2-2

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.

  18. Improved design of special boundary elements for T-shaped reinforced concrete walls

    NASA Astrophysics Data System (ADS)

    Ji, Xiaodong; Liu, Dan; Qian, Jiaru

    2017-01-01

    This study examines the design provisions of the Chinese GB 50011-2010 code for seismic design of buildings for the special boundary elements of T-shaped reinforced concrete walls and proposes an improved design method. Comparison of the design provisions of the GB 50011-2010 code with those of the American code ACI 318-14 indicates a possible deficiency in the T-shaped wall design provisions in GB 50011-2010. A case study of a typical T-shaped wall designed in accordance with GB 50011-2010 also indicates an insufficient extent of the boundary element at the non-flange end and an overly conservative design of the flange-end boundary element. Improved designs for special boundary elements of T-shaped walls are developed using a displacement-based method. The proposed design formulas produce a longer boundary element at the non-flange end and a shorter boundary element at the flange end, relative to the GB 50011-2010 provisions. Extensive numerical analysis indicates that T-shaped walls designed using the proposed formulas develop an inelastic drift of 0.01 whether the flange is in compression or in tension.

  19. Capacity achieving nonbinary LDPC coded non-uniform shaping modulation for adaptive optical communications.

    PubMed

    Lin, Changyu; Zou, Ding; Liu, Tao; Djordjevic, Ivan B

    2016-08-08

    A mutual-information-inspired nonbinary coded modulation design with non-uniform shaping is proposed. Instead of the traditional power-of-two signal constellation sizes, we design 5-QAM, 7-QAM and 9-QAM constellations, which can be used in adaptive optical networks. The non-uniform shaping and the LDPC code rate are jointly considered in the design, which yields a better-performing scheme at the same SNR values. A matched nonbinary (NB) LDPC code is used for this scheme, which further improves the coding gain and the overall performance. We analyze both coding performance and system SNR performance. We show that the proposed NB LDPC-coded 9-QAM has more than 2 dB gain in symbol SNR compared to traditional LDPC-coded star-8-QAM. On the other hand, the proposed NB LDPC-coded 5-QAM and 7-QAM perform even better than LDPC-coded QPSK.
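
    The shaping idea can be illustrated by comparing a uniform 9-point constellation with a Maxwell-Boltzmann-shaped one; the shaping parameter below is an arbitrary illustrative choice, not the jointly optimized distribution of the paper:

    ```python
    import numpy as np

    # 9-QAM as a 3x3 grid; Maxwell-Boltzmann shaping favors low-energy points.
    pts = np.array([complex(i, q) for i in (-1, 0, 1) for q in (-1, 0, 1)])
    lam = 0.8                                   # illustrative shaping parameter
    p = np.exp(-lam * np.abs(pts) ** 2)
    p /= p.sum()

    entropy = -np.sum(p * np.log2(p))           # bits/symbol actually carried
    avg_energy = np.sum(p * np.abs(pts) ** 2)   # reduced vs uniform signaling
    print(f"shaped:  H = {entropy:.3f} bits, E = {avg_energy:.3f}")
    print(f"uniform: H = {np.log2(9):.3f} bits, E = {np.mean(np.abs(pts)**2):.3f}")
    ```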

  20. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted at the Langley Research Center through 1995, which resulted in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for the engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  1. Interactive-graphic flowpath plotting for turbine engines

    NASA Technical Reports Server (NTRS)

    Corban, R. R.

    1981-01-01

    An engine cycle program capable of simulating the design and off-design performance of arbitrary turbine engines, and a computer code which, when used in conjunction with the cycle code, can predict the weight of the engines, are described. A graphics subroutine was added to the code to let the engineer visualize the designed engine with more clarity, producing an overall view of the engine for output on a graphics device using IBM-370 graphics subroutines. In addition, with the engine drawn on a graphics screen, the program allows the user to interactively change the code inputs so that the engine can be redrawn and reweighed. These improvements allow better use of the code in conjunction with the engine program.

  2. Structural Analysis and Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft, called ST-SIZE, in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for the Langley Hypersonic Vehicles Office. Incorporating the NASA computer code into HyperSizer has enabled the company to apply the software to applications beyond aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.

  3. [The design and experiment of complementary S coding matrix based on digital micromirror spectrometer].

    PubMed

    Zhang, Zhi-Hai; Gao, Ling-Xiao; Guo, Yuan-Jun; Wang, Wei; Mo, Xiang-Xia

    2012-12-01

    Template selection is essential in the application of digital micromirror spectrometers. The theoretically optimal H-matrix coding is not widely used because it is acyclic, complex to encode, and difficult to realize. The signal-to-noise improvement of the best practical S-matrix is slightly inferior to that of the H-matrix. We therefore designed a new type of complementary S-matrix. Analysis of its noise-improvement theory proves that the algorithm combines the advantages of the H-matrix and the S-matrix. Experiments showed that the SNR is increased by a factor of 2.05 relative to the S-matrix template.
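
    For context, the sketch below builds a conventional S-matrix from a Sylvester Hadamard matrix and demonstrates multiplexed measurement and exact decoding; the complementary S-matrix proposed in the paper is not reproduced here:

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    def s_matrix(n_plus_1):
        """S-matrix of order n = n_plus_1 - 1: drop the first row and column
        of a Sylvester Hadamard matrix, then map +1 -> 0 and -1 -> 1."""
        core = hadamard(n_plus_1)[1:, 1:]
        return ((1 - core) // 2).astype(int)

    S = s_matrix(8)                      # 7x7; each row has (n+1)/2 = 4 open slits
    x = np.arange(1, 8, dtype=float)     # "spectrum" to be measured
    y = S @ x                            # multiplexed detector readings
    x_hat = np.linalg.inv(S) @ y         # decoding (S-matrices are invertible)
    print(np.allclose(x_hat, x))         # True
    ```

    For detector-noise-limited measurements, the classical result is that S-matrix multiplexing improves SNR by roughly (n+1)/(2*sqrt(n)) over scanning one slit at a time, which is the baseline the paper's complementary design improves on.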

  4. DCT based interpolation filter for motion compensation in HEVC

    NASA Astrophysics Data System (ADS)

    Alshin, Alexander; Alshina, Elena; Park, Jeong Hoon; Han, Woo-Jin

    2012-10-01

    The High Efficiency Video Coding (HEVC) draft standard has the challenging goal of doubling coding efficiency compared to H.264/AVC. Many aspects of the traditional hybrid coding framework were improved during development of the new standard. Motion compensated prediction, in particular the interpolation filter, is one area that was improved significantly over H.264/AVC. This paper presents the details of the interpolation filter design of the draft HEVC standard. The coding efficiency improvement over the H.264/AVC interpolation filter is studied and experimental results are presented, which show a 4.0% average bitrate reduction for the luma component and an 11.3% average bitrate reduction for the chroma components. The coding efficiency gains are significant for some video sequences and can reach up to 21.7%.
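
    To make the filtering concrete, here is a sketch applying the draft-HEVC 8-tap half-sample luma filter to a 1-D row of integer-pel samples; the edge-replication padding is a simplifying assumption, not the normative boundary handling:

    ```python
    import numpy as np

    # Draft-HEVC 8-tap half-sample luma interpolation filter (gain 64).
    HALF_PEL = np.array([-1, 4, -11, 40, 40, -11, 4, -1])

    def interp_half_pel(row):
        """Half-sample values between row[i] and row[i+1] (edge replication)."""
        padded = np.pad(row.astype(int), (3, 4), mode="edge")
        out = np.empty(len(row), dtype=int)
        for i in range(len(row)):
            out[i] = int((padded[i:i + 8] * HALF_PEL).sum())
        return (out + 32) >> 6          # round and normalize by 64

    row = np.array([100, 102, 110, 130, 160, 170, 172, 171])
    print(interp_half_pel(row))         # interpolated half-pel positions
    ```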

  5. Refactoring and Its Benefits

    NASA Astrophysics Data System (ADS)

    Veerraju, R. P. S. P.; Rao, A. Srinivasa; Murali, G.

    2010-10-01

    Refactoring is a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior. It improves internal code structure without altering external functionality by transforming functions and rethinking algorithms. It is an iterative process. Refactoring techniques include reducing scope, replacing complex instructions with simpler or built-in instructions, and combining multiple statements into one statement. Code transformed with refactoring techniques is faster to change, execute, and download. Refactoring is an excellent best practice for programmers wanting to improve their productivity. It is similar to performance optimization, which is also a behavior-preserving transformation. It also helps us find bugs when we are trying to fix a bug in difficult-to-understand code: by cleaning things up, we make it easier to expose the bug. Refactoring improves the quality of application design and implementation. In general, there are three cases concerning refactoring: iterative refactoring, refactoring when necessary, and not refactoring. Martin Fowler identifies four key reasons to refactor: refactoring improves the design of software, makes software easier to understand, helps us find bugs, and helps us program faster. There is an additional benefit: refactoring changes the way a developer thinks about the implementation even when not refactoring. There are three types of refactoring. 1) Code refactoring: often referred to simply as refactoring, this is the refactoring of programming source code. 2) Database refactoring: a simple change to a database schema that improves its design while retaining both its behavioral and informational semantics. 3) User interface (UI) refactoring: a simple change to the UI which retains its semantics. Finally, we conclude that the benefits of refactoring are: it improves the design of software, makes software easier to understand, cleans up the code, helps us find bugs, and helps us program faster.
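
    A tiny before/after illustration of the kind of behavior-preserving transformation described above (hypothetical code, not from the article), combining statements and replacing an explicit loop with a built-in:

    ```python
    # Before: verbose, multi-statement implementation.
    def total_discounted_price_before(prices, rate):
        total = 0
        for p in prices:
            discounted = p - p * rate
            total = total + discounted
        return total

    # After refactoring: same external behavior, simpler internal structure.
    def total_discounted_price(prices, rate):
        return sum(p * (1 - rate) for p in prices)

    # External behavior is preserved.
    assert total_discounted_price_before([10, 20], 0.1) == total_discounted_price([10, 20], 0.1)
    ```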

  6. Refactoring and Its Benefits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veerraju, R. P. S. P.; Rao, A. Srinivasa; Murali, G.

    2010-10-26

    Refactoring is a disciplined technique for restructuring an existing body of code, altering its internal structure without changing its external behavior. It improves internal code structure without altering external functionality by transforming functions and rethinking algorithms. It is an iterative process. Refactoring techniques include reducing scope, replacing complex instructions with simpler or built-in instructions, and combining multiple statements into one statement. Code transformed with refactoring techniques is faster to change, execute, and download. Refactoring is an excellent best practice for programmers wanting to improve their productivity. It is similar to performance optimization, which is also a behavior-preserving transformation. It also helps us find bugs when we are trying to fix a bug in difficult-to-understand code: by cleaning things up, we make it easier to expose the bug. Refactoring improves the quality of application design and implementation. In general, there are three cases concerning refactoring: iterative refactoring, refactoring when necessary, and not refactoring. Martin Fowler identifies four key reasons to refactor: refactoring improves the design of software, makes software easier to understand, helps us find bugs, and helps us program faster. There is an additional benefit: refactoring changes the way a developer thinks about the implementation even when not refactoring. There are three types of refactoring. 1) Code refactoring: often referred to simply as refactoring, this is the refactoring of programming source code. 2) Database refactoring: a simple change to a database schema that improves its design while retaining both its behavioral and informational semantics. 3) User interface (UI) refactoring: a simple change to the UI which retains its semantics. Finally, we conclude that the benefits of refactoring are: it improves the design of software, makes software easier to understand, cleans up the code, helps us find bugs, and helps us program faster.

  7. Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling, and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well-established and based on Bird's 1994 algorithms written in Fortran 77, and it has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.

  8. The effect of code expanding optimizations on instruction cache design

    NASA Technical Reports Server (NTRS)

    Chen, William Y.; Chang, Pohua P.; Conte, Thomas M.; Hwu, Wen-Mei W.

    1991-01-01

    It is shown that code expanding optimizations have strong and non-intuitive implications for instruction cache design. Three types of code expanding optimizations are studied: instruction placement, function inline expansion, and superscalar optimizations. Overall, instruction placement reduces the miss ratio of small caches. Function inline expansion improves the performance of small cache sizes but degrades the performance of medium caches. Superscalar optimizations increase the cache size required for a given miss ratio. On the other hand, they also increase the sequentiality of instruction access, so that a simple load-forward scheme effectively cancels the negative effects. Overall, it is shown that with load forwarding, the three types of code expanding optimizations jointly improve the performance of small caches and have little effect on large caches.
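
    A minimal direct-mapped instruction-cache simulator of the sort used to produce such miss-ratio comparisons; the cache parameters and address traces below are illustrative assumptions. The second trace mimics the enlarged instruction footprint produced by inline expansion, overflowing the cache:

    ```python
    def miss_ratio(trace, cache_bytes=1024, line_bytes=16):
        """Direct-mapped I-cache: index by line number, tag by the rest."""
        n_lines = cache_bytes // line_bytes
        tags = [None] * n_lines
        misses = 0
        for addr in trace:
            line = addr // line_bytes
            idx, tag = line % n_lines, line // n_lines
            if tags[idx] != tag:
                tags[idx] = tag
                misses += 1
        return misses / len(trace)

    # A loop whose body fits in 256 B vs. one expanded to a 2 KiB footprint.
    small_loop = [0x1000 + (i % 256) for i in range(10000)]
    big_loop = [0x1000 + (4 * i % 2048) for i in range(10000)]
    print(miss_ratio(small_loop), miss_ratio(big_loop))  # ~0.002 vs ~0.25
    ```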

  9. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caruso, R.

    1997-07-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal-hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines, or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).

  10. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies (code patterns, transformation scripts, and compiler plugins) to provide the scientist with an environment for quickly implementing code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.

  11. Tri-Lab Co-Design Milestone: In-Depth Performance Portability Analysis of Improved Integrated Codes on Advanced Architecture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert J.; Hammond, Simon David; Richards, David

    2017-09-01

    This milestone is a tri-lab deliverable supporting ongoing Co-Design efforts impacting applications in the Integrated Codes (IC) and Advanced Technology Development and Mitigation (ATDM) program elements. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on feeding the knowledge gained and/or code revisions back into production applications.

  12. TRIAD IV: Nationwide Survey of Medical Students' Understanding of Living Wills and DNR Orders.

    PubMed

    Mirarchi, Ferdinando L; Ray, Matthew; Cooney, Timothy

    2016-12-01

    Living wills are a form of advance directive that helps to protect patient autonomy. They are frequently encountered in the conduct of medicine. Because of their impact on care, it is important to understand the adequacy of current medical school training in preparing physicians to interpret these directives. Between April and August 2011, third- and fourth-year medical students participated in an internet survey involving the interpretation of living wills. The survey presented a standard living will as a "stand-alone" document, a standard living will with the addition of an emergent clinical scenario, and then variations of the standard living will that included a code status designation ("DNR," "Full Code," or "Comfort Care"). For each version/scenario, respondents were asked to assign a code status and choose interventions based on the cases presented. Four hundred twenty-five students from medical schools throughout the country responded. The majority indicated they had received some form of advance directive training and understood the concept of code status and the term "DNR." Based on the stand-alone document, 15% of respondents correctly denoted "full code" as the appropriate code status; adding a clinical scenario yielded negligible improvement. When a code designation was added to the living will, correct code status responses ranged from 68% to 93%, whereas correct treatment decisions ranged from 18% to 78%. Previous training in advance directives had no impact on these results. Our data indicate that the majority of students failed to understand the key elements of a living will; adding a code status designation improved correct responses, with the exception of the term DNR. Misunderstanding of advance directives is a nationwide problem and jeopardizes patient safety. Medical school ethics curricula need to be improved to ensure competency in understanding advance directives.

  13. High-speed architecture for the decoding of trellis-coded modulation

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture or by simplifying the algorithm itself. Designs employing new architectural techniques now exist; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
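
    The throughput-critical kernel in any Viterbi decoder is the add-compare-select (ACS) recursion. The sketch below shows one ACS stage for a generic 4-state trellis; the connectivity and metrics are illustrative, not the gate-level TCM design of the report:

    ```python
    import numpy as np

    def acs_step(path_metrics, branch_metrics, prev_state):
        """One trellis stage: for each next state, ADD branch costs to the
        metrics of its predecessor states, COMPARE, and SELECT the survivor.
        prev_state[s] lists the predecessors of state s; branch_metrics[s]
        holds the corresponding branch costs."""
        new_metrics = np.empty_like(path_metrics)
        decisions = np.empty(len(path_metrics), dtype=int)
        for s, preds in enumerate(prev_state):
            cands = [path_metrics[p] + bm
                     for p, bm in zip(preds, branch_metrics[s])]
            decisions[s] = int(np.argmin(cands))   # which predecessor survived
            new_metrics[s] = min(cands)
        return new_metrics, decisions

    # 4-state example: each state reachable from two predecessors.
    prev_state = [(0, 2), (0, 2), (1, 3), (1, 3)]
    pm = np.array([0.0, 1.0, 3.0, 2.0])
    bm = [(0.5, 1.5), (1.0, 0.2), (0.3, 0.9), (2.0, 0.1)]
    print(acs_step(pm, bm, prev_state))
    ```

    High-speed architectures attack exactly this loop, since the recursion's data dependence from one stage to the next limits pipelining.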

  14. AspectAssay: A Technique for Expanding the Pool of Available Aspect Mining Test Data Using Concern Seeding

    ERIC Educational Resources Information Center

    Moore, David G., Jr.

    2013-01-01

    Aspect-oriented software design (AOSD) enables better and more complete separation of concerns in software-intensive systems. By extracting aspect code and relegating crosscutting functionality to aspects, software engineers can improve the maintainability of their code by reducing code tangling and coupling of code concerns. Further, the number…

  15. Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort

    NASA Technical Reports Server (NTRS)

    Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David

    2002-01-01

    A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code, with the goals of improving loss predictions and identifying component areas for improvement. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU, and validation of the code against test data, are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experience has demonstrated large performance gains from varying manifold and heat exchanger designs to improve flow distributions in the heat exchangers; 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, the sensitivity of performance to slight changes in internal geometry, and, in general, the various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of the MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in the development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this effort.

  16. Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, M.; Cuervo, D.; Ivanov, K.

    2006-07-01

    The advanced thermal-hydraulic subchannel code COBRA-TF has recently been improved and applied to stand-alone and coupled LWR core calculations at the Pennsylvania State Univ., in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To ready COBRA-TF for academic and industrial applications, including safety margin evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)

  17. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

    The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes when designing new buildings and other structures. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted so that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available, which allows other application developers to integrate this application's results into larger applications for additional processing. Development of the application has proceeded in an iterative manner, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process; it is now used to produce over 1800 reports daily. Recent efforts have made the application a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features, ensuring the application's continued success.

  18. Insensitive Munitions Modeling Improvement Efforts

    DTIC Science & Technology

    2010-10-01

    The codes most commonly used by munition designers are CTH and the SIERRA suite of codes produced by Sandia National Labs (SNL), and ALE3D produced by Lawrence Livermore National Laboratory (LLNL). ALE3D, an LLNL-developed code, is also used by various DoD participants; it was, however, designed differently than either CTH or SIERRA.

  19. Design of a Low Aspect Ratio Transonic Compressor Stage Using CFD Techniques

    NASA Technical Reports Server (NTRS)

    Sanger, Nelson L.

    1994-01-01

    A transonic compressor stage has been designed for the Naval Postgraduate School Turbopropulsion Laboratory. The design relied heavily on CFD techniques while minimizing conventional empirical design methods. The low aspect ratio (1.2) rotor has been designed for a specific head ratio of .25 and a tip relative inlet Mach number of 1.3. Overall stage pressure ratio is 1.56. The rotor was designed using an Euler code augmented by a distributed body force model to account for viscous effects. This provided a relatively quick-running design tool, and was used for both rotor and stator calculations. The initial stator sections were sized using a compressible, cascade panel code. In addition to being used as a case study for teaching purposes, the compressor stage will be used as a research stage. Detailed measurements, including non-intrusive LDV, will be compared with the design computations, and with the results of other CFD codes, as a means of assessing and improving the computational codes as design tools.

  20. An Object-Oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2009-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for the weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case.

  1. Optical network security using unipolar Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Sarkar, Madhumita; Ghosh, Shila

    2018-04-01

    Optical code-division multiple-access (OCDMA) is considered a good technique for providing optical layer security. Many research works have been published on enhancing optical network security using optical signal processing. This paper demonstrates the design of an AWG (arrayed waveguide grating) router-based optical network for spectral-amplitude-coding (SAC) OCDMA with Walsh codes, yielding a reconfigurable network codec that changes signature codes to guard against eavesdropping. We propose a code reconfiguration scheme that improves network access confidentiality by cyclically rotating the signature codes of the OCDMA system. Each OCDMA network user is assigned a unique signature code for transmitting information, and at the receiving end each receiver correlates its own signature pattern a(n) with the received pattern s(n); a signal arriving at its proper destination satisfies s(n) = a(n).
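
    The correlation property the scheme relies on can be sketched directly: unipolar Walsh signatures derived from a Hadamard matrix, where only the matched receiver sees a correlation equal to the code weight. The Hadamard construction and threshold test are illustrative assumptions; the AWG-router optics are not modeled:

    ```python
    import numpy as np
    from scipy.linalg import hadamard

    # Unipolar Walsh codes: map the +/-1 Hadamard rows to 0/1 chips.
    W = ((hadamard(8) + 1) // 2).astype(int)

    a = W[3]                          # signature a(n) assigned to one user
    s = W[3]                          # received pattern s(n) from that user
    other = W[5]                      # pattern from a different user

    def matched(a, s):
        # Matched detection: full correlation equals the code weight.
        return int(np.dot(a, s)) == int(a.sum())

    print(matched(a, s), matched(a, other))   # True False
    ```

    The paper's reconfiguration scheme would then periodically replace each row by a cyclic rotation of itself, invalidating any signature an eavesdropper has learned.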

  2. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  3. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    PubMed

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.

  4. Design of Linear Accelerator (LINAC) tanks for proton therapy via Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellano, T.; De Palma, L.; Laneve, D.

    2015-07-01

    A homemade computer code for designing a Side-Coupled Linear Accelerator (SCL) has been written. It integrates a simplified model of SCL tanks with the Particle Swarm Optimization (PSO) algorithm. The code's main aim is to obtain useful guidelines for the design of Linear Accelerator (LINAC) resonant cavities. The design procedure assisted by this approach seems very promising, allowing future improvements towards the optimization of actual accelerating geometries. (authors)
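
    A compact sketch of the PSO loop at the heart of such a design code, minimizing a stand-in objective (the sphere function) in place of the actual SCL cavity figure of merit; all parameters are illustrative defaults:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def pso(objective, dim=4, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        x = rng.uniform(-5, 5, (n_particles, dim))   # positions (design params)
        v = np.zeros_like(x)                         # velocities
        pbest = x.copy()                             # per-particle best positions
        pbest_f = np.apply_along_axis(objective, 1, x)
        g = pbest[np.argmin(pbest_f)].copy()         # global best
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            f = np.apply_along_axis(objective, 1, x)
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            g = pbest[np.argmin(pbest_f)].copy()
        return g, objective(g)

    best, fbest = pso(lambda p: np.sum(p ** 2))   # placeholder objective
    print(best.round(3), f"{fbest:.2e}")
    ```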

  5. Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT

    NASA Technical Reports Server (NTRS)

    Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.

    1988-01-01

    A CAD interface has been created for the ACSYNT aircraft conceptual design code that permits execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System (PHIGS). The CAD/ACSYNT system is designed for use on state-of-the-art high-speed imaging workstations. Attention is given to the approaches employed in modeling, data storage, and rendering.

  6. A simple approach to improve recording of concerns about child maltreatment in primary care records: developing a quality improvement intervention

    PubMed Central

    Woodman, Jenny; Allister, Janice; Rafi, Imran; de Lusignan, Simon; Belsey, Jonathan; Petersen, Irene; Gilbert, Ruth

    2012-01-01

    Background: Information is lacking on how concerns about child maltreatment are recorded in primary care records. Aim: To determine how the recording of child maltreatment concerns can be improved. Design and setting: Development of a quality improvement intervention involving clinical audit, a descriptive survey, telephone interviews, a workshop, database analyses, and consensus development in UK general practice. Method: Descriptive analyses and incidence estimates were carried out based on 11 study practices and 442 practices in The Health Improvement Network (THIN). Telephone interviews, a workshop, and a consensus development meeting were conducted with lead GPs from the 11 study practices. Results: The rate of children with at least one maltreatment-related code was 8.4/1000 child-years (11 study practices, 2009-2010) and 8.0/1000 child-years (THIN, 2009-2010). Of 25 patients with known maltreatment, six had no maltreatment-related codes recorded, but all had relevant free text, scanned documents, or codes. When stating their reasons for under-coding maltreatment concerns, GPs cited damage to the patient relationship, uncertainty about which codes to use, and concerns about recording information on other family members in the child's records. Consensus recommendations are to record the code 'child is cause for concern' as a red flag whenever maltreatment is considered, and to use a list of codes arranged around four clinical concepts, with an option for a templated short data entry form. Conclusion: GPs under-record maltreatment-related concerns in children's electronic medical records. As failure to use codes makes it impossible to search or audit these cases, an approach designed to be simple and feasible to implement in UK general practice was recommended. PMID:22781996

  7. Improvement of the cruise performances of a wing by means of aerodynamic optimization. Validation with a Far-Field method

    NASA Astrophysics Data System (ADS)

    Jiménez-Varona, J.; Ponsin Roca, J.

    2015-06-01

    Under a contract with AIRBUS MILITARY (AI-M), an exercise to analyze the potential of optimization techniques to improve wing performance at cruise conditions has been carried out using an in-house design code. The original wing was provided by AI-M, and several constraints were imposed on the redesign. To maximize the aerodynamic efficiency at cruise, optimizations were performed using the design techniques developed internally at INTA under a research program (Programa de Termofluidodinámica). The code is a gradient-based optimization code which uses a classical finite-difference approach for gradient computations. Several techniques for search direction computation are implemented for unconstrained and constrained problems. Techniques for geometry modification are based on different approaches, including perturbation functions for the thickness and/or mean line distributions, and others based on Bézier curve fitting of a certain degree. It is very important to address a real design, which involves several constraints that significantly reduce the feasible design space, and assessment of the code is needed in order to check its capabilities and possible drawbacks. Lessons learnt will help in the development of future enhancements. In addition, the results were validated using the well-known TAU flow solver and a far-field drag method in order to accurately determine the improvement in terms of drag counts.
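
    The gradient machinery described above can be sketched in a few lines: a forward finite-difference gradient around the current design, followed by a descent step. The cheap quadratic surrogate below stands in for what would be a full flow solve per evaluation in the real design code:

    ```python
    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        """Classical forward-difference gradient: one extra evaluation per
        design variable (each would be a flow solve in the real code)."""
        f0 = f(x)
        g = np.empty_like(x)
        for i in range(len(x)):
            xp = x.copy()
            xp[i] += h
            g[i] = (f(xp) - f0) / h
        return g

    def drag_surrogate(x):            # placeholder objective, NOT a flow solver
        return np.sum((x - 0.3) ** 2) + 1.0

    x = np.zeros(5)                   # design variables (e.g., shape weights)
    for _ in range(50):
        x -= 0.4 * fd_gradient(drag_surrogate, x)    # steepest-descent step
    print(x.round(4), drag_surrogate(x))
    ```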

  8. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

    The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable by making each component of the detachable phase mask move asymmetrically. An improved Fisher-information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.

  9. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code can provide better packet-loss protection with lower computational complexity than the tTN code.
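    The cTN construction itself is not reproduced here, but the basic reason interleaving helps against burst erasures can be shown in a few lines of Python. This hypothetical sketch groups packets by an interleaved stride, adds one XOR parity per group, and recovers a burst of consecutive losses because the burst hits each group at most once:

        import numpy as np

        def encode(packets, depth):
            """Group packets by interleaved stride; append one XOR parity per group."""
            groups = [packets[i::depth] for i in range(depth)]
            parities = [np.bitwise_xor.reduce(g) for g in groups]
            return groups, parities

        def recover(group, parity, lost_idx):
            """A single erasure in a group is the XOR of the parity and the survivors."""
            survivors = [p for j, p in enumerate(group) if j != lost_idx]
            return np.bitwise_xor.reduce(survivors + [parity])

        rng = np.random.default_rng(0)
        packets = [rng.integers(0, 256, 8, dtype=np.uint8) for _ in range(6)]
        groups, parities = encode(packets, depth=3)
        for i in range(3):                    # a burst erasing packets 0, 1, 2
            g, k = i % 3, i // 3              # group and in-group position of packet i
            assert np.array_equal(recover(groups[g], parities[g], k), packets[i])
        print("burst of 3 recovered")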

  10. Optimized atom position and coefficient coding for matching pursuit-based image compression.

    PubMed

    Shoa, Alireza; Shirani, Shahram

    2009-12-01

    In this paper, we propose a new encoding algorithm for matching pursuit image coding. We show that coding performance is improved when correlations between atom positions and atom coefficients are both used in encoding. We find the optimum tradeoff between efficient atom position coding and efficient atom coefficient coding and optimize the encoder parameters. Our proposed algorithm outperforms the existing coding algorithms designed for matching pursuit image coding. Additionally, we show that our algorithm results in better rate distortion performance than JPEG 2000 at low bit rates.
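    The atom-selection stage that precedes the position/coefficient encoding discussed above is the standard greedy matching pursuit loop. The sketch below is a minimal, illustrative implementation with a random dictionary; the paper's contribution, the joint coding of positions and coefficients, is not modeled here:

        import numpy as np

        def matching_pursuit(signal, dictionary, n_atoms):
            """Greedy MP: repeatedly pick the atom most correlated with the residual."""
            residual = signal.copy()
            atoms, coeffs = [], []
            for _ in range(n_atoms):
                corr = dictionary.T @ residual      # columns assumed unit norm
                k = int(np.argmax(np.abs(corr)))
                atoms.append(k)
                coeffs.append(corr[k])
                residual = residual - corr[k] * dictionary[:, k]
            return atoms, coeffs, residual

        rng = np.random.default_rng(1)
        D = rng.standard_normal((64, 256))
        D /= np.linalg.norm(D, axis=0)              # normalize the atoms
        x = 2.0 * D[:, 10] - 1.5 * D[:, 100]        # synthetic 2-sparse signal
        atoms, coeffs, r = matching_pursuit(x, D, 5)
        print(atoms[:2], np.round(coeffs[:2], 2), round(float(np.linalg.norm(r)), 4))

    The encoder in the paper exploits the statistics of the atom-position and coefficient streams jointly; the decomposition itself is unchanged.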

  11. An Object-oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2008-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides component flow data such as airflows, temperatures, and pressures that are required for sizing the components and for weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented

  12. Heart Pump Design for Cleveland Clinic Foundation

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Through a Lewis CommTech Program project with the Cleveland Clinic Foundation, the NASA Lewis Research Center is playing a key role in the design and development of a permanently implantable, artificial heart pump assist device. Known as the Innovative Ventricular Assist System (IVAS), this device will take on the pumping role of the damaged left ventricle of the heart. The key part of the IVAS is a nonpulsatile (continuous flow) artificial heart pump with centrifugal impeller blades, driven by an electric motor. Lewis is part of an industry and academia team, led by the Ohio Aerospace Institute (OAI), that is working with the Cleveland Clinic Foundation to make IVAS a reality. This device has the potential to save tens of thousands of lives each year, since 80 percent of heart attack victims suffer irreversible damage to the left ventricle, the part of the heart that does most of the pumping. Impeller blade design codes and flow-modeling analytical codes will be used in the project. These codes were developed at Lewis for the aerospace industry but are applicable to the IVAS design project. The analytical codes, which currently simulate the flow through compressor and pump systems, will be used to simulate the flow within the blood pump in the artificial heart assist device. The Interdisciplinary Technology Office heads up Lewis' efforts in the IVAS project. With the aid of numerical modeling, many design issues of the blood pump will be addressed, including some fluid-dynamic design considerations that are unique to the properties of blood. Some of the issues that will be addressed in the design process include hemolysis, deposition, recirculation, pump efficiency, rotor thrust balance, and bearing lubrication. Optimum pumping system performance will be achieved by modeling all the interactions between the pump components. The interactions can be multidisciplinary and, therefore, are influenced not only by the fluid dynamics of adjacent components but also by thermal and structural effects. Lewis-developed flow-modeling codes to be used in the pump simulations will include a one-dimensional code and an incompressible three-dimensional Navier-Stokes flow code. These codes will analyze the prototype pump designed by the Cleveland Clinic Foundation. With an improved understanding of the flow phenomena within the prototype pump, design changes to improve the performance of the pump system can be verified by computer prior to fabrication in order to reduce risks. The use of Lewis flow modeling codes during the design and development process will improve pump system performance and reduce the number of prototypes built in the development phase. The first phase of the IVAS project is to fully develop the prototype in a laboratory environment that uses a water/glycerin mixture as the surrogate fluid to simulate blood. A later phase of the project will include testing in animals for final validation. Lewis will be involved in the IVAS project for 3 to 5 years.

  13. Building Energy Codes: Policy Overview and Good Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie

    2016-02-19

    Globally, 32% of total final energy consumption is attributed to the building sector. To reduce energy consumption, energy codes set minimum energy efficiency standards for the building sector. With effective implementation, building energy codes can support energy cost savings and complementary benefits associated with electricity reliability, air quality improvement, greenhouse gas emission reduction, increased comfort, and economic and social development. This policy brief seeks to support building code policymakers and implementers in designing effective building code programs.

  14. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. The relations between the aeroelastic properties of these new large turbines are changing, and turbine designs and control concepts are also influenced by the growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: a fast code and design optimization. This technique can be used for the rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as needs have arisen and as interest in design optimization has grown.
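    The derive-then-generate workflow described above can be sketched with open-source tools. The following illustrative Python example uses SymPy in place of Mathematica: it forms a Lagrangian for a single degree of freedom (a pendulum stands in for a turbine mode), applies the Euler-Lagrange equation, and emits the equation of motion as Fortran source:

        import sympy as sp

        t, m, l, g = sp.symbols('t m l g', positive=True)
        theta = sp.Function('theta')(t)
        thetad = theta.diff(t)

        # Lagrangian L = T - V for a pendulum (stand-in for one turbine degree of freedom).
        L = sp.Rational(1, 2) * m * l**2 * thetad**2 + m * g * l * sp.cos(theta)

        # Euler-Lagrange equation: d/dt(dL/d(theta')) - dL/d(theta) = 0
        eom = sp.simplify(sp.diff(L.diff(thetad), t) - L.diff(theta))

        # Replace derivatives with plain symbols so the Fortran printer can emit them.
        th, thd, thdd = sp.symbols('th thd thdd')
        eom = eom.subs(theta.diff(t, 2), thdd).subs(thetad, thd).subs(theta, th)

        # Emit Fortran source, mirroring VIDYN's Mathematica-to-Fortran step
        # (SymPy's fcode printer is swapped in here).
        print(sp.fcode(eom, assign_to='resid', standard=95))

    For a full turbine model the same pattern yields the subroutines containing the equations of motion, regenerated automatically whenever the model changes.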

  15. GAMERA - The New Magnetospheric Code

    NASA Astrophysics Data System (ADS)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project GAMERA, Grid Agnostic MHD for Extended Research Applications, has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure: multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations, ranging from the inner and outer heliosphere to the magnetospheres of Venus, Earth, Jupiter, and Saturn. We present example results for the Earth's magnetosphere including a coupled ring current model (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.

  16. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
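    The metamodeling idea above is easy to make concrete. The sketch below, offered as an illustration rather than any reviewed method's reference implementation, fits a kriging-style Gaussian process surrogate (scikit-learn's GaussianProcessRegressor) to a handful of runs of a stand-in "analysis code"; because the code is deterministic, the noise term is set to (numerically) zero:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def analysis_code(x):
            """Stand-in for an expensive deterministic analysis code."""
            return np.sin(3 * x) + 0.5 * x**2

        X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)   # small design of experiments
        y_train = analysis_code(X_train).ravel()

        # Kriging-style surrogate: deterministic code, so near-zero nugget (alpha).
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-10)
        gp.fit(X_train, y_train)

        X_new = np.array([[0.37], [1.42]])
        mean, std = gp.predict(X_new, return_std=True)
        print(np.round(mean, 3), np.round(std, 3))          # cheap surrogate predictions

    Once fitted, the surrogate can be called thousands of times inside an optimizer at negligible cost compared with the original code.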

  17. Partial Least Squares with Structured Output for Modelling the Metabolomics Data Obtained from Complex Experimental Designs: A Study into the Y-Block Coding.

    PubMed

    Xu, Yun; Muhamadali, Howbeer; Sayqal, Ali; Dixon, Neil; Goodacre, Royston

    2016-10-28

    Partial least squares (PLS) is one of the most commonly used supervised modelling approaches for analysing multivariate metabolomics data. PLS is typically employed as either a regression model (PLS-R) or a classification model (PLS-DA). However, in metabolomics studies it is common to investigate multiple, potentially interacting, factors simultaneously following a specific experimental design. Such data often cannot be considered as a "pure" regression or a classification problem. Nevertheless, these data have often still been treated as a regression or classification problem and this could lead to ambiguous results. In this study, we investigated the feasibility of designing a hybrid target matrix Y that better reflects the experimental design than simple regression or binary class membership coding commonly used in PLS modelling. The new design of Y coding was based on the same principle used by structural modelling in machine learning techniques. Two real metabolomics datasets were used as examples to illustrate how the new Y coding can improve the interpretability of the PLS model compared to classic regression/classification coding.
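    A minimal sketch of the Y-block idea, under assumed data: two crossed experimental factors, one categorical (dummy-coded, as in PLS-DA) and one quantitative (kept as a continuous column, as in PLS-R), stacked into a single hybrid Y so that one PLS model reflects the whole design. The factor names and effect sizes here are invented for illustration:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n = 24
        strain = rng.integers(0, 2, n)           # categorical factor (two classes)
        dose = rng.choice([0.0, 0.5, 1.0], n)    # quantitative factor

        # Hypothetical metabolite matrix influenced by both factors.
        X = rng.standard_normal((n, 50))
        X[:, 0] += 2.0 * strain
        X[:, 1] += 1.5 * dose

        # Hybrid Y: one dummy column per class plus a continuous dose column.
        Y = np.column_stack([strain == 0, strain == 1, dose]).astype(float)

        pls = PLSRegression(n_components=3).fit(X, Y)
        print(round(pls.score(X, Y), 3))         # overall fit of the structured model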

  18. A Multi-Scale, Multi-Physics Optimization Framework for Additively Manufactured Structural Components

    NASA Astrophysics Data System (ADS)

    El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel

    This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

  19. An installed nacelle design code using a multiblock Euler solver. Volume 2: User guide

    NASA Technical Reports Server (NTRS)

    Chen, H. C.

    1992-01-01

    This is a user manual for the general multiblock Euler design (GMBEDS) code. The code is for the design of a nacelle installed on a geometrically complex configuration such as a complete airplane with wing/body/nacelle/pylon. It consists of two major building blocks: a design module developed by LaRC using directive iterative surface curvature (DISC); and a general multiblock Euler (GMBE) flow solver. The flow field surrounding a complex configuration is divided into a number of topologically simple blocks to facilitate surface-fitted grid generation and improve flow solution efficiency. This user guide provides input data formats along with examples of input files and a Unix script for program execution in the UNICOS environment.

  20. Structured Design Language for Computer Programs

    NASA Technical Reports Server (NTRS)

    Pace, Walter H., Jr.

    1986-01-01

    The BOX language is used at all stages of program development. It was developed to provide improved productivity in designing, coding, and maintaining computer programs. The BOX system is written in FORTRAN 77 for batch execution.

  1. Core Physics and Kinetics Calculations for the Fissioning Plasma Core Reactor

    NASA Technical Reports Server (NTRS)

    Butler, C.; Albright, D.

    2007-01-01

    Highly efficient, compact nuclear reactors would provide high-specific-impulse spacecraft propulsion. This analysis and numerical simulation effort has focused on the technical feasibility issues related to the nuclear design characteristics of a novel reactor design. The Fissioning Plasma Core Reactor (FPCR) is a shockwave-driven gaseous-core nuclear reactor, which uses magnetohydrodynamic effects to generate electric power to be used for propulsion. The nuclear design of the system depends on two major calculations: core physics calculations and kinetics calculations. Presently, core physics calculations have concentrated on the use of the MCNP4C code; however, initial results from other codes such as COMBINE/VENTURE and SCALE4a are also shown. Several significant modifications were made to the ISR-developed QCALC1 kinetics analysis code. These modifications include testing the state of the core materials, an improvement to the calculation of the material properties of the core, the addition of an adiabatic core temperature model, and improvement of the first-order reactivity correction model. The accuracy of these modifications has been verified, and the accuracy of the point-core kinetics model used by the QCALC1 code has also been validated. Previously calculated kinetics results for the FPCR were described in the ISR report, "QCALC1: A Code for FPCR Kinetics Model Feasibility Analysis", dated June 1, 2002.

  2. VLSI design of lossless frame recompression using multi-orientation prediction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Hsuan; You, Yi-Lun; Chen, Yi-Guo

    2016-01-01

    The pursuit of high-end visual quality drives demand for higher display resolutions and higher frame rates. Hence, many powerful coding tools are aggregated in emerging video coding standards to improve coding efficiency. This also leaves video coding standards facing two design challenges: heavy computation and tremendous memory bandwidth. The first issue can be properly solved by a careful hardware architecture design with advanced semiconductor processes. Nevertheless, the second has become a critical design bottleneck for a modern video coding system. In this article, a lossless frame recompression technique using multi-orientation prediction is proposed to overcome this bottleneck. This work is realised as a silicon chip in a TSMC 0.18 µm CMOS process. Its encoding capability can reach full HD (1920 × 1080) at 48 fps. The chip power consumption is 17.31 mW at 100 MHz. The core area and chip area are 0.83 × 0.83 mm2 and 1.20 × 1.20 mm2, respectively. Experimental results demonstrate that this work exhibits outstanding lossless compression ratios with competitive hardware performance.

  3. Improved Algorithm For Finite-Field Normal-Basis Multipliers

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1989-01-01

    Improved algorithm reduces complexity of calculations that must precede design of Massey-Omura finite-field normal-basis multipliers, used in error-correcting-code equipment and cryptographic devices. Algorithm represents an extension of development reported in "Algorithm To Design Finite-Field Normal-Basis Multipliers" (NPO-17109), NASA Tech Briefs, Vol. 12, No. 5, page 82.

  4. Optimization and parallelization of the thermal–hydraulic subchannel code CTF for high-fidelity multi-physics applications

    DOE PAGES

    Salko, Robert K.; Schmidt, Rodney C.; Avramova, Maria N.

    2014-11-23

    This study describes major improvements to the computational infrastructure of the CTF subchannel code so that full-core, pincell-resolved (i.e., one computational subchannel per real bundle flow channel) simulations can now be performed in much shorter run-times, either in stand-alone mode or as part of coupled-code multi-physics calculations. These improvements support the goals of the Department of Energy Consortium for Advanced Simulation of Light Water Reactors (CASL) Energy Innovation Hub to develop high-fidelity multi-physics simulation tools for nuclear energy design and analysis.

  5. Maximum a posteriori joint source/channel coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Gibson, Jerry D.

    1991-01-01

    A maximum a posteriori probability (MAP) approach to joint source/channel coder design is presented in this paper. This method explores a technique for designing joint source/channel codes, rather than ways of distributing bits between source coders and channel coders. For a nonideal source coder, MAP arguments are used to design a decoder which takes advantage of redundancy in the source coder output to perform error correction. Once the decoder is obtained, it is analyzed with the purpose of obtaining 'desirable properties' of the channel input sequence for improving overall system performance. Finally, an encoder design which incorporates these properties is proposed.
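    The core idea, a decoder that exploits residual source redundancy as a prior, can be illustrated with a toy Python example. The alphabet, prior, and noise level below are invented for illustration; the paper's actual coder design is not reproduced:

        import numpy as np

        levels = np.array([-3.0, -1.0, 1.0, 3.0])    # channel input alphabet
        prior = np.array([0.4, 0.3, 0.2, 0.1])       # residual source redundancy
        sigma = 1.0                                  # AWGN standard deviation

        def map_decode(y):
            """argmax over x of p(y|x) P(x): Gaussian log-likelihood plus log-prior."""
            loglik = -((y[:, None] - levels[None, :]) ** 2) / (2 * sigma**2)
            return np.argmax(loglik + np.log(prior), axis=1)

        rng = np.random.default_rng(2)
        x = rng.choice(4, 100_000, p=prior)
        y = levels[x] + sigma * rng.standard_normal(x.size)
        ml = np.argmin(np.abs(y[:, None] - levels[None, :]), axis=1)  # ignores the prior
        print("MAP errors:", int(np.sum(map_decode(y) != x)),
              "ML errors:", int(np.sum(ml != x)))

    Whenever the prior is non-uniform, the MAP rule makes, in expectation, no more symbol errors than the maximum-likelihood rule that ignores the source redundancy.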

  6. Improved heliostat field design for solar tower plants

    NASA Astrophysics Data System (ADS)

    Collado, Francisco J.; Guallar, Jesús

    2017-06-01

    In solar power tower (SPT) systems, selecting the optimum location of thousands of heliostats and the most profitable tower height and receiver size remains a challenge. The Campo code is intended for the detailed design of such plants, in particular the optimum layout, provided that the plant size is known. Therefore, less exhaustive codes, such as DELSOL3, are also needed to perform preliminary parametric analyses that narrow down the most economic size of the plant.
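    Layout codes of this kind typically start from a radially staggered field. The short Python sketch below generates such a starting layout; the ring spacing, radii, and counts are hypothetical placeholders, not Campo's actual densification rules:

        import numpy as np

        def radial_staggered(n_rings, r0, dr, n_first):
            """Rings of heliostats; odd rings are offset by half an azimuthal step."""
            pts = []
            for k in range(n_rings):
                r = r0 + k * dr
                n = int(n_first * r / r0)      # keep azimuthal spacing roughly constant
                phase = 0.5 * (k % 2)          # stagger to reduce blocking by neighbors
                az = 2 * np.pi * (np.arange(n) + phase) / n
                pts.append(np.column_stack([r * np.cos(az), r * np.sin(az)]))
            return np.vstack(pts)

        field = radial_staggered(n_rings=10, r0=100.0, dr=15.0, n_first=20)
        print(field.shape)                     # (heliostat count, 2) positions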

  7. Research on Formation of Microsatellite Communication with Genetic Algorithm

    PubMed Central

    Wu, Guoqiang; Bai, Yuguang; Sun, Zhaowei

    2013-01-01

    For a formation of three microsatellites which fly in the same orbit and perform three-dimensional solid mapping of the Earth, this paper proposes an optimizing design method for the space circular formation order based on an improved genetic algorithm, and provides an intersatellite direct-sequence spread spectrum communication system. The link equation for LEO formation-flying intersatellite links is guided by the special requirements of formation-flying microsatellite intersatellite links, and the transmitter power is confirmed through simulation. The method of space circular formation order optimization based on the improved genetic algorithm is given, and it can keep the formation order stable for a long time under various perturbing forces. The intersatellite direct-sequence spread spectrum communication system is also provided. It is found that, when the distance is 1 km and the data rate is 1 Mbps, the output waveform matches the input waveform well, and LDPC coding improves the communication performance: the error-correcting capability of the (512, 256) LDPC code is distinctly better than that of the (2, 1, 7) convolutional code. The designed system can satisfy the communication requirements of microsatellites. The presented method thus provides a significant theoretical foundation for formation flying and intersatellite communication. PMID:24078796
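    As a sketch of the genetic-algorithm machinery (not the paper's actual fitness model or orbital dynamics), the following Python toy evolves a permutation that encodes an ordering of formation slots, using truncation selection and swap mutation; the cost function is an invented stand-in:

        import numpy as np

        rng = np.random.default_rng(3)

        def cost(order, sep=2):
            """Invented fitness: penalize deviation from a desired slot spacing."""
            d = np.diff(np.concatenate([order, order[:1]]))
            return float(np.sum((np.abs(d) - sep) ** 2))

        def mutate(order):
            """Swap two positions; the child remains a valid permutation."""
            i, j = rng.choice(order.size, 2, replace=False)
            child = order.copy()
            child[i], child[j] = child[j], child[i]
            return child

        pop = [rng.permutation(6) for _ in range(30)]
        for _ in range(200):
            pop.sort(key=cost)                 # rank by fitness
            elite = pop[:10]                   # truncation selection
            pop = elite + [mutate(elite[rng.integers(10)]) for _ in range(20)]
        print(pop[0], cost(pop[0]))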

  8. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented, and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
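    Of the update techniques mentioned, the ensemble Kalman filter is the simplest to sketch. The Python example below implements a perturbed-observation EnKF analysis step; the two-state system, observation operator, and covariances are illustrative choices, not taken from the presentation:

        import numpy as np

        def enkf_update(ensemble, H, y, R, rng):
            """Perturbed-observation EnKF analysis step (ensemble members are columns)."""
            n, N = ensemble.shape
            A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
            HX = H @ ensemble
            HA = HX - HX.mean(axis=1, keepdims=True)
            P_yy = HA @ HA.T / (N - 1) + R                        # innovation covariance
            P_xy = A @ HA.T / (N - 1)                             # cross-covariance
            K = P_xy @ np.linalg.inv(P_yy)                        # Kalman gain
            y_pert = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, N).T
            return ensemble + K @ (y_pert - HX)

        rng = np.random.default_rng(4)
        ens = rng.multivariate_normal([0.0, 0.0], np.eye(2), 100).T  # prior ensemble
        H = np.array([[1.0, 0.0]])                                   # observe state 1 only
        post = enkf_update(ens, H, y=np.array([1.0]), R=0.1 * np.eye(1), rng=rng)
        print(np.round(post.mean(axis=1), 2))   # posterior mean pulled toward y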

  9. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, an in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  10. Use, Assessment, and Improvement of the Loci-CHEM CFD Code for Simulation of Combustion in a Single Element GO2/GH2 Injector and Chamber

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Lin, Jeff; West, Jeff; Tucker, Kevin

    2006-01-01

    This document is a viewgraph presentation of a paper that documents a continuing effort at Marshall Space Flight Center (MSFC) to use, assess, and continually improve CFD codes to the point of material utility in the design of rocket engine combustion devices. This paper describes how the code is presently being used to simulate combustion in a single element combustion chamber with shear coaxial injectors using gaseous oxygen and gaseous hydrogen propellants. The ultimate purpose of the efforts documented is to assess and further improve the Loci-CHEM code and the implementation of it. Single element shear coaxial injectors were tested as part of the Staged Combustion Injector Technology (SCIT) program, where detailed chamber wall heat fluxes were measured. Data was taken over a range of chamber pressures for propellants injected at both ambient and elevated temperatures. Several test cases are simulated as part of the effort to demonstrate use of the Loci-CHEM CFD code and to enable us to make improvements in the code as needed. The simulations presented also include a grid independence study on hybrid grids. Several two-equation eddy viscosity low Reynolds number turbulence models are also evaluated as part of the study. All calculations are presented with a comparison to the experimental data. Weaknesses of the code relative to test data are discussed and continuing efforts to improve the code are presented.

  11. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B.; Mayhue, L.; Huria, H.

    2012-07-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000® plant, Westinghouse's next-generation Pressurized Water Reactor (PWR) design. This feature improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured by the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in the paper. (authors)

  12. Improved Simplified Methods for Effective Seismic Analysis and Design of Isolated and Damped Bridges in Western and Eastern North America

    NASA Astrophysics Data System (ADS)

    Koval, Viacheslav

    The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events that have occurred along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated through extensive parametric nonlinear time history analyses in this thesis. It was found that there is a need to adjust existing design guidelines to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method will lead to conservative results and that a wider range of seismically isolated bridges can be covered by this method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than current existing simplified methods and can be applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not been fully exploited yet to achieve enhanced performance under different levels of seismic hazard. A novel Dual-Level Seismic Protection (DLSP) concept is proposed and developed in this thesis which permits to achieve optimum seismic performance with combined isolation and supplemental damping devices in bridges. This concept is shown to represent an attractive design approach for both the upgrade of existing seismically deficient bridges and the design of new isolated bridges.

  13. LDPC-coded MIMO optical communication over the atmospheric turbulence channel using Q-ary pulse-position modulation.

    PubMed

    Djordjevic, Ivan B

    2007-08-06

    We describe a coded power-efficient transmission scheme based on the repetition MIMO principle suitable for communication over the atmospheric turbulence channel, and determine its channel capacity. The proposed scheme employs Q-ary pulse-position modulation. We further study how to approach the channel capacity limits using low-density parity-check (LDPC) codes. Component LDPC codes are designed using the concept of pairwise-balanced designs. Contrary to several recent publications, bit-error rates and channel capacities are reported assuming non-ideal photodetection. The atmospheric turbulence channel is modeled using the Gamma-Gamma distribution function due to Al-Habash et al. Excellent bit-error rate performance improvement over the uncoded case is found.
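    The Gamma-Gamma turbulence model is straightforward to simulate: the irradiance is the product of two independent unit-mean Gamma variates representing large- and small-scale eddies. The sketch below (parameter values chosen only for illustration) checks the sampled scintillation index against the closed-form value 1/alpha + 1/beta + 1/(alpha*beta):

        import numpy as np

        def gamma_gamma_irradiance(alpha, beta, size, rng):
            """Product of two independent unit-mean Gamma variates (unit-mean irradiance)."""
            x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=size)  # large-scale eddies
            y = rng.gamma(shape=beta, scale=1.0 / beta, size=size)    # small-scale eddies
            return x * y

        rng = np.random.default_rng(5)
        a, b = 4.0, 1.9
        I = gamma_gamma_irradiance(a, b, 200_000, rng)
        si_sampled = I.var() / I.mean() ** 2            # scintillation index
        si_theory = 1 / a + 1 / b + 1 / (a * b)
        print(round(si_sampled, 3), round(si_theory, 3))

    Fading samples generated this way can drive Monte Carlo bit-error-rate runs for any modulation and code under test.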

  14. Long term integrity for space station power systems

    NASA Technical Reports Server (NTRS)

    Leckie, F. A.; Marriott, D. L.

    1991-01-01

    A study was made of the high-temperature design codes ASME N47, British R5, and the French RCC-MR rules. It is concluded that all of these codes provide a good basis of design for space application. The new British R5 is the most complete, since it deals with the problem of defects; ASME N47 has been subject to practical application and scrutiny for longer. A draft code for high-temperature design is introduced, in which attempts were made to identify gaps and improvements. The design is limited by creep characteristics. In these circumstances, life is strongly affected by the selected value of the factor of safety. The factor of safety on primary loads adopted in the codes is 1.5; a lower value of 1.25 may be permissible for use in space. Long-term creep rupture data for HAYNES 188 are deficient, and it is suggested that extrapolation methods be investigated.

  15. Final Report Advanced Quasioptical Launcher System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey Neilson

    2010-04-30

    This program developed an analytical design tool for designing antenna and mirror systems to convert whispering gallery RF modes to Gaussian or HE11 modes. Whispering gallery modes are generated by gyrotrons used for electron cyclotron heating of fusion plasmas in tokamaks. These modes cannot be easily transmitted and must be converted to free space or waveguide modes compatible with transmission line systems. This program improved the capability of SURF3D/LOT, which was initially developed in a previous SBIR program. This suite of codes revolutionized quasi-optical launcher design, and this code, or equivalent codes, are now used worldwide. This program added functionality to SURF3D/LOT to allow creation of more compact launcher and mirror systems and to provide direct coupling to corrugated waveguide within the vacuum envelope of the gyrotron. Analysis was also extended to include full-wave analysis of mirror transmission line systems. The code includes a graphical user interface and is available for advanced design of launcher systems.

  16. High-resolution coded-aperture design for compressive X-ray tomography using low resolution detectors

    NASA Astrophysics Data System (ADS)

    Mojica, Edson; Pertuz, Said; Arguello, Henry

    2017-12-01

    One of the main challenges in Computed Tomography (CT) is obtaining accurate reconstructions of the imaged object while keeping a low radiation dose in the acquisition process. In order to solve this problem, several researchers have proposed the use of compressed sensing for reducing the amount of measurements required to perform CT. This paper tackles the problem of designing high-resolution coded apertures for compressed sensing computed tomography. In contrast to previous approaches, we aim at designing apertures to be used with low-resolution detectors in order to achieve super-resolution. The proposed method iteratively improves random coded apertures using a gradient descent algorithm subject to constraints in the coherence and homogeneity of the compressive sensing matrix induced by the coded aperture. Experiments with different test sets show consistent results for different transmittances, number of shots and super-resolution factors.
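    The iterative mask-improvement idea can be sketched with a simple surrogate. The Python example below runs projected gradient descent on ||G - I||_F^2, where G is the Gram matrix of the column-normalized sensing matrix, re-projecting entries to physical transmittances in [0, 1]; the gradient treats the normalization as fixed at each step, and the matrix sizes are arbitrary. This illustrates the general approach, not the authors' algorithm or constraints:

        import numpy as np

        def improve_mask(A, steps=500, lr=1e-3):
            """Projected gradient descent on a mutual-coherence surrogate."""
            for _ in range(steps):
                An = A / np.linalg.norm(A, axis=0, keepdims=True)
                G = An.T @ An
                grad = 4 * An @ (G - np.eye(G.shape[0]))  # approximate gradient
                A = np.clip(A - lr * grad, 0.0, 1.0)      # project to transmittances
            return A

        def coherence(A):
            An = A / np.linalg.norm(A, axis=0, keepdims=True)
            return float((np.abs(An.T @ An) - np.eye(A.shape[1])).max())

        rng = np.random.default_rng(6)
        A0 = rng.random((32, 64))                         # random initial aperture code
        A1 = improve_mask(A0.copy())
        print(round(coherence(A0), 3), "->", round(coherence(A1), 3))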

  17. HEC Applications on Columbia Project

    NASA Technical Reports Server (NTRS)

    Taft, Jim

    2004-01-01

    NASA's Columbia system consists of a cluster of twenty 512-processor SGI Altix systems. Each of these systems offers 3 TFLOP/s in peak performance - approximately the same as the entire compute capability at NAS just one year ago. Each 512p system is a single-system-image machine with one Linux OS, one high-performance file system, and one globally shared memory. The NAS Terascale Applications Group (TAG) is chartered to assist in scaling NASA's mission-critical codes to at least 512p in order to significantly improve emergency response during flight operations, as well as provide significant improvements in the codes and in the rate of scientific discovery across the scientific disciplines within NASA's missions. Recent accomplishments are 4x improvements to codes in the ocean modeling community, 10x performance improvements in a number of computational fluid dynamics codes used in aero-vehicle design, and 5x improvements in a number of space science codes dealing in extreme physics. The TAG group will continue its scaling work to 2048p and beyond (10240 CPUs) as the Columbia system becomes fully operational and the upgrades to the SGI NUMAlink memory fabric are in place. The NUMAlink upgrades dramatically improve system scalability for a single application. These upgrades will allow a number of codes to execute faster at higher fidelity than ever before on any other system, thus increasing the rate of scientific discovery even further.

  18. Improving coding accuracy in an academic practice.

    PubMed

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small group case review, and large group discussion. Outcome measures: overall coding accuracy (compliance) percentage, and coding accuracy per year group for the subjects that were able to participate longitudinally. Statistical tests used: average coding accuracy for the population; paired t test to assess improvement between the two intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M) = 26.4%, SD = 10%) to accuracy rates after all educational interventions were complete (M = 26.8%, SD = 12%); t(24) = -0.127, P = .90. Didactic teaching and small group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.
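    The paired t test used above is a one-liner with SciPy. The per-provider accuracy values below are invented solely to show the mechanics of the calculation:

        import numpy as np
        from scipy import stats

        # Hypothetical coding accuracy (%) per provider, before and after the curriculum.
        before = np.array([26, 31, 18, 25, 40, 22, 29, 24, 33, 16], dtype=float)
        after = np.array([27, 30, 20, 24, 41, 23, 28, 26, 34, 15], dtype=float)

        t, p = stats.ttest_rel(after, before)    # paired t test on the same subjects
        print(f"mean change = {np.mean(after - before):+.1f} pts, t = {t:.2f}, p = {p:.2f}")

    A large p value, as in the study, means the observed change is indistinguishable from noise for this sample size.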

  19. Application of computational fluid dynamics and laminar flow technology for improved performance and sonic boom reduction

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1992-01-01

    A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.

  1. Combined trellis coding with asymmetric MPSK modulation: An MSAT-X report

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Divsalar, D.

    1985-01-01

    Traditionally, symmetric multiple phase-shift-keyed (MPSK) signal constellations, i.e., those with uniformly spaced signal points around the circle, have been used for both uncoded and coded systems. Although symmetric MPSK signal constellations are optimum for systems with no coding, the same is not necessarily true for coded systems. It is shown that by designing the signal constellations to be asymmetric, one can, in many instances, obtain a significant performance improvement over the traditional symmetric MPSK constellations combined with trellis coding. The joint design of rate n/(n + 1) trellis codes and asymmetric 2^(n+1)-point MPSK is considered, which has unity bandwidth expansion relative to uncoded 2^n-point symmetric MPSK. The asymptotic performance gains due to coding and asymmetry are evaluated in terms of the minimum free Euclidean distance, d_free, of the trellis code. A comparison of the maximum value of this performance measure with the minimum distance d_min of the uncoded system indicates the maximum reduction in required E_b/N_0 that can be achieved for arbitrarily small system bit-error rates. It is to be emphasized that the introduction of asymmetry into the signal set does not affect the bandwidth or power requirements of the system; hence, the above-mentioned improvements in performance come at little or no cost. Asymmetric MPSK signal sets in coded systems also appear in the work of Divsalar.

  2. Axial and Centrifugal Compressor Mean Line Flow Analysis Method

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2009-01-01

    This paper describes a method to estimate key aerodynamic parameters of single and multistage axial and centrifugal compressors. This mean-line compressor code, COMDES, provides the capability of sizing single and multistage compressors quickly during the conceptual design process. Based on the compressible fluid flow equations and the Euler equation, the code can estimate rotor inlet and exit blade angles when run in the design mode. The design-point rotor efficiency and stator losses are inputs to the code and are modeled at off-design conditions. When run in the off-design analysis mode, it can be used to generate performance maps based on simple models for losses due to rotor incidence and inlet guide vane reset angle. The code can provide an improved understanding of basic aerodynamic parameters such as diffusion factor, loading levels, and incidence when matching multistage compressor blade rows at design and at part-speed operation. Rotor loading levels and relative velocity ratio are correlated to the onset of compressor surge. The NASA Stage 37 and three-stage NASA 74-A axial compressors were analyzed and the results compared to test data. The code has been used to generate the performance map for the NASA 76-B three-stage axial compressor featuring variable geometry. The compressor stages were aerodynamically matched at off-design speeds by adjusting the variable inlet guide vane and variable stator geometry angles to control the rotor diffusion factor and incidence angles.
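    At the heart of any mean-line method is the Euler turbomachinery equation. The small Python function below applies it to one stage and converts the work input to a pressure ratio; the wheel speeds, swirl velocities, and efficiency are illustrative numbers, not COMDES inputs or the NASA test-case values:

        def euler_stage(U1, U2, Cu1, Cu2, T01, eta, cp=1004.5, gamma=1.4):
            """Euler work equation for one compressor stage, then the pressure ratio."""
            dh0 = U2 * Cu2 - U1 * Cu1                 # specific work input, J/kg
            T02 = T01 + dh0 / cp                      # stagnation temperature rise
            PR = (1.0 + eta * dh0 / (cp * T01)) ** (gamma / (gamma - 1.0))
            return dh0, T02, PR

        dh0, T02, PR = euler_stage(U1=350.0, U2=350.0, Cu1=50.0, Cu2=180.0,
                                   T01=288.15, eta=0.88)
        print(f"work = {dh0 / 1000:.1f} kJ/kg, T02 = {T02:.1f} K, PR = {PR:.2f}")

    Chaining such stage calculations, with loss models supplying eta at off-design, is essentially how a mean-line code marches through a multistage machine.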

  3. Statistical Analysis of CFD Solutions from 2nd Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, M. J.; Morrison, J. H.

    2004-01-01

    In June 2001, the first AIAA Drag Prediction Workshop was held to evaluate results obtained from extensive N-Version testing of a series of RANS CFD codes. The geometry used for the computations was the DLR-F4 wing-body combination which resembles a medium-range subsonic transport. The cases reported include the design cruise point, drag polars at eight Mach numbers, and drag rise at three values of lift. Although comparisons of the code-to-code medians with available experimental data were similar to those obtained in previous studies, the code-to-code scatter was more than an order-of-magnitude larger than expected and far larger than desired for design and for experimental validation. The second Drag Prediction Workshop was held in June 2003 with emphasis on the determination of installed pylon-nacelle drag increments and on grid refinement studies. The geometry used was the DLR-F6 wing-body-pylon-nacelle combination for which the design cruise point and the cases run were similar to the first workshop except for additional runs on coarse and fine grids to complement the runs on medium grids. The code-to-code scatter was significantly reduced for the wing-body configuration compared to the first workshop, although still much larger than desired. However, the grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement.

  4. FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.

    The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE’s leadership role in fast reactor safety both domestically and in international collaborations.

  5. Design and implementation of H.264 based embedded video coding technology

    NASA Astrophysics Data System (ADS)

    Mao, Jian; Liu, Jinming; Zhang, Jiemin

    2016-03-01

    In this paper, an embedded system for remote online video monitoring was designed and developed to capture and record real-time conditions in an elevator. For the purpose of improving the efficiency of video acquisition and processing, the system selected the Samsung S5PV210 chip, which integrates a graphics processing unit, as the core processor. The video was encoded in the H.264 format for efficient storage and transmission. Based on the S5PV210 chip, hardware video coding technology was researched, which is more efficient than software coding. Running tests proved that the hardware video coding technology could obviously reduce the cost of the system and obtain smoother video display. It can be widely applied for security supervision [1].

  6. Design of a double-anode magnetron-injection gun for the W-band gyrotron

    NASA Astrophysics Data System (ADS)

    Jang, Kwang Ho; Choi, Jin Joo; So, Joon Ho

    2015-07-01

    A double-anode magnetron-injection gun (MIG) was designed for a W-band 10-kW gyrotron. Analytic equations based on adiabatic theory and angular momentum conservation were used to examine the initial design parameters, such as the cathode angle and the radius of the beam-emitting surface. The MIG's performance was predicted by using an electron trajectory code, the EGUN code. The axial velocity spread of the beam, Δvz/vz, obtained from the EGUN code was observed to be 1.34% at α = 1.3. The cathode edge emission and the thermal effect were modeled, and the cathode edge emission was found to have a major effect on the velocity spread. The electron beam's quality was significantly improved by affixing non-emissive cylinders to the cathode.

  7. Fuel-Air Mixing and Combustion in Scramjets

    NASA Technical Reports Server (NTRS)

    Drummond, J. P.; Diskin, Glenn S.; Cutler, A. D.

    2002-01-01

    Activities in the area of scramjet fuel-air mixing and combustion associated with the Research and Technology Organization Working Group on Technologies for Propelled Hypersonic Flight are described. Work discussed in this paper has centered on the design of two basic experiments for studying the mixing and combustion of fuel and air in a scramjet. Simulations were conducted to aid in the design of these experiments. The experimental models were then constructed, and data were collected in the laboratory. Comparison of the data from a coaxial jet mixing experiment and a supersonic combustor experiment with a combustor code were then made and described. This work was conducted by NATO to validate combustion codes currently employed in scramjet design and to aid in the development of improved turbulence and combustion models employed by the codes.

  8. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, new families of QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  9. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.

  10. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) the emergence of a new spectral format outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity, and improved seismic hazard modeling techniques. Therefore, the new maps are more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as was done for its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During the process of updating the codes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10 to 2% and from 10 to 5%, respectively. This study investigates five sets of hazard maps corresponding to 2%, 5% and 10% probability of exceedance in 50 years developed by the GSC. To obtain sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).

  11. Aerothermal modeling program, phase 2. Element B: Flow interaction experiment

    NASA Technical Reports Server (NTRS)

    Nikjooy, M.; Mongia, H. C.; Murthy, S. N. B.; Sullivan, J. P.

    1986-01-01

    The goal of the program was to improve the design process and to enhance the efficiency, life, and maintenance costs of the turbine engine hot section. Recently, there has been much emphasis on the need for improved numerical codes for the design of efficient combustors. For the development of improved computational codes, there is a need for an experimentally obtained database to be used as test cases for assessing the accuracy of the computations. The purpose of Element B is to establish benchmark-quality velocity and scalar measurements of the interaction of circular jets with swirling flow typical of that in the dome region of an annular combustor. In addition to the detailed experimental effort, extensive computations of the swirling flows are to be compared with the measurements for the purpose of assessing the accuracy of current and advanced turbulence and scalar transport models.

  12. Description of the L1C signal

    USGS Publications Warehouse

    Betz, J.W.; Blanco, M.A.; Cahn, C.R.; Dafesh, P.A.; Hegarty, C.J.; Hudnut, K.W.; Kasemsri, V.; Keegan, R.; Kovach, K.; Lenahan, L.S.; Ma, H.H.; Rushanan, J.J.; Sklar, D.; Stansell, T.A.; Wang, C.C.; Yi, S.K.

    2006-01-01

    Detailed design of the modernized L1 civil signal (L1C) has been completed, and the resulting draft Interface Specification IS-GPS-800 was released in Spring 2006. The novel characteristics of the optimized L1C signal design provide advanced capabilities while offering receiver designers considerable flexibility in how to use these capabilities. L1C provides a number of advanced features, including: 75% of power in a pilot component for enhanced signal tracking, advanced Weil-based spreading codes, an overlay code on the pilot that provides data message synchronization, support for improved reading of clock and ephemeris by combining message symbols across messages, advanced forward error control coding, and data symbol interleaving to combat fading. The resulting design offers receiver designers the opportunity to obtain unmatched performance in many ways. This paper describes the design of L1C. A summary of L1C's background and history is provided. The signal description then proceeds with the overall signal structure, consisting of a pilot component and a data component. The new L1C spreading code family is described, along with the logic used for generating these spreading codes. Overlay codes on the pilot channel are also described, as is the logic used for generating the overlay codes. Spreading modulation characteristics are summarized. The data message structure is also presented, showing the format for providing time, ephemeris, and system data to users, along with features that enable receivers to perform code combining. Encoding of rapidly changing time bits is described, as are the Low Density Parity Check codes used for forward error control of slowly changing time bits, clock, ephemeris, and system data. The structure of the interleaver is also presented. A summary of L1C's unique features and their benefits is provided, along with a discussion of the plan for L1C implementation.
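
    The Weil-based spreading codes mentioned above are constructed from Legendre sequences. The sketch below shows the generic Weil construction in Python; the specific L1C parameters (prime length, shift set, and the inserted expansion sequence defined in IS-GPS-800) are omitted, so treat it as an illustration of the idea rather than the specification:

        def legendre_sequence(p: int) -> list[int]:
            """Binary Legendre sequence of prime length p (one common convention):
            bit t is 1 iff t is a nonzero quadratic residue mod p."""
            residues = {(x * x) % p for x in range(1, p)}
            return [1 if t in residues else 0 for t in range(p)]

        def weil_code(p: int, k: int) -> list[int]:
            """Weil code with shift k (1 <= k <= (p - 1) // 2): the XOR of a
            Legendre sequence with a cyclically shifted copy of itself."""
            L = legendre_sequence(p)
            return [L[t] ^ L[(t + k) % p] for t in range(p)]

        # Tiny demonstration with p = 11; L1C itself uses a much longer prime
        # length plus a short inserted expansion sequence, per IS-GPS-800.
        print(weil_code(11, 3))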

  13. Rotor cascade shape optimization with unsteady passing wakes using implicit dual time stepping method

    NASA Astrophysics Data System (ADS)

    Lee, Eun Seok

    2000-10-01

    Improved aerodynamic performance of a turbine cascade can be achieved through an understanding of the flow field associated with the stator-rotor interaction. In this research, an axial gas turbine airfoil cascade shape is optimized for improved aerodynamic performance by using an unsteady Navier-Stokes solver and a parallel genetic algorithm. The objective of the research is twofold: (1) to develop a computational fluid dynamics code having a faster convergence rate and unsteady flow simulation capabilities, and (2) to optimize a turbine airfoil cascade shape with unsteady passing wakes for improved aerodynamic performance. The computer code solves the Reynolds-averaged Navier-Stokes equations. It is based on the explicit, finite difference, Runge-Kutta time marching scheme and the Diagonalized Alternating Direction Implicit (DADI) scheme, with the Baldwin-Lomax algebraic and k-epsilon turbulence models. Improvements to the code focused on cascade shape design capability, convergence acceleration, and unsteady formulation. First, the inverse shape design method was implemented in the code to provide the design capability, using a surface transpiration concept as the inverse technique to modify the geometry to satisfy a user-specified pressure distribution on the airfoil surface. Second, an approximation storage multigrid method was implemented as an acceleration technique. Third, a preconditioning method was adopted to speed up the convergence rate in solving low Mach number flows. Finally, the implicit dual time stepping method was incorporated in order to simulate unsteady flow fields. For validation of the unsteady code, Stokes' second problem and Poiseuille flow were chosen, and the computed results were compared with analytic solutions. To test the code's ability to capture natural unsteady flow phenomena, vortex shedding past a cylinder and shock oscillation over a bicircular airfoil were simulated and compared with experiments and other published results. The rotor cascade shape optimization with unsteady passing wakes was then performed using the unsteady Navier-Stokes solver. Two objective functions were defined: minimization of total pressure loss and maximization of lift, with the mass flow rate fixed. A parallel genetic algorithm was used as the optimizer, and a penalty method was introduced. Each individual's objective function was computed simultaneously on a 32-processor distributed memory computer. One optimization took about four days.
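
    A minimal sketch of the implicit dual time stepping idea applied to a model ODE du/dt = f(u): each physical (implicit) time step is converged by sub-iterating in pseudo-time until the unsteady residual vanishes. The function names and the first-order (BDF1) discretization are illustrative assumptions, not the dissertation's actual scheme:

        def dual_time_step(u_n, f, dt, n_inner=50, dtau=0.01):
            """Advance u by one implicit (BDF1) physical step by driving the
            unsteady residual R(u) = f(u) - (u - u_n)/dt to zero with explicit
            pseudo-time iterations."""
            u = u_n
            for _ in range(n_inner):
                residual = f(u) - (u - u_n) / dt
                u = u + dtau * residual      # pseudo-time relaxation
            return u

        # Usage: decay equation du/dt = -u, one step of size dt = 0.1
        u = dual_time_step(1.0, lambda x: -x, dt=0.1)
        print(u)   # close to the implicit-Euler value 1/(1 + 0.1) ~ 0.909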

  14. Contextual Constraint Treatment for coarse coding deficit in adults with right hemisphere brain damage: Generalization to narrative discourse comprehension

    PubMed Central

    Blake, Margaret Lehman; Tompkins, Connie A.; Scharp, Victoria L.; Meigh, Kimberly M.; Wambaugh, Julie

    2014-01-01

    Coarse coding is the activation of broad semantic fields that can include multiple word meanings and a variety of features, including those peripheral to a word’s core meaning. It is a partially domain-general process related to general discourse comprehension and contributes to both literal and non-literal language processing. Adults with damage to the right cerebral hemisphere (RHD) and a coarse coding deficit are particularly slow to activate features of words that are relatively distant or peripheral. This manuscript reports a pre-efficacy study of Contextual Constraint Treatment (CCT), a novel, implicit treatment designed to increase the efficiency of coarse coding with the goal of improving narrative comprehension and other language performance that relies on coarse coding. Participants were four adults with RHD. The study used a single-subject controlled experimental design across subjects and behaviors. The treatment involves pre-stimulation, using a hierarchy of strong- and moderately-biased contexts, to prime the intended distantly-related features of critical stimulus words. Three of the four participants exhibited gains in auditory narrative discourse comprehension, the primary outcome measure. All participants exhibited generalization to untreated items. No strong generalization to processing nonliteral language was evident. The results indicate that CCT yields both improved efficiency of the coarse coding process and generalization to narrative comprehension. PMID:24983133

  15. On the optimum signal constellation design for high-speed optical transport networks.

    PubMed

    Liu, Tao; Djordjevic, Ivan B

    2012-08-27

    In this paper, we first describe an optimum signal constellation design algorithm, optimum in the MMSE sense and called MMSE-OSCD, for a channel-capacity-achieving source distribution. Second, we introduce a feedback-channel-capacity-inspired optimum signal constellation design (FCC-OSCD) to further improve on MMSE-OSCD, motivated by the fact that feedback channel capacity is higher than that of systems without feedback. The constellations obtained by FCC-OSCD are, however, OSNR dependent. The optimization is performed jointly with regular quasi-cyclic low-density parity-check (LDPC) code design. The resulting coded-modulation scheme, in combination with polarization multiplexing, is suitable as an enabling technology for both 400 Gb/s and multi-Tb/s optical transport. Using a large-girth LDPC code, we demonstrate by Monte Carlo simulations that a 32-ary signal constellation obtained by FCC-OSCD outperforms the previously proposed optimized 32-ary CIPQ signal constellation by 0.8 dB at a BER of 10^-7. In addition, the LDPC-coded 16-ary FCC-OSCD outperforms 16-QAM by 1.15 dB at the same BER.
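
    The details of MMSE-OSCD are in the paper; its MMSE flavor is closely related to Lloyd-style clustering of samples drawn from the source distribution, sketched here as a stand-in (the clustering stand-in and all parameters are assumptions, not the authors' algorithm):

        import numpy as np

        def mmse_constellation(num_points=32, num_samples=50_000, iters=60, seed=0):
            """Lloyd-style design: choose constellation points minimizing the MSE
            to samples from a circularly symmetric Gaussian source."""
            rng = np.random.default_rng(seed)
            samples = rng.normal(size=(num_samples, 2))      # I/Q pairs
            points = samples[rng.choice(num_samples, num_points, replace=False)]
            for _ in range(iters):
                d = ((samples[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
                nearest = d.argmin(axis=1)                   # quantize each sample
                for k in range(num_points):                  # centroid update
                    mask = nearest == k
                    if mask.any():
                        points[k] = samples[mask].mean(axis=0)
            return points

        constellation = mmse_constellation()
        print(constellation[:4])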

  16. Evaluation of Force Transfer Around Openings - Experimental and Analytical Studies

    Treesearch

    Borjen Yeh; Tom Skaggs; Frank Lam; Minghao Li; Douglas Rammer; James Wacker

    2011-01-01

    Wood structural panel (WSP) sheathed shear walls and diaphragms are the primary lateral-load-resisting elements in wood-frame construction. The historical performance of light-frame structures in North America is very good due, in part, to model building codes that are designed to safeguard life safety. These model building codes have spawned continual improvement and...

  17. Reds, Greens, Yellows Ease the Spelling Blues.

    ERIC Educational Resources Information Center

    Irwin, Virginia

    1971-01-01

    This document reports on a color-coding innovation designed to improve the spelling ability of high school seniors. This color-coded system is based on two assumptions: that color will appeal to the students and that there are three principal reasons for misspelling. Two groups were chosen for the experiments. A basic list of spelling demons was…

  18. Modelling Force Transfer Around Openings of Full-Scale Shear Walls

    Treesearch

    Tom Skaggs; Borjen Yeh; Frank Lam; Minghao Li; Doug Rammer; James Wacker

    2011-01-01

    Wood structural panel (WSP) sheathed shear walls and diaphragms are the primary lateral-load-resisting elements in wood-frame construction. The historical performance of light-frame structures in North America has been very good due, in part, to model building codes that are designed to preserve life safety. These model building codes have spawned continual improvement...

  19. Automatic finite element generators

    NASA Technical Reports Server (NTRS)

    Wang, P. S.

    1984-01-01

    The design and implementation of a software system for generating finite elements and related computations are described. Exact symbolic computational techniques are employed to derive strain-displacement matrices and element stiffness matrices. Methods for dealing with the excessive growth of symbolic expressions are discussed. Automatic FORTRAN code generation is described with emphasis on improving the efficiency of the resultant code.
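
    As a small illustration of the symbolic approach described (not the system itself), SymPy can derive the classic two-node bar element stiffness matrix from its shape functions:

        import sympy as sp

        x, L, E, A = sp.symbols("x L E A", positive=True)

        # Linear shape functions for a two-node bar element on [0, L]
        N = sp.Matrix([1 - x / L, x / L])
        B = N.diff(x)                      # strain-displacement matrix

        # Element stiffness: integral of E*A * B * B^T over the element
        K = sp.integrate(E * A * B * B.T, (x, 0, L))
        print(sp.simplify(K))              # (E*A/L) * [[1, -1], [-1, 1]]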

  20. Optimal patch code design via device characterization

    NASA Astrophysics Data System (ADS)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The method optimizes the tradeoff between the number of available code levels and decoding robustness against noise from the printing and measurement devices, while minimizing printing and measurement effort. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.
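
    One simplified instance of the idea, assuming code levels are spread uniformly in lightness (CIE L*) and decoded by nearest neighbor; the paper's actual method places levels using a characterized device model and noise statistics:

        import numpy as np

        def design_levels(num_levels: int, l_min=5.0, l_max=95.0) -> np.ndarray:
            """Place code levels uniformly in CIE L* so adjacent levels are
            maximally separated for a given count."""
            return np.linspace(l_min, l_max, num_levels)

        def decode_patch(measured_l: float, levels: np.ndarray) -> int:
            """Decode a measured L* value to the index of the nearest code level."""
            return int(np.abs(levels - measured_l).argmin())

        levels = design_levels(8)
        print(levels)
        print(decode_patch(33.0, levels))   # -> index of the closest of 8 levels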

  1. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamics simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers to further test and improve the VASIMR engine.

  2. Potential of coded excitation in medical ultrasound imaging.

    PubMed

    Misaridis, T X; Gammelmark, K; Jørgensen, C H; Lindberg, N; Thomsen, A H; Pedersen, M H; Jensen, J A

    2000-03-01

    Improvement in signal-to-noise ratio (SNR) and/or penetration depth can be achieved in medical ultrasound by using long coded waveforms, in a similar manner as in radars or sonars. However, the time-bandwidth product (TB) improvement, and thereby SNR improvement is considerably lower in medical ultrasound, due to the lower available bandwidth. There is still space for about 20 dB improvement in the SNR, which will yield a penetration depth up to 20 cm at 5 MHz [M. O'Donnell, IEEE Trans. Ultrason. Ferroelectr. Freq. Contr., 39(3) (1992) 341]. The limited TB additionally yields unacceptably high range sidelobes. However, the frequency weighting from the ultrasonic transducer's bandwidth, although suboptimal, can be beneficial in sidelobe reduction. The purpose of this study is an experimental evaluation of the above considerations in a coded excitation ultrasound system. A coded excitation system based on a modified commercial scanner is presented. A predistorted FM signal is proposed in order to keep the resulting range sidelobes at acceptably low levels. The effect of the transducer is taken into account in the design of the compression filter. Intensity levels have been considered and simulations on the expected improvement in SNR are also presented. Images of a wire phantom and clinical images have been taken with the coded system. The images show a significant improvement in penetration depth and they preserve both axial resolution and contrast.
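
    A compact sketch of the processing chain described: a linear FM (chirp) excitation compressed by a time-reversed, amplitude-weighted filter, with a Hann taper standing in for the transducer's frequency weighting. All parameter values are illustrative, not those of the modified scanner:

        import numpy as np

        fs, f0, bandwidth, duration = 40e6, 5e6, 4e6, 20e-6
        t = np.arange(0, duration, 1 / fs)

        # Linear FM (chirp) excitation sweeping f0 - B/2 .. f0 + B/2
        phase = 2 * np.pi * ((f0 - bandwidth / 2) * t
                             + bandwidth * t**2 / (2 * duration))
        chirp = np.sin(phase)

        # Compression filter: time-reversed chirp, amplitude-weighted to suppress
        # range sidelobes (a Hann taper stands in for the transducer weighting)
        compression = (np.hanning(len(chirp)) * chirp)[::-1]

        echo = np.concatenate([np.zeros(500), chirp, np.zeros(500)])  # toy echo
        compressed = np.convolve(echo, compression, mode="same")
        print(np.argmax(np.abs(compressed)))   # peak near the scatterer position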

  3. Improved Algorithms Speed It Up for Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A

    2005-09-20

    Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.

  4. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  5. Improved design of subcritical and supercritical cascades using complex characteristics and boundary layer correction

    NASA Technical Reports Server (NTRS)

    Sanz, J. M.

    1983-01-01

    The method of complex characteristics and hodograph transformation for the design of shockless airfoils was extended to design supercritical cascades with high solidities and large inlet angles. This capability was achieved by introducing a conformal mapping of the hodograph domain onto an ellipse and expanding the solution in terms of Tchebycheff polynomials. A computer code was developed based on this idea. A number of airfoils designed with the code are presented. Various supercritical and subcritical compressor, turbine and propeller sections are shown. The lag-entrainment method for the calculation of a turbulent boundary layer was incorporated into the inviscid design code. The results of this calculation are shown for the airfoils described. The elliptic conformal transformation developed to map the hodograph domain onto an ellipse can be used to generate a conformal grid in the physical domain of a cascade of airfoils with open trailing edges with a single transformation. A grid generated with this transformation is shown for the Korn airfoil.

  6. Area, speed and power measurements of FPGA-based complex orthogonal space-time block code channel encoders

    NASA Astrophysics Data System (ADS)

    Passas, Georgios; Freear, Steven; Fawcett, Darren

    2010-01-01

    Space-time coding (STC) is an important milestone in modern wireless communications. In this technique, multiple copies of the same signal are transmitted through different antennas (space) and in different symbol periods (time) to improve the robustness of a wireless system by increasing its diversity gain. STCs are channel coding algorithms that can be readily implemented on a field programmable gate array (FPGA) device. This work provides figures for the FPGA hardware resources required, the speed at which the algorithms can operate, and the power consumption requirements of a space-time block code (STBC) encoder. Seven encoder very high-speed integrated circuit hardware description language (VHDL) designs have been coded, synthesised and tested. Each design realises a complex orthogonal space-time block code with a different transmission matrix. All VHDL designs are parameterisable in terms of sample precision. Precisions ranging from 4 bits to 32 bits have been synthesised. Alamouti's STBC encoder design [Alamouti, S.M. (1998), 'A Simple Transmit Diversity Technique for Wireless Communications', IEEE Journal on Selected Areas in Communications, 16:55-108.] proved to be the best trade-off, since it is on average 3.2 times smaller, 1.5 times faster and requires slightly less power than the next best trade-off in the comparison, which is a 3/4-rate full-diversity 3Tx-antenna STBC.
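
    For reference, the Alamouti scheme singled out above maps each pair of complex symbols onto two antennas over two symbol periods. A brief software sketch of the encoder (the paper's implementations are VHDL; this Python model is only illustrative):

        import numpy as np

        def alamouti_encode(symbols: np.ndarray) -> np.ndarray:
            """Map pairs (s1, s2) to the 2x2 Alamouti transmission matrix:
            period 1 sends (s1, s2); period 2 sends (-conj(s2), conj(s1))."""
            s = symbols.reshape(-1, 2)
            out = np.empty((s.shape[0] * 2, 2), dtype=complex)  # rows: time, cols: antenna
            out[0::2, 0], out[0::2, 1] = s[:, 0], s[:, 1]
            out[1::2, 0], out[1::2, 1] = -np.conj(s[:, 1]), np.conj(s[:, 0])
            return out

        qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
        print(alamouti_encode(qpsk))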

  7. 77 FR 37471 - National Automotive Sampling System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-21

    ... a comprehensive review of the National Automotive Sampling System (NASS) research design and data... comment on the current data elements, propose new data elements, make suggestions on the research design... should consider to improve crash data. Current NASS data elements, coding instructions, and descriptive...

  8. Aerodynamic design for improved maneuverability by use of three-dimensional transonic theory

    NASA Technical Reports Server (NTRS)

    Mann, M. J.; Campbell, R. L.; Ferris, J. C.

    1984-01-01

    Improvements in transonic maneuver performance by the use of three-dimensional transonic theory and a transonic design procedure were examined. The FLO-27 code of Jameson and Caughey was used to design a new wing for a fighter configuration with lower drag at transonic maneuver conditions. The wing airfoil sections were altered to reduce the upper-surface shock strength by means of a design procedure which is based on the iterative application of the FLO-27 code. The plan form of the fighter configuration was fixed and had a leading edge sweep of 45 deg and an aspect ratio of 3.28. Wind-tunnel tests were conducted on this configuration at Mach numbers from 0.60 to 0.95 and angles of attack from -2 deg to 17 deg. The transonic maneuver performance of this configuration was evaluated by comparison with a wing designed by empirical methods and a wing designed primarily by two-dimensional transonic theory. The configuration designed by the use of FLO-27 had the same or lower drag than the empirical wing and, for some conditions, lower drag than the two-dimensional design. For some maneuver conditions, the drag of the two-dimensional design was somewhat lower.

  9. Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.

    PubMed

    Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan

    2016-09-01

    Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Overall Traveling-Wave-Tube Efficiency Improved By Optimized Multistage Depressed Collector Design

    NASA Technical Reports Server (NTRS)

    Vaden, Karl R.

    2002-01-01

    The microwave traveling wave tube (TWT) is used widely for space communications and high-power airborne transmitting sources. One of the most important features in designing a TWT is overall efficiency. Yet, overall TWT efficiency is strongly dependent on the efficiency of the electron beam collector, particularly for high values of collector efficiency. For these reasons, the NASA Glenn Research Center developed an optimization algorithm based on simulated annealing to quickly design highly efficient multistage depressed collectors (MDC's). Simulated annealing is a strategy for solving highly nonlinear combinatorial optimization problems. Its major advantage over other methods is its ability to avoid becoming trapped in local minima. Simulated annealing is based on an analogy to statistical thermodynamics, specifically the physical process of annealing: heating a material to a temperature that permits many atomic rearrangements and then cooling it carefully and slowly, until it freezes into a strong, minimum-energy crystalline structure. This minimum energy crystal corresponds to the optimal solution of a mathematical optimization problem. The TWT used as a baseline for optimization was the 32-GHz, 10-W, helical TWT developed for the Cassini mission to Saturn. The method of collector analysis and design used was a 2-1/2-dimensional computational procedure that employs two types of codes, a large signal analysis code and an electron trajectory code. The large signal analysis code produces the spatial, energetic, and temporal distributions of the spent beam entering the MDC. An electron trajectory code uses the resultant data to perform the actual collector analysis. The MDC was optimized for maximum MDC efficiency and minimum final kinetic energy of all collected electrons (to reduce heat transfer). The optimized collector achieved an efficiency of 93.8 percent. The results show the improvement in collector efficiency from 89.7 to 93.8 percent, resulting in an increase of three overall efficiency points. In addition, the time to design a highly efficient MDC was reduced from a month to a few days. All work was done in-house at Glenn for the High Rate Data Delivery Program. Future plans include optimizing the MDC and TWT interaction circuit in tandem to further improve overall TWT efficiency.
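
    A generic simulated annealing loop of the kind described, shown minimizing a toy one-dimensional objective; in the actual work the objective is evaluated through the large-signal and electron trajectory codes, so everything below is a schematic stand-in:

        import math, random

        def simulated_annealing(objective, x0, step=0.1, t0=1.0, cooling=0.995,
                                iters=5000, seed=1):
            """Minimize objective(x) by accepting uphill moves with probability
            exp(-delta/T), which lets the search escape local minima."""
            rng = random.Random(seed)
            x, fx, temp = x0, objective(x0), t0
            for _ in range(iters):
                cand = x + rng.uniform(-step, step)
                fc = objective(cand)
                if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
                    x, fx = cand, fc
                temp *= cooling       # slow cooling, per the annealing analogy
            return x, fx

        # Toy multimodal objective standing in for (negative) collector efficiency
        best_x, best_f = simulated_annealing(
            lambda x: (x**2 - 1)**2 + 0.3 * math.sin(8 * x), 2.0)
        print(best_x, best_f)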

  11. On the decoding process in ternary error-correcting output codes.

    PubMed

    Escalera, Sergio; Pujol, Oriol; Radeva, Petia

    2010-01-01

    A common way to model multiclass classification problems is to design a set of binary classifiers and to combine them. Error-Correcting Output Codes (ECOC) represent a successful framework to deal with these types of problems. Recent works in the ECOC framework showed significant performance improvements by means of new problem-dependent designs based on the ternary ECOC framework. The ternary framework contains a larger set of binary problems because of the use of a "do not care" symbol that allows a given classifier to ignore some classes. However, there are no proper studies that analyze the effect of the new symbol at the decoding step. In this paper, we present a taxonomy that embeds all binary and ternary ECOC decoding strategies into four groups. We show that the zero symbol introduces two kinds of biases that require redefinition of the decoding design. A new type of decoding measure is proposed, and two novel decoding strategies are defined. We evaluate the state-of-the-art coding and decoding strategies over a set of UCI Machine Learning Repository data sets and on a real traffic sign categorization problem. The experimental results show that, following the new decoding strategies, the performance of the ECOC design is significantly improved.
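
    To make the ternary setting concrete, here is one simple decoding sketch in which each class codeword is over {-1, 0, +1} and the "do not care" (0) positions are excluded and the distance normalized, one plain way to counter the zero-symbol bias analyzed in the paper (the paper's proposed decoding measures are more refined than this):

        import numpy as np

        def ecoc_decode(predictions: np.ndarray, code_matrix: np.ndarray) -> int:
            """Pick the class whose codeword best matches the binary classifier
            outputs, ignoring 'do not care' (0) positions of each codeword and
            normalizing by the number of positions actually compared."""
            scores = []
            for codeword in code_matrix:            # one row per class
                active = codeword != 0
                disagree = predictions[active] != codeword[active]
                scores.append(disagree.mean())      # normalized Hamming distance
            return int(np.argmin(scores))

        # Ternary code matrix: 3 classes x 4 binary problems
        M = np.array([[ 1, -1,  0,  1],
                      [-1,  1,  1,  0],
                      [ 0,  0, -1, -1]])
        print(ecoc_decode(np.array([1, -1, -1, 1]), M))   # -> class 0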

  12. A Subsonic Aircraft Design Optimization With Neural Network and Regression Approximators

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.; Haller, William J.

    2004-01-01

    The Flight-Optimization-System (FLOPS) code encountered difficulty in analyzing a subsonic aircraft, and this limitation made design optimization problematic. The deficiencies were alleviated through use of neural network and regression approximations, and the insight gained from using these approximators is discussed in this paper. The FLOPS code is reviewed. Analysis models are developed and validated for each approximator. The regression method appears to hug the data points, while the neural network approximation follows a mean path. For an analysis cycle, the approximate model required milliseconds of central processing unit (CPU) time versus seconds for the FLOPS code. Performance of the approximators was satisfactory for aircraft analysis. A design optimization capability was created by coupling the derived analyzers to the optimization test bed CometBoards. The approximators were efficient reanalysis tools in the aircraft design optimization. Instability encountered in the FLOPS analyzer was eliminated. The convergence characteristics were improved for the design optimization. The CPU time required to calculate the optimum solution, measured in hours with the FLOPS code, was reduced to minutes with the neural network approximation and to seconds with the regression method. Generation of the approximators required the manipulation of a very large quantity of data. Design sensitivity with respect to the bounds of aircraft constraints is easily generated.

  13. New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽_{q^2}

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin

    2017-03-01

    This paper concentrates on the construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over the finite field 𝔽_{q^2} (q ≥ 3 an odd prime power). By a careful analysis of the properties of cyclotomic cosets in the defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given by Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus, families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in the previous literature.
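
    The cyclotomic cosets underlying such analyses are straightforward to enumerate. A generic sketch (the modulus and field size below are illustrative, not the paper's code lengths); for codes over 𝔽_{q^2}, cosets are taken under multiplication by q^2:

        def cyclotomic_coset(x: int, q2: int, n: int) -> frozenset[int]:
            """Cyclotomic coset of x modulo n under multiplication by q2
            (q2 = q**2 for codes over the field with q**2 elements)."""
            coset, y = set(), x % n
            while y not in coset:
                coset.add(y)
                y = (y * q2) % n
            return frozenset(coset)

        def all_cosets(q2: int, n: int) -> set[frozenset[int]]:
            return {cyclotomic_coset(x, q2, n) for x in range(n)}

        # Example: cosets of 9 = 3**2 modulo n = 40 (illustrative values only)
        for c in sorted(all_cosets(9, 40), key=min):
            print(sorted(c))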

  14. CFC (Comment-First-Coding)--A Simple yet Effective Method for Teaching Programming to Information Systems Students

    ERIC Educational Resources Information Center

    Sengupta, Arijit

    2009-01-01

    Programming courses have always been a difficult part of an Information Systems curriculum. While we do not train Information Systems students to be developers, understanding how to build a system always gives students an added perspective to improve their system design and analysis skills. This teaching tip presents CFC (Comment-First-Coding)--a…

  15. OpenEIS. Users Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Woohyun; Lutes, Robert G.; Katipamula, Srinivas

    This document is a users guide for OpenEIS, a software code designed to provide standard methods for authoring, sharing, testing, using and improving algorithms for operational building energy efficiency.

  16. Dual-camera design for coded aperture snapshot spectral imaging.

    PubMed

    Wang, Lizhi; Xiong, Zhiwei; Gao, Dahua; Shi, Guangming; Wu, Feng

    2015-02-01

    Coded aperture snapshot spectral imaging (CASSI) provides an efficient mechanism for recovering 3D spectral data from a single 2D measurement. However, since the reconstruction problem is severely underdetermined, the quality of recovered spectral data is usually limited. In this paper we propose a novel dual-camera design to improve the performance of CASSI while maintaining its snapshot advantage. Specifically, a beam splitter is placed in front of the objective lens of CASSI, which allows the same scene to be simultaneously captured by a grayscale camera. This uncoded grayscale measurement, in conjunction with the coded CASSI measurement, greatly eases the reconstruction problem and yields high-quality 3D spectral data. Both simulation and experimental results demonstrate the effectiveness of the proposed method.
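
    In rough terms, the two measurements can be modeled as a coded, spectrally sheared projection plus an uncoded panchromatic sum over wavelength. A toy forward-model sketch under those assumptions (the actual optical model and reconstruction method are in the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        H, W, L = 8, 8, 5                     # toy spatial size and band count
        cube = rng.random((H, W, L))          # unknown 3D spectral scene x

        mask = rng.integers(0, 2, (H, W))     # coded aperture pattern
        # CASSI-style measurement: code each band, shear by one pixel per band, sum
        coded = np.zeros((H, W + L - 1))
        for k in range(L):
            coded[:, k:k + W] += mask * cube[:, :, k]

        gray = cube.sum(axis=2)               # uncoded grayscale camera view

        # Reconstruction would invert {coded, gray} for `cube`; the extra uncoded
        # measurement adds H*W equations, easing the underdetermined inversion.
        print(coded.shape, gray.shape)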

  17. Alternative Formats to Achieve More Efficient Energy Codes for Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, David R.; Rosenberg, Michael I.; Halverson, Mark A.

    2013-01-26

    This paper identifies and examines several formats or structures that could be used to create the next generation of more efficient energy codes and standards for commercial buildings. Pacific Northwest National Laboratory (PNNL) is funded by the U.S. Department of Energy’s Building Energy Codes Program (BECP) to provide technical support to the development of ANSI/ASHRAE/IES Standard 90.1. While the majority of PNNL’s ASHRAE Standard 90.1 support focuses on developing and evaluating new requirements, a portion of its work involves consideration of the format of energy standards. In its current working plan, the ASHRAE 90.1 committee has approved an energy goal of 50% improvement in Standard 90.1-2013 relative to Standard 90.1-2004, and will likely be considering higher improvement targets for future versions of the standard. To cost-effectively achieve the 50% goal in a manner that can gain stakeholder consensus, formats other than prescriptive must be considered. Alternative formats that reduce the reliance on prescriptive requirements may make it easier to achieve these aggressive efficiency levels in new codes and standards. The focus on energy code and standard formats is meant to explore approaches to presenting the criteria that will foster compliance, enhance verification, and stimulate innovation while saving energy in buildings. New formats may also make it easier for building designers and owners to design and build to the levels of efficiency called for in the new codes and standards. This paper examines a number of potential formats and structures, including prescriptive, performance-based (with sub-formats of performance equivalency and performance targets), capacity constraint-based, and outcome-based. The paper also discusses the pros and cons of each format from the viewpoints of code users and code enforcers.

  18. New primary renal diagnosis codes for the ERA-EDTA

    PubMed Central

    Venkat-Raman, Gopalakrishnan; Tomson, Charles R.V.; Gao, Yongsheng; Cornet, Ronald; Stengel, Benedicte; Gronhagen-Riska, Carola; Reid, Chris; Jacquelinet, Christian; Schaeffner, Elke; Boeschoten, Els; Casino, Francesco; Collart, Frederic; De Meester, Johan; Zurriaga, Oscar; Kramar, Reinhard; Jager, Kitty J.; Simpson, Keith

    2012-01-01

    The European Renal Association-European Dialysis and Transplant Association (ERA-EDTA) Registry has produced a new set of primary renal diagnosis (PRD) codes that are intended for use by affiliated registries. It is designed specifically for use in renal centres and registries but is aligned with international coding standards supported by the WHO (International Classification of Diseases) and the International Health Terminology Standards Development Organization (SNOMED Clinical Terms). It is available as supplementary material to this paper and free on the internet for non-commercial, clinical, quality improvement and research use, and by agreement with the ERA-EDTA Registry for use by commercial organizations. Conversion between the old and the new PRD codes is possible. The new codes are very flexible and will be actively managed to keep them up-to-date and to ensure that renal medicine can remain at the forefront of the electronic revolution in medicine, epidemiology research and the use of decision support systems to improve the care of patients. PMID:23175621

  19. Improving our application of the health education code of ethics.

    PubMed

    Marks, Ray; Shive, Steven E

    2006-01-01

    The Health Education Code of Ethics was designed to provide a framework of shared values within which health education might be practiced. However, an informal survey conducted on a limited sample in November 2004 indicated that ethics and how to apply the code are topics not readily taught formally within all health education programs. There is, however, an expressed interest among health educators in understanding the code and its application. Because of the immense import of ethics, affecting responsible professional conduct at all levels, this article is designed to introduce the topic to health education practitioners who have had little formal exposure to ethics curricula, as well as to faculty who would like to teach this subject. The authors specifically review several resources that might be especially helpful in fostering a better understanding of this essential but often underestimated aspect of health education practice and research, namely, its ethical application.

  20. Finite element analysis of wirelessly interrogated implantable bio-MEMS

    NASA Astrophysics Data System (ADS)

    Dissanayake, Don W.; Al-Sarawi, Said F.; Lu, Tien-Fu; Abbott, Derek

    2008-12-01

    Wirelessly interrogated bio-MEMS devices are becoming more popular because of their potential to address many challenges, such as improving diagnosis, monitoring, and patient wellbeing. The authors present here a passive, low-power and small-area device that can be interrogated wirelessly using a uniquely coded signal for secure and reliable operation. The proposed new approach relies on converting the interrogating coded signal to a surface acoustic wave that is then correlated with an embedded code. The suggested method is implemented to operate a micropump, which consists of a specially designed corrugated microdiaphragm to modulate the fluid flow in microchannels. A finite element analysis of the micropump operation is presented and its performance is analysed. Design parameters of the diaphragm were fine-tuned for optimal performance, and different polymer-based materials were used in various parts of the micropump to allow for better flexibility and high reliability.
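
    The interrogation step described above amounts to correlating the received coded waveform against the code embedded in the device. A toy illustration of that matched-code check, with binary sequences standing in for the SAW-domain signals:

        import numpy as np

        embedded = np.array([1, -1, 1, 1, -1, -1, 1, -1])     # device's stored code

        def responds(received: np.ndarray, threshold=0.9) -> bool:
            """Actuate only if the normalized correlation peak of the received
            code against the embedded code is high enough (secure operation)."""
            peak = np.max(np.correlate(received, embedded, mode="full"))
            return peak / len(embedded) >= threshold

        print(responds(embedded))                  # matching code -> True
        print(responds(-embedded))                 # wrong code    -> False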

  1. Computer code for preliminary sizing analysis of axial-flow turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1992-01-01

    This mean-diameter flow analysis uses a stage-average velocity diagram as the basis for the computation of efficiency. Input design requirements include power or pressure ratio, flow rate, temperature, pressure, and rotative speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse) or for any specified stage swirl split. Exit turning vanes can be included in the design. The program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, flow angles, and last-stage absolute and relative Mach numbers. An analysis is presented along with a description of the computer program input and output, with sample cases. The analysis and code presented herein are modifications of those described in NASA-TN-D-6702. These modifications improve modeling rigor and extend code applicability.

  2. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction

    PubMed Central

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems. PMID:27814367
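
    The hierarchical coding idea can be pictured as assigning a lower quantization parameter (finer quality) to coding tree units that overlap the detected ROI. A simplified sketch of such an assignment (the paper's method additionally adjusts transform coefficients inside the HM15.0 encoder; the block size and QP offset below are assumptions):

        import numpy as np

        def assign_qp(roi_mask: np.ndarray, base_qp=32, roi_offset=-6, ctu=64):
            """Give each CTU a QP: base_qp outside the ROI, base_qp + roi_offset
            (finer quantization) where the CTU overlaps the diagnostic ROI."""
            h, w = roi_mask.shape
            rows, cols = (h + ctu - 1) // ctu, (w + ctu - 1) // ctu
            qp_map = np.full((rows, cols), base_qp, dtype=int)
            for r in range(rows):
                for c in range(cols):
                    block = roi_mask[r*ctu:(r+1)*ctu, c*ctu:(c+1)*ctu]
                    if block.any():
                        qp_map[r, c] = base_qp + roi_offset
            return qp_map

        mask = np.zeros((256, 320), dtype=bool)
        mask[80:180, 100:240] = True          # toy diagnostic region
        print(assign_qp(mask))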

  3. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    PubMed

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to implement the otherness encoding for ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lanier, Nicholas Edward

    We have completed implementation of a laser package in LANL's principal AGEX design code, Cassio. Although we have greatly improved our target characterization and uncertainty quantification, we remain unable to satisfactorily simulate the NIF Pleiades data.

  5. Context influences on TALE–DNA binding revealed by quantitative profiling

    PubMed Central

    Rogers, Julia M.; Barrera, Luis A.; Reyon, Deepak; Sander, Jeffry D.; Kellis, Manolis; Joung, J Keith; Bulyk, Martha L.

    2015-01-01

    Transcription activator-like effector (TALE) proteins recognize DNA using a seemingly simple DNA-binding code, which makes them attractive for use in genome engineering technologies that require precise targeting. Although this code is used successfully to design TALEs to target specific sequences, off-target binding has been observed and is difficult to predict. Here we explore TALE–DNA interactions comprehensively by quantitatively assaying the DNA-binding specificities of 21 representative TALEs to ∼5,000–20,000 unique DNA sequences per protein using custom-designed protein-binding microarrays (PBMs). We find that protein context features exert significant influences on binding. Thus, the canonical recognition code does not fully capture the complexity of TALE–DNA binding. We used the PBM data to develop a computational model, Specificity Inference For TAL-Effector Design (SIFTED), to predict the DNA-binding specificity of any TALE. We provide SIFTED as a publicly available web tool that predicts potential genomic off-target sites for improved TALE design. PMID:26067805

  6. Context influences on TALE-DNA binding revealed by quantitative profiling.

    PubMed

    Rogers, Julia M; Barrera, Luis A; Reyon, Deepak; Sander, Jeffry D; Kellis, Manolis; Joung, J Keith; Bulyk, Martha L

    2015-06-11

    Transcription activator-like effector (TALE) proteins recognize DNA using a seemingly simple DNA-binding code, which makes them attractive for use in genome engineering technologies that require precise targeting. Although this code is used successfully to design TALEs to target specific sequences, off-target binding has been observed and is difficult to predict. Here we explore TALE-DNA interactions comprehensively by quantitatively assaying the DNA-binding specificities of 21 representative TALEs to ∼5,000-20,000 unique DNA sequences per protein using custom-designed protein-binding microarrays (PBMs). We find that protein context features exert significant influences on binding. Thus, the canonical recognition code does not fully capture the complexity of TALE-DNA binding. We used the PBM data to develop a computational model, Specificity Inference For TAL-Effector Design (SIFTED), to predict the DNA-binding specificity of any TALE. We provide SIFTED as a publicly available web tool that predicts potential genomic off-target sites for improved TALE design.

  7. An address geocoding method for improving rural spatial information infrastructure

    NASA Astrophysics Data System (ADS)

    Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing

    2010-11-01

    The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by specific departments to satisfy their own needs, with little consideration of wider potential uses. This has led to great differences in data format, semantics, and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on postal codes is stable and easy to memorize, two-dimensional coding based on direction and distance is easy to locate and memorize, and the extended code can enhance the extensibility and flexibility of address geocoding.

  8. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    NASA Astrophysics Data System (ADS)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work is aimed at improvement of HTGR neutron physics design calculations by application of uncertainty analysis with the use of cross-section covariance information. Methodology and codes for preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of SCALE-6 code system were developed. A 69-group library of covariance information in a special format for main isotopes and elements typical for high temperature gas cooled reactors (HTGR) was generated. This library can be used for estimation of uncertainties, associated with nuclear data, in analysis of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model were performed. These uncertainties were estimated by the developed technology with the use of WIMS-D code and modules of SCALE-6 code system, namely, by TSUNAMI, KENO-VI and SAMS. Eight most important reactions on isotopes for MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).
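
    The uncertainty propagation performed by tools such as TSUNAMI/SAMS follows the standard "sandwich rule", relative variance = S^T C S, where S is the sensitivity vector and C the cross-section covariance matrix. A minimal numeric sketch with toy values (not the benchmark's data):

        import numpy as np

        # Toy sensitivities of k-infinity to three nuclear-data parameters
        S = np.array([0.9, -0.3, 0.15])

        # Toy relative covariance matrix for those parameters
        C = np.array([[4.0e-4, 1.0e-5, 0.0],
                      [1.0e-5, 2.5e-4, 0.0],
                      [0.0,    0.0,    1.0e-4]])

        rel_variance = S @ C @ S           # the "sandwich rule"
        print(f"relative k uncertainty: {np.sqrt(rel_variance):.4%}")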

  9. On optimal designs of transparent WDM networks with 1 + 1 protection leveraged by all-optical XOR network coding schemes

    NASA Astrophysics Data System (ADS)

    Dao, Thanh Hai

    2018-01-01

    Network coding techniques are seen as a new dimension for improving network performance thanks to their capability of utilizing network resources more efficiently. Indeed, the application of network coding to failure recovery in optical networks marks a major departure from traditional protection schemes, as it can potentially achieve both rapid recovery and capacity improvement, challenging the prevailing wisdom of trading capacity efficiency for recovery speed and vice versa. In this context, the maturing of all-optical XOR technologies appears to be a good match for the need for more efficient protection in transparent optical networks. In addressing this opportunity, we propose to use practical all-optical XOR network coding to leverage the conventional 1 + 1 optical path protection in transparent WDM optical networks. The network-coding-assisted protection solution combines the protection flows of two demands sharing the same destination node in supportive conditions, paving the way for reducing the backup capacity. A novel mathematical model taking into account the operation of the new protection scheme for optimal network design is formulated as an integer linear program. Numerical results based on extensive simulations on realistic topologies, the COST239 and NSFNET networks, are presented to highlight the benefits of our proposal compared to the conventional approach in terms of wavelength resource efficiency and network throughput.
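
    The essence of the XOR-assisted protection can be shown on bit streams: two demands sharing a destination carry one XOR-combined protection flow instead of two, and the destination recovers a failed working signal from the backup and the surviving signal. A toy sketch that abstracts away the all-optical implementation:

        import numpy as np

        a = np.random.randint(0, 2, 16)        # working signal of demand A
        b = np.random.randint(0, 2, 16)        # working signal of demand B

        protection = a ^ b                     # single shared backup flow

        # Failure on demand A's working path: recover A from the backup and B
        recovered_a = protection ^ b
        assert (recovered_a == a).all()
        print("demand A recovered from the XOR protection flow")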

  10. Designing and Assessing Learning

    ERIC Educational Resources Information Center

    Quan, Hong; Liu, Dandan; Cun, Xiangqin; Lu, Yingchun

    2009-01-01

    This paper analyses the design, implementation and assessment of a level 2 module for non-English major students in higher vocational and professional education. 1132001 is the code of a module that uses active methods to teach college English in China. It specifically reflects on the module's advantages and shortcomings for developing and improving learning…

  11. A Spherical Active Coded Aperture for 4π Gamma-ray Imaging

    DOE PAGES

    Hellfeld, Daniel; Barton, Paul; Gunter, Donald; ...

    2017-09-22

    Gamma-ray imaging facilitates the efficient detection, characterization, and localization of compact radioactive sources in cluttered environments. Fieldable detector systems employing active planar coded apertures have demonstrated broad energy sensitivity via both coded aperture and Compton imaging modalities. However, planar configurations suffer from a limited field-of-view, especially in the coded aperture mode. In order to improve upon this limitation, we introduce a novel design by rearranging the detectors into an active coded spherical configuration, resulting in a 4pi isotropic field-of-view for both coded aperture and Compton imaging. This work focuses on the low-energy coded aperture modality and the optimization techniques used to determine the optimal number and configuration of 1 cm^3 CdZnTe coplanar grid detectors on a 14 cm diameter sphere with 192 available detector locations.

  12. A good performance watermarking LDPC code used in high-speed optical fiber communication system

    NASA Astrophysics Data System (ADS)

    Zhang, Wenbo; Li, Chao; Zhang, Xiaoguang; Xi, Lixia; Tang, Xianfeng; He, Wenxue

    2015-07-01

    A watermarking LDPC code, a strategy designed to improve the performance of the traditional LDPC code, is introduced. By inserting pre-defined watermarking bits into the original LDPC code, we can obtain a more accurate estimate of the noise level in the fiber channel. These bits are then used to modify the probability distribution function (PDF) used in the initialization of the belief propagation (BP) decoding algorithm. This algorithm was tested in a 128 Gb/s PDM-DQPSK optical communication system, and the results showed that the watermarking LDPC code had better tolerance to polarization mode dispersion (PMD) and nonlinearity than the traditional LDPC code. Also, at the cost of about 2.4% of redundancy for the watermarking bits, the decoding efficiency of the watermarking LDPC code is about twice that of the traditional one.
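
    A sketch of the watermarking idea: because some transmitted bits are known in advance, the receiver can estimate the channel noise level from them and feed that estimate into the log-likelihood ratios that initialize BP decoding. The BPSK/AWGN model below is an illustrative assumption (the paper works with a 128 Gb/s PDM-DQPSK system):

        import numpy as np

        def init_llrs(received: np.ndarray, wm_positions: np.ndarray,
                      wm_bits: np.ndarray) -> np.ndarray:
            """Estimate the AWGN sigma from known watermark bits (BPSK: bit b is
            sent as 1 - 2b) and form channel LLRs 2r/sigma^2 for BP decoding."""
            expected = 1.0 - 2.0 * wm_bits
            sigma2 = np.mean((received[wm_positions] - expected) ** 2)
            return 2.0 * received / sigma2

        rng = np.random.default_rng(0)
        bits = rng.integers(0, 2, 1024)
        bits[::42] = 0                          # pre-defined watermark bits
        r = (1 - 2 * bits) + rng.normal(0, 0.5, bits.size)
        llrs = init_llrs(r, np.arange(0, 1024, 42), bits[::42])
        print(llrs[:5])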

  13. A long constraint length VLSI Viterbi decoder for the DSN

    NASA Technical Reports Server (NTRS)

    Statman, J. I.; Zimmerman, G.; Pollara, F.; Collins, O.

    1988-01-01

    A Viterbi decoder, capable of decoding convolutional codes with constraint lengths up to 15, is under development for the Deep Space Network (DSN). The objective is to complete a prototype of this decoder by late 1990, and demonstrate its performance using the (15, 1/4) encoder in Galileo. The decoder is expected to provide 1 to 2 dB improvement in bit SNR, compared to the present (7, 1/2) code and existing Maximum Likelihood Convolutional Decoder (MCD). The decoder will be fully programmable for any code up to constraint length 15, and code rate 1/2 to 1/6. The decoder architecture and top-level design are described.

  14. AN EXTENSION OF THE ATHENA++ CODE FRAMEWORK FOR GRMHD BASED ON ADVANCED RIEMANN SOLVERS AND STAGGERED-MESH CONSTRAINED TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Christopher J.; Stone, James M.; Gammie, Charles F.

    2016-08-01

    We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellfeld, Daniel; Barton, Paul; Gunter, Donald

    Gamma-ray imaging facilitates the efficient detection, characterization, and localization of compact radioactive sources in cluttered environments. Fieldable detector systems employing active planar coded apertures have demonstrated broad energy sensitivity via both coded aperture and Compton imaging modalities. However, planar configurations suffer from a limited field-of-view, especially in the coded aperture mode. In order to improve upon this limitation, we introduce a novel design by rearranging the detectors into an active coded spherical configuration, resulting in a 4pi isotropic field-of-view for both coded aperture and Compton imaging. This work focuses on the low-energy coded aperture modality and the optimization techniques used to determine the optimal number and configuration of 1 cm^3 CdZnTe coplanar grid detectors on a 14 cm diameter sphere with 192 available detector locations.

  16. A Clustering-Based Approach to Enriching Code Foraging Environment.

    PubMed

    Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu

    2016-09-01

    Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base for developers. This paper contributes a unified code navigation theory in light of the optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developer's behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.

  17. Euler Technology Assessment for Preliminary Aircraft Design: Compressibility Predictions by Employing the Cartesian Unstructured Grid SPLITFLOW Code

    NASA Technical Reports Server (NTRS)

    Finley, Dennis B.; Karman, Steve L., Jr.

    1996-01-01

    The objective of the second phase of the Euler Technology Assessment program was to evaluate the ability of Euler computational fluid dynamics codes to predict compressible flow effects over a generic fighter wind tunnel model. This portion of the study was conducted by Lockheed Martin Tactical Aircraft Systems, using an in-house Cartesian-grid code called SPLITFLOW. The Cartesian grid technique offers several advantages, including ease of volume grid generation and reduced number of cells compared to other grid schemes. SPLITFLOW also includes grid adaption of the volume grid during the solution to resolve high-gradient regions. The SPLITFLOW code predictions of configuration forces and moments are shown to be adequate for preliminary design, including predictions of sideslip effects and the effects of geometry variations at low and high angles-of-attack. The transonic pressure prediction capabilities of SPLITFLOW are shown to be improved over subsonic comparisons. The time required to generate the results from initial surface data is on the order of several hours, including grid generation, which is compatible with the needs of the design environment.

  18. Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Wieting, A. R.

    1979-01-01

    The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.

  19. Demonstration of Automatically-Generated Adjoint Code for Use in Aerodynamic Shape Optimization

    NASA Technical Reports Server (NTRS)

    Green, Lawrence; Carle, Alan; Fagan, Mike

    1999-01-01

    Gradient-based optimization requires accurate derivatives of the objective function and constraints. These gradients may have previously been obtained by manual differentiation of analysis codes, symbolic manipulators, finite-difference approximations, or existing automatic differentiation (AD) tools such as ADIFOR (Automatic Differentiation in FORTRAN). Each of these methods has certain deficiencies, particularly when applied to complex, coupled analyses with many design variables. Recently, a new AD tool called ADJIFOR (Automatic Adjoint Generation in FORTRAN), based upon ADIFOR, was developed and demonstrated. Whereas ADIFOR implements forward-mode (direct) differentiation throughout an analysis program to obtain exact derivatives via the chain rule of calculus, ADJIFOR implements the reverse-mode counterpart of the chain rule to obtain exact adjoint form derivatives from FORTRAN code. Automatically-generated adjoint versions of the widely-used CFL3D computational fluid dynamics (CFD) code and an algebraic wing grid generation code were obtained with just a few hours processing time using the ADJIFOR tool. The codes were verified for accuracy and were shown to compute the exact gradient of the wing lift-to-drag ratio, with respect to any number of shape parameters, in about the time required for 7 to 20 function evaluations. The codes have now been executed on various computers with typical memory and disk space for problems with up to 129 x 65 x 33 grid points, and for hundreds to thousands of independent variables. These adjoint codes are now used in a gradient-based aerodynamic shape optimization problem for a swept, tapered wing. For each design iteration, the optimization package constructs an approximate, linear optimization problem based upon the current objective function, constraints, and gradient values. The optimizer subroutines are called within a design loop employing the approximate linear problem until an optimum shape is found, the design loop limit is reached, or no further design improvement is possible due to active design variable bounds and/or constraints. The resulting shape parameters are then used by the grid generation code to define a new wing surface and computational grid. The lift-to-drag ratio and its gradient are computed for the new design by the automatically-generated adjoint codes. Several optimization iterations may be required to find an optimum wing shape. Results from two sample cases will be discussed. The reader should note that this work primarily represents a demonstration of the use of automatically-generated adjoint code within an aerodynamic shape optimization. As such, little significance is placed upon the actual optimization results, relative to the method for obtaining the results.
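
    A hedged, minimal sketch of the reverse-mode (adjoint) chain rule that ADJIFOR applies to whole FORTRAN programs: one backward sweep over the computation tape yields derivatives with respect to every input at a cost comparable to a few function evaluations, which is why the gradient cost quoted above is independent of the number of shape parameters. The tiny operator set below is illustrative, not the ADJIFOR implementation.

      import math

      class Var:
          """One tape node: a value plus (parent, local-derivative) edges."""
          def __init__(self, value, parents=()):
              self.value, self.parents, self.adj = value, list(parents), 0.0
          def __add__(self, other):
              return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])
          def __mul__(self, other):
              return Var(self.value * other.value,
                         [(self, other.value), (other, self.value)])
          def sin(self):
              return Var(math.sin(self.value), [(self, math.cos(self.value))])

      def gradient(output, inputs):
          """Topologically order the tape, then push adjoints backward once."""
          order, seen = [], set()
          def visit(node):
              if id(node) not in seen:
                  seen.add(id(node))
                  for parent, _ in node.parents:
                      visit(parent)
                  order.append(node)
          visit(output)
          output.adj = 1.0
          for node in reversed(order):               # single backward pass
              for parent, local_deriv in node.parents:
                  parent.adj += local_deriv * node.adj   # adjoint chain rule
          return [x.adj for x in inputs]

      x, y = Var(2.0), Var(3.0)
      f = x * y + x.sin()                            # f = x*y + sin(x)
      print(gradient(f, [x, y]))                     # [y + cos(x), x] ~ [2.584, 2.0]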

  20. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  1. Heuristic rules embedded genetic algorithm for in-core fuel management optimization

    NASA Astrophysics Data System (ADS)

    Alim, Fatih

    The objective of this study was to develop a unique methodology and a practical tool for designing the loading pattern (LP) and burnable poison (BP) pattern for a given Pressurized Water Reactor (PWR) core. Because of the large number of possible combinations for the fuel assembly (FA) loading in the core, the design of the core configuration is a complex optimization problem. It requires finding an optimal FA arrangement and BP placement in order to achieve maximum cycle length while satisfying the safety constraints. Genetic Algorithms (GA) have already been used to solve the LP optimization problem for both PWRs and Boiling Water Reactors (BWRs). The GA, which is a stochastic method, works with a group of solutions and uses random variables to make decisions. Based on the theory of evolution, the GA involves natural selection and reproduction of the individuals in the population for the next generation. The GA works by creating an initial population, evaluating it, and then improving the population by using the evolution operators. To solve this optimization problem, an LP optimization package, the GARCO (Genetic Algorithm Reactor Code Optimization) code, is developed in the framework of this thesis. This code is applicable to all types of PWR cores having different geometries and structures with an unlimited number of FA types in the inventory. To reach this goal, an innovative GA is developed by modifying the classical representation of the genotype. To obtain the best result in a shorter time, not only is the representation changed but the algorithm is also modified to use in-core fuel management heuristic rules, as sketched below. The improved GA code was tested to demonstrate and verify the advantages of the new enhancements. The developed methodology is explained in this thesis and preliminary results are shown for the VVER-1000 hexagonal-geometry core and the TMI-1 PWR. The core physics code used for the VVER in this research is Moby-Dick, which was developed by SKODA Inc. to analyze the VVER. The SIMULATE-3 code, an advanced two-group nodal code, is used to analyze the TMI-1.
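
    A hedged, minimal sketch of the permutation-encoded GA idea behind GARCO: individuals are orderings of fuel assemblies over core positions, order crossover keeps children valid, and a heuristic rule (low-reactivity assemblies toward the core center) seeds the population. The fitness function is a toy stand-in for a core physics code such as SIMULATE-3; all parameters are illustrative.

      import random

      random.seed(1)
      N = 12                                    # core positions / fuel assemblies
      reactivity = [random.uniform(0.8, 1.2) for _ in range(N)]

      def fitness(perm):
          # Toy stand-in for a core simulator: reward placing reactive
          # assemblies toward the periphery while flattening neighbor jumps.
          score = sum(reactivity[a] * pos for pos, a in enumerate(perm))
          penalty = sum(abs(reactivity[perm[i]] - reactivity[perm[i + 1]])
                        for i in range(N - 1))
          return score - 2.0 * penalty

      def heuristic_seed():
          # In-core heuristic rule: order by reactivity, lowest in the center.
          return sorted(range(N), key=lambda a: reactivity[a])

      def order_crossover(p1, p2):
          # Order crossover (OX) keeps every child a valid permutation.
          i, j = sorted(random.sample(range(N), 2))
          child = [None] * N
          child[i:j] = p1[i:j]
          rest = [a for a in p2 if a not in child]
          for k in range(N):
              if child[k] is None:
                  child[k] = rest.pop(0)
          return child

      def mutate(perm):
          # Swap mutation: exchange two fuel assembly positions.
          i, j = random.sample(range(N), 2)
          perm[i], perm[j] = perm[j], perm[i]

      pop = [heuristic_seed()] + [random.sample(range(N), N) for _ in range(29)]
      for generation in range(100):
          pop.sort(key=fitness, reverse=True)
          elite = pop[:10]                      # selection
          pop = elite + [order_crossover(*random.sample(elite, 2))
                         for _ in range(20)]    # reproduction
          for ind in pop[10:]:
              if random.random() < 0.3:
                  mutate(ind)

      best = max(pop, key=fitness)
      print("best loading pattern:", best, "fitness:", round(fitness(best), 3))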

  2. Future capabilities for the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Berner, J. B.; Bryant, S. H.; Andrews, K. S.

    2004-01-01

    This paper will look at three new capabilities that are in different stages of development. First, turbo decoding, which provides improved telemetry performance for data rates up to about 1 Mbps, will be discussed. Next, pseudo-noise ranging will be presented. Pseudo-noise ranging has several advantages over the current sequential ranging, namely easier operations, improved performance, and the capability to be used in a regenerative implementation on a spacecraft. Finally, Low Density Parity Check (LDPC) decoding will be discussed. LDPC codes can provide performance that matches or slightly exceeds that of turbo codes, but are designed for use in the 10 Mbps range.
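
    A hedged sketch of the pseudo-noise ranging idea: a maximal-length PN chip sequence is transmitted, and the round-trip delay is read off the peak of the correlation between the received signal and cyclic shifts of the local replica (a regenerative transponder would regenerate a clean sequence on board, preserving this correlation gain). The LFSR polynomial, noise level, and delay below are illustrative, not the DSN ranging-code design.

      import numpy as np

      def lfsr_sequence(length=127):
          """+/-1 maximal-length PN sequence from a 7-bit LFSR (taps 7 and 1,
          i.e., the primitive polynomial x^7 + x + 1; period 2^7 - 1 = 127)."""
          state = [1] * 7
          out = []
          for _ in range(length):
              out.append(1.0 if state[-1] else -1.0)
              fb = state[6] ^ state[0]           # feedback from taps 7 and 1
              state = [fb] + state[:-1]
          return np.array(out)

      pn = lfsr_sequence()
      delay = 42                                 # "unknown" round-trip delay (chips)
      rx = np.roll(pn, delay) + 0.5 * np.random.default_rng(0).normal(size=pn.size)

      # Correlate against all cyclic shifts of the local replica.
      corr = np.array([np.dot(rx, np.roll(pn, s)) for s in range(pn.size)])
      print("estimated delay:", int(np.argmax(corr)), "chips")   # 42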

  3. Bilayer Protograph Codes for Half-Duplex Relay Channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; VanNguyen, Thuy; Nosratinia, Aria

    2013-01-01

    Direct to Earth return links are limited by the size and power of lander devices. A standard alternative is provided by a two-hop return link: a proximity link (from lander to orbiter relay) and a deep-space link (from orbiter relay to Earth). Although direct to Earth return links are limited by the size and power of lander devices, using an additional link and a proposed coding for relay channels, one can obtain a more reliable signal. Although significant progress has been made in the relay coding problem, existing codes must be painstakingly optimized to match a single set of channel conditions, many of them do not offer easy encoding, and most of them do not have structured design. A high-performing LDPC (low-density parity-check) code for the relay channel addresses two important issues simultaneously: a code structure that allows low encoding complexity, and a flexible rate-compatible code that allows matching to various channel conditions. Most of the previous high-performance LDPC codes for the relay channel are tightly optimized for a given channel quality, and are not easily adapted without extensive re-optimization for various channel conditions. This code for the relay channel combines structured design and easy encoding with rate compatibility to allow adaptation to the three links involved in the relay channel, and furthermore offers very good performance. The proposed code is constructed by synthesizing a bilayer structure with a protograph. In addition to the contribution to relay encoding, an improved family of protograph codes was produced for the point-to-point AWGN (additive white Gaussian noise) channel whose high-rate members enjoy thresholds that are within 0.07 dB of capacity. These LDPC relay codes address three important issues in an integrative manner: low encoding complexity, modular structure allowing for easy design, and rate compatibility so that the code can be easily matched to a variety of channel conditions without extensive re-optimization. The main problem of half-duplex relay coding can be reduced to the simultaneous design of two codes at two rates and two SNRs (signal-to-noise ratios), such that one is a subset of the other. This problem can be addressed by forceful optimization, but a clever method of addressing it is via the bilayer lengthened (BL) LDPC structure. This method uses a bilayer Tanner graph to make the two codes while using a concept of "parity forwarding" with subsequent successive decoding that removes the need to directly address the issue of uneven SNRs among the symbols of a given codeword. This method is attractive in that it addresses some of the main issues in the design of relay codes, but it does not by itself give rise to highly structured codes with simple encoding, nor does it give rate-compatible codes. The main contribution of this work is to construct a class of codes that simultaneously possess a bilayer parity-forwarding mechanism, while also benefiting from the properties of protograph codes: easy encoding, a modular design, and rate compatibility.

  4. The development of a program analysis environment for Ada: Reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1991-01-01

    The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) has successfully created and prototyped a new algorithm level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSD from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.

  5. Computational Fluid Dynamics (CFD) Design of a Blended Wing Body (BWB) with Boundary Layer Ingestion (BLI) Nacelles

    NASA Technical Reports Server (NTRS)

    Morehouse, Melissa B.

    2001-01-01

    A study is being conducted to improve the propulsion/airframe integration for the Blended Wing-Body (BWB) configuration with boundary layer ingestion nacelles. Two unstructured grid flow solvers, USM3D and FUN3D, have been coupled with different design methods and are being used to redesign the aft wing region and the nacelles to reduce drag and flow separation. An initial study comparing analyses from these two flow solvers against data from a wind tunnel test, as well as predictions from the OVERFLOW structured grid code, for a BWB without nacelles has been completed. Results indicate that the unstructured grid codes are sufficiently accurate for use in design. Results from the BWB design study will be presented.

  6. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  7. Design and performance investigation of LDPC-coded upstream transmission systems in IM/DD OFDM-PONs

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoxue; Guo, Lei; Wu, Jingjing; Ning, Zhaolong

    2016-12-01

    In Intensity-Modulation Direct-Detection (IM/DD) Orthogonal Frequency Division Multiplexing Passive Optical Networks (OFDM-PONs), aside from Subcarrier-to-Subcarrier Intermixing Interferences (SSII) induced by square-law detection, the use of the same laser frequency for data sent from Optical Network Units (ONUs) results in ONU-to-ONU Beating Interferences (OOBI) at the receiver. To mitigate these interferences, we design a Low-Density Parity-Check (LDPC)-coded and spectrum-efficient upstream transmission system. A theoretical channel model is also derived in order to analyze the detrimental factors influencing system performance. Simulation results demonstrate that the receiver sensitivity is improved by 3.4 dB and 2.5 dB under QPSK and 8QAM, respectively, after 100 km Standard Single-Mode Fiber (SSMF) transmission. Furthermore, the spectrum efficiency can be improved by about 50%.

  8. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also support the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  9. Context-aware and locality-constrained coding for image categorization.

    PubMed

    Xiao, Wenhua; Wang, Bin; Liu, Yu; Bao, Weidong; Zhang, Maojun

    2014-01-01

    Improving the coding strategy for BOF (Bag-of-Features) based feature design has drawn increasing attention in recent image categorization work. However, ambiguity in the coding procedure still impedes its further development. In this paper, we introduce a Context-Aware and Locality-Constrained Coding (CALC) approach that uses context information to describe objects in a discriminative way. This is achieved by learning a word-to-word cooccurrence prior and imposing context information over locality-constrained coding. Firstly, the local context of each category is evaluated by learning a word-to-word cooccurrence matrix representing the spatial distribution of local features in a neighboring region. Then, the learned cooccurrence matrix is used for measuring the context distance between local features and code words. Finally, a coding strategy that simultaneously considers locality in feature space and context space, while introducing feature weighting, is proposed. This novel coding strategy not only semantically preserves information in coding, but also has the ability to alleviate the noise distortion of each class. Extensive experiments on several available datasets (Scene-15, Caltech101, and Caltech256) are conducted to validate the superiority of our algorithm by comparing it with baselines and recently published methods. Experimental results show that our method significantly improves the performance of baselines and achieves comparable or even better performance than the state of the art.
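
    A hedged sketch of the locality-constrained coding step that CALC builds on: each descriptor is coded over its k nearest codebook words by solving a small constrained least-squares system (CALC would replace the plain Euclidean distance used here with its learned context distance). The codebook, features, and regularizer are random or illustrative stand-ins.

      import numpy as np

      rng = np.random.default_rng(0)
      codebook = rng.normal(size=(64, 16))     # 64 visual words, 16-D features
      features = rng.normal(size=(10, 16))     # 10 local descriptors
      k = 5

      def llc_code(x, B, k):
          # 1. Locality: keep the k codewords nearest to x (CALC would rank
          #    them by a context-aware distance instead of Euclidean).
          idx = np.argsort(np.linalg.norm(B - x, axis=1))[:k]
          Bk = B[idx] - x                       # shifted local basis
          # 2. Solve min ||x - c^T B_k||^2 s.t. sum(c) = 1 via the small
          #    regularized k x k Gram system (the analytic LLC solution).
          G = Bk @ Bk.T + 1e-4 * np.eye(k)
          c = np.linalg.solve(G, np.ones(k))
          c /= c.sum()                          # enforce the sum-to-one constraint
          code = np.zeros(B.shape[0])
          code[idx] = c
          return code

      codes = np.array([llc_code(x, codebook, k) for x in features])
      print(codes.shape, codes.sum(axis=1))     # (10, 64); each row sums to 1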

  10. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  11. Equivalent plate modeling for conceptual design of aircraft wing structures

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.

    1995-01-01

    This paper describes an analysis method that generates conceptual-level design data for aircraft wing structures. A key requirement is that this data must be produced in a timely manner so that it can be used effectively by multidisciplinary synthesis codes for performing systems studies. Such a capability is being developed by enhancing an equivalent plate structural analysis computer code to provide a more comprehensive, robust and user-friendly analysis tool. The paper focuses on recent enhancements to the Equivalent Laminated Plate Solution (ELAPS) analysis code that significantly expand the modeling capability and improve the accuracy of results. Modeling additions include use of out-of-plane plate segments for representing winglets and advanced wing concepts such as C-wings along with a new capability for modeling the internal rib and spar structure. The accuracy of calculated results is improved by including transverse shear effects in the formulation and by using multiple sets of assumed displacement functions in the analysis. Typical results are presented to demonstrate these new features. Example configurations include a C-wing transport aircraft, a representative fighter wing and a blended-wing-body transport. These applications are intended to demonstrate and quantify the benefits of using equivalent plate modeling of wing structures during conceptual design.

  12. Numerical Assessment of Four-Port Through-Flow Wave Rotor Cycles with Passage Height Variation

    NASA Technical Reports Server (NTRS)

    Paxson, D. E.; Lindau, Jules W.

    1997-01-01

    The potential for improved performance of wave rotor cycles through the use of passage height variation is examined. A quasi-one-dimensional CFD code with experimentally validated loss models is used to determine the flowfield in the wave rotor passages. Results indicate that a carefully chosen passage height profile can produce substantial performance gains. Numerical performance data are presented for a specific profile, in a four-port, through-flow cycle design that yielded a computed 4.6% increase in design point pressure ratio over a comparably sized rotor with constant passage height. In a small gas turbine topping cycle application, this increased pressure ratio would reduce specific fuel consumption to 22% below the un-topped engine; a significant improvement over the already impressive 18% reductions predicted for the constant passage height rotor. The simulation code is briefly described. The method used to obtain rotor passage height profiles with enhanced performance is presented. Design and off-design results are shown using two different computational techniques. The paper concludes with some recommendations for further work.

  13. The application of coded excitation technology in medical ultrasonic Doppler imaging

    NASA Astrophysics Data System (ADS)

    Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin

    2008-03-01

    Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. The application of coded excitation technology in a medical ultrasonic Doppler imaging system offers the potential for higher SNR and deeper penetration depth than a conventional pulse-echo imaging system; it also improves image quality and enhances the sensitivity to weak signals, and a properly chosen code benefits the received spectrum of the Doppler signal. Firstly, this paper analyzes the application of coded excitation technology in medical ultrasonic Doppler imaging systems, showing the advantages and promise of coded excitation, and then introduces the principle and theory of coded excitation. Secondly, we compare several code sequences (including chirp and pseudo-chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and the sensitivity of the Doppler signal, we choose Barker codes as the excitation sequence. Finally, we design the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantages of coded excitation technology in a digital medical ultrasonic Doppler endoscope imaging system.
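
    A hedged sketch of the pulse-compression payoff from the length-13 Barker code chosen above: matched filtering the received echo against the transmitted code compresses a long, high-energy burst into a sharp peak with low range sidelobes, which is where the SNR and penetration-depth gains come from. The echo amplitude, delay, and noise level are illustrative.

      import numpy as np

      barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], float)

      rng = np.random.default_rng(0)
      rx = np.zeros(200)
      rx[60:60 + 13] = 0.3 * barker13          # weak echo buried at sample 60
      rx += 0.1 * rng.normal(size=rx.size)     # receiver noise

      compressed = np.correlate(rx, barker13, mode="valid")   # matched filter
      print("echo found at sample", int(np.argmax(np.abs(compressed))))  # ~60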

  14. Improvements to a method for the geometrically nonlinear analysis of compressively loaded stiffened composite panels

    NASA Technical Reports Server (NTRS)

    Stoll, Frederick

    1993-01-01

    The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.

  15. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL’s modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s and TRACE’s capabilities and extends their analysis capabilities to all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  16. Accuracy and Completeness of Clinical Coding Using ICD-10 for Ambulatory Visits

    PubMed Central

    Horsky, Jan; Drucker, Elizabeth A.; Ramelson, Harley Z.

    2017-01-01

    This study describes a simulation of diagnostic coding using an EHR. Twenty-three ambulatory clinicians were asked to enter appropriate codes for six standardized scenarios with two different EHRs. Their interactions with the query interface were analyzed for patterns and variations in search strategies, and the resulting sets of entered codes were analyzed for accuracy and completeness. Just over half of the entered codes were appropriate for a given scenario and about a quarter were omitted. The Crohn’s disease and diabetes scenarios had the highest rates of inappropriate coding and code variation. The omission rate was higher for secondary than for primary visit diagnoses. Codes for immunization, dialysis dependence, and nicotine dependence were the most often omitted. We also found a high rate of variation in the search terms used to query the EHR for the same diagnoses. Changes to the training of clinicians and improved design of EHR query modules may lower the rate of inappropriate and omitted codes. PMID:29854158

  17. The design improvement of horizontal stripline kicker in TPS storage ring

    NASA Astrophysics Data System (ADS)

    Chou, P. J.; Chan, C. K.; Chang, C. C.; Hsu, K. T.; Hu, K. H.; Kuan, C. K.; Sheng, I. C.

    2017-07-01

    We plan to replace the existing horizontal stripline kicker of the transverse feedback system with an improved design. Large reflected power was observed at the downstream port of the stripline kicker when driven by the feedback amplifier. A rapid surge in vacuum pressure was observed when we tested high-current operation in the TPS storage ring in April 2016, and a burned feedthrough of the horizontal stripline kicker was discovered during a maintenance shutdown. The improved design is targeted at reducing the reflection of driving power from the feedback system and at reducing beam-induced RF heating. This major modification of the design is described, and the results of RF simulations performed with the electromagnetic code GdfidL are reported as well.

  18. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  19. Benchmarking of Heavy Ion Transport Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  20. Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic

    NASA Astrophysics Data System (ADS)

    Li, Yan; Dai, Shifang; Wu, Weiwei

    2016-12-01

    Recently, with the rapid growth of traffic among optical network units (ONUs), network coding (NC) has become an appealing technique for improving the performance of passive optical networks (PONs) with such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established; such passive, open-ended waiting severely limits the effectiveness of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
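
    A hedged, minimal sketch of the coding opportunity the scheduling strategies try to create: once the OLT simultaneously holds two inter-ONU packets travelling in opposite directions, a single broadcast of their XOR replaces two downstream transmissions, and each ONU strips off the copy of the packet it originally sent. Packet contents are toy bytes of equal length.

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      pkt_a_to_b = b"ONU-A payload!!!"
      pkt_b_to_a = b"ONU-B payload###"          # equal length for simplicity

      # OLT: coding condition met (A->B and B->A buffered), broadcast once.
      coded = xor_bytes(pkt_a_to_b, pkt_b_to_a)

      # ONU A kept what it sent, so it can strip it off the coded broadcast.
      assert xor_bytes(coded, pkt_a_to_b) == pkt_b_to_a
      # ONU B likewise recovers the packet addressed to it.
      assert xor_bytes(coded, pkt_b_to_a) == pkt_a_to_b
      print("both ONUs decoded from a single coded broadcast")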

  1. An Innovative Hybrid Loop-Pool SFR Design and Safety Analysis Methods: Today and Tomorrow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hongbin Zhang; Haihua Zhao; Vincent Mousseau

    2008-04-01

    Investment in commercial sodium cooled fast reactor (SFR) power plants will become possible only if SFRs achieve economic competitiveness as compared to light water reactors and other Generation IV reactors. Toward that end, we have launched efforts to improve the economics and safety of SFRs from the thermal design and safety analyses perspectives at Idaho National Laboratory. From the thermal design perspective, an innovative hybrid loop-pool SFR design has been proposed. This design takes advantage of the inherent safety of a pool design and the compactness of a loop design to further improve economics and safety. From the safety analyses perspective, we have initiated an effort to develop a high fidelity reactor system safety code.

  2. Audit of accuracy of clinical coding in oral surgery.

    PubMed

    Naran, S; Hudovsky, A; Antscherl, J; Howells, S; Nouraei, S A R

    2014-10-01

    We aimed to study the accuracy of clinical coding within oral surgery and to identify ways in which it can be improved. We undertook a multidisciplinary audit of a sample of 646 day case patients who had had oral surgery procedures between 2011 and 2012. We compared the codes given with their case notes and amended any discrepancies. The accuracy of coding was assessed for primary and secondary diagnoses and procedures, and for health resource groupings (HRGs). The financial impact of coding Subjectivity, Variability and Error (SVE) was assessed by reference to national tariffs. The audit resulted in 122 (19%) changes to primary diagnoses. The codes for primary procedures changed in 224 (35%) cases; 310 (48%) morbidities and complications had been missed, and 266 (41%) secondary procedures had been missed or were incorrect. This led to at least one change of coding in 496 (77%) patients, and to HRG changes in 348 (54%) patients. The financial impact of this was £114 in lost revenue per patient. There is a high incidence of coding errors in oral surgery because of the large number of day cases, a lack of awareness by clinicians of coding issues, and because clinical coders are not always familiar with the large number of highly specialised abbreviations used. Accuracy of coding can be improved through the use of a well-designed proforma, and standards can be maintained by the use of an ongoing data quality assurance programme. Copyright © 2014. Published by Elsevier Ltd.

  3. A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks.

    PubMed

    Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang

    2017-08-08

    Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly with encoded packets received from multiple potential nodes in the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when inefficient relay nodes are detected. Substantial simulations in an underwater environment with Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs.

  4. Automatic Implementation of TTEthernet-Based Time-Triggered Avionics Applications

    NASA Astrophysics Data System (ADS)

    Gorcitz, Raul Adrian; Carle, Thomas; Lesens, David; Monchaux, David; Potop-Butucaruy, Dumitru; Sorel, Yves

    2015-09-01

    The design of safety-critical embedded systems such as those used in avionics still involves largely manual phases. But in avionics the definition of standard interfaces embodied in standards such as ARINC 653 or TTEthernet should allow the definition of fully automatic code generation flows that reduce the costs while improving the quality of the generated code, much like compilers have done when replacing manual assembly coding. In this paper, we briefly present such a fully automatic implementation tool, called Lopht, for ARINC653-based time-triggered systems, and then explain how it is currently extended to include support for TTEthernet networks.

  5. NASA Lewis Stirling engine computer code evaluation

    NASA Technical Reports Server (NTRS)

    Sullivan, Timothy J.

    1989-01-01

    In support of the U.S. Department of Energy's Stirling Engine Highway Vehicle Systems program, the NASA Lewis Stirling engine performance code was evaluated by comparing code predictions, without engine-specific calibration factors, to GPU-3, P-40, and RE-1000 Stirling engine test data. The error in predicting power output was -11 percent for the P-40 and 12 percent for the RE-1000 at design conditions, and 16 percent for the GPU-3 at near-design conditions (2000 rpm engine speed versus 3000 rpm at design). The efficiency and heat input predictions showed better agreement with engine test data than did the power predictions. Across all data points, the error in predicting the GPU-3 brake power was significantly larger than for the other engines and was mainly a result of inaccuracy in predicting the pressure phase angle. Analysis of this pressure phase angle prediction error suggested that improvements to the cylinder hysteresis loss model could have a significant effect on overall Stirling engine performance predictions.

  6. Extending the Capture Volume of an Iris Recognition System Using Wavefront Coding and Super-Resolution.

    PubMed

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao; Chang, Chin-Chen

    2016-12-01

    Iris recognition has gained increasing popularity over the last few decades; however, the stand-off distance in a conventional iris recognition system is too short, which limits its application. In this paper, we propose a novel hardware-software hybrid method to increase the stand-off distance in an iris recognition system. When designing the system hardware, we use an optimized wavefront coding technique to extend the depth of field. To compensate for the blurring of the image caused by wavefront coding, on the software side, the proposed system uses a local patch-based super-resolution method to restore the blurred image to its clear version. The collaborative effect of the new hardware design and software post-processing showed great potential in our experiment. The experimental results showed that such improvement cannot be achieved by using a hardware- or software-only design. The proposed system can increase the capture volume of a conventional iris recognition system by three times and maintain the system's high recognition rate.

  7. Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components

    NASA Technical Reports Server (NTRS)

    Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.

    1999-01-01

    This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles; to set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. NPARC was selected and evaluated for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed; a demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies, and flow fields for various inlet geometries were analyzed. Integration of the codes is continuing. The codes developed are being applied to a candidate example, the trailblazer engine proposed for space transportation. Successful development of the code will provide a simpler, faster, and more user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.

  8. Aerodynamic performances of three fan stator designs operating with rotor having tip speed of 337 meters per second and pressure ratio of 1.54. Relation of analytical code calculations to experimental performance

    NASA Technical Reports Server (NTRS)

    Gelder, T. F.; Schmidt, J. F.; Esgar, G. M.

    1980-01-01

    A hub-to-shroud and a blade-to-blade internal-flow analysis code, both inviscid and basically subsonic, were used to calculate the flow parameters within four stator-blade rows. The computed ratios of maximum suction-surface velocity to trailing-edge velocity correlated well in the midspan region with the measured total parameters over the minimum-loss to near-stall operating range for all stators and speeds studied. The potential benefits of a blade designed with the aid of these flow analysis codes are illustrated by a proposed redesign of one of the four stators studied. An overall efficiency improvement of 1.6 points above the peak measured for that stator is predicted for the redesign.

  9. Algorithms for GPU-based molecular dynamics simulations of complex fluids: Applications to water, mixtures, and liquid crystals.

    PubMed

    Kazachenko, Sergey; Giovinazzo, Mark; Hall, Kyle Wm; Cann, Natalie M

    2015-09-15

    A custom code for molecular dynamics simulations has been designed to run on CUDA-enabled NVIDIA graphics processing units (GPUs). The double-precision code simulates multicomponent fluids, with intramolecular and intermolecular forces, coarse-grained and atomistic models, holonomic constraints, Nosé-Hoover thermostats, and the generation of distribution functions. Algorithms to compute Lennard-Jones and Gay-Berne interactions, and the electrostatic force using Ewald summations, are discussed. A neighbor list is introduced to improve scaling with respect to system size. Three test systems are examined: SPC/E water; an n-hexane/2-propanol mixture; and a liquid crystal mesogen, 2-(4-butyloxyphenyl)-5-octyloxypyrimidine. Code performance is analyzed for each system. With one GPU, a 33-119 fold increase in performance is achieved compared with the serial code while the use of two GPUs leads to a 69-287 fold improvement and three GPUs yield a 101-377 fold speedup. © 2015 Wiley Periodicals, Inc.
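
    A hedged sketch of the neighbor-list device mentioned above, written in NumPy rather than CUDA: pair forces are evaluated only inside a cutoff-plus-skin shell, so the expensive inner loop no longer scans all O(N^2) pairs every step. The box size, cutoff, skin, and rebuild policy are illustrative stand-ins, not the paper's parameters.

      import numpy as np

      rng = np.random.default_rng(0)
      L, N, rc, skin = 10.0, 200, 2.5, 0.4
      pos = rng.uniform(0.0, L, size=(N, 3))

      def build_neighbor_list(pos):
          """All pairs within rc + skin, using minimum-image periodic boundaries."""
          pairs = []
          for i in range(N - 1):
              d = pos[i + 1:] - pos[i]
              d -= L * np.round(d / L)               # minimum-image convention
              r2 = (d * d).sum(axis=1)
              for j in np.nonzero(r2 < (rc + skin) ** 2)[0]:
                  pairs.append((i, i + 1 + j))
          return pairs

      def lj_forces(pos, pairs):
          """Lennard-Jones forces accumulated over the neighbor list only."""
          f = np.zeros_like(pos)
          for i, j in pairs:
              d = pos[j] - pos[i]
              d -= L * np.round(d / L)
              r2 = (d * d).sum()
              if r2 < rc * rc:
                  inv6 = 1.0 / r2 ** 3
                  fmag = 24.0 * inv6 * (2.0 * inv6 - 1.0) / r2
                  f[i] -= fmag * d                   # Newton's third law pair
                  f[j] += fmag * d
          return f

      pairs = build_neighbor_list(pos)               # rebuilt every few steps
      print(len(pairs), "listed pairs vs", N * (N - 1) // 2, "all pairs")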

  10. User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Earth Sciences Division; Zhang, Keni

    TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on TOUGH2 Version 1.4 with the EOS3, EOS9, and T2R3D modules, software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. This report provides a quick-start guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the (standard) version TOUGH2 code. The report also gives a brief technical description of the code, including a discussion of parallel methodology, code structure, as well as mathematical and numerical methods used. To familiarize users with the parallel code, illustrative sample problems are presented.

  11. Fast QC-LDPC code for free space optical communication

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Zhang, Qi; Udeh, Chinonso Paschal; Wu, Rangzhong

    2017-02-01

    Free Space Optical (FSO) communication systems use the atmosphere as a propagation medium, so atmospheric turbulence effects lead to multiplicative noise related to the signal intensity. In order to suppress the signal fading induced by multiplicative noise, we propose a fast Quasi-Cyclic (QC) Low-Density Parity-Check (LDPC) code for FSO communication systems. As linear block codes based on sparse matrices, QC-LDPC codes perform extremely close to the Shannon limit. Current studies of LDPC codes in FSO communications mainly focus on Gaussian and Rayleigh channels; the LDPC code design in this study targets the atmospheric turbulence channel, which is neither Gaussian nor Rayleigh and is closer to the practical situation. Based on the characteristics of the atmospheric channel, modeled by log-normal and K-distributions, we designed a special QC-LDPC code and deduced the log-likelihood ratio (LLR). An irregular QC-LDPC code for fast coding with variable rates is proposed in this paper. The proposed code achieves the excellent performance of LDPC codes, with high efficiency at low rates, stability at high rates, and a small number of decoding iterations. Belief propagation (BP) decoding results show that the bit error rate (BER) decreases markedly as the Signal-to-Noise Ratio (SNR) increases; the post-decoding BER continues to fall with increasing SNR, without exhibiting an error-floor phenomenon. Therefore, LDPC channel coding can effectively improve the performance of FSO systems.
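
    A hedged sketch of the quasi-cyclic structure that makes fast encoding possible: the parity-check matrix is tiled from circulant permutation blocks, each an identity matrix cyclically shifted by an entry of a small base (exponent) matrix, so encoding reduces to circular shifts and XORs. The 3 x 6 base matrix and lifting size below are toy choices, not the paper's design for the log-normal/K-distributed channel.

      import numpy as np

      Z = 8                                    # circulant (lifting) size
      base = np.array([[0, 1, 3, -1, 2, 0],    # -1 marks an all-zero block
                       [2, -1, 0, 4, 1, 3],
                       [5, 2, 1, 0, -1, 6]])

      def expand(base, Z):
          """Lift the base matrix into a binary parity-check matrix H."""
          I = np.eye(Z, dtype=int)
          rows = []
          for row in base:
              blocks = [np.zeros((Z, Z), int) if s < 0
                        else np.roll(I, s, axis=1)   # circulant permutation
                        for s in row]
              rows.append(np.hstack(blocks))
          return np.vstack(rows)

      H = expand(base, Z)
      print(H.shape)                           # (24, 48): nominal rate 1/2
      print(np.count_nonzero(H), "ones in H")  # sparse, as LDPC requires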

  12. Improved Design of Beam Tunnel for 42 GHz Gyrotron

    NASA Astrophysics Data System (ADS)

    Singh, Udaybir; Kumar, Nitin; Purohit, L. P.; Sinha, A. K.

    2011-04-01

    In a gyrotron, there is a chance of generation and excitation of unwanted RF modes (parasitic oscillations). These modes may interact with the electron beam and consequently degrade the beam quality. This paper presents an improved design of the beam tunnel to reduce parasitic oscillations and examines the effect of beam tunnel geometry on the electron beam parameters. The design optimization of the beam tunnel has been carried out with the 3-D simulation software CST Microwave Studio, and the effect of beam tunnel geometry on the electron beam parameters has been analyzed with the EGUN code.

  13. Engineering Translation in Mammalian Cell Factories to Increase Protein Yield: The Unexpected Use of Long Non-Coding SINEUP RNAs.

    PubMed

    Zucchelli, Silvia; Patrucco, Laura; Persichetti, Francesca; Gustincich, Stefano; Cotella, Diego

    2016-01-01

    Mammalian cells are an indispensable tool for the production of recombinant proteins in contexts where function depends on post-translational modifications. Among them, Chinese Hamster Ovary (CHO) cells are the primary factories for the production of therapeutic proteins, including monoclonal antibodies (MAbs). To improve expression and stability, several methodologies have been adopted, including methods based on media formulation, selective pressure, and cell or vector engineering. This review presents current approaches to improving mammalian cell factories that are based on enhancing translation. Alongside well-established techniques (codon optimization and improvement of mRNA secondary structure), we describe SINEUPs, a family of antisense long non-coding RNAs that are able to increase translation of partially overlapping protein-coding mRNAs. By exploiting their modular structure, SINEUP molecules can be designed to target virtually any mRNA of interest, and thus to increase the production of secreted proteins. Synthetic SINEUPs therefore represent a new and versatile tool for improving the production of secreted proteins in biomanufacturing processes.

  14. A New Design Method of Automotive Electronic Real-time Control System

    NASA Astrophysics Data System (ADS)

    Zuo, Wenying; Li, Yinguo; Wang, Fengjuan; Hou, Xiaobo

    The structure and functionality of automotive electronic control systems are becoming more and more complex, and the traditional manual-programming development mode can no longer satisfy development needs. To meet the demands for diversity and speed in real-time control system development, this paper proposes a new design method for automotive electronic control systems based on Simulink/RTW, combining model-based design with automatic code generation. First, the control algorithms are designed and a control system model is built in Matlab/Simulink. Embedded code is then generated automatically by RTW, and the automotive real-time control system is implemented in an OSEK/VDX operating system environment. The new development mode can significantly shorten the development cycle of automotive electronic control systems, improve the portability, reusability, and scalability of the resulting programs, and has practical value for the development of real-time control systems.

  15. National Combustion Code: A Multidisciplinary Combustor Design System

    NASA Technical Reports Server (NTRS)

    Stubbs, Robert M.; Liu, Nan-Suey

    1997-01-01

    The Internal Fluid Mechanics Division conducts both basic research and technology, and system technology research, for aerospace propulsion system components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.

  16. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  17. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    NASA Astrophysics Data System (ADS)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, within a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization, and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter, and combined inter-intra prediction. Results are presented on standard test sets.

  18. Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III

    1996-01-01

    Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis and SA codes) and for simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The computational resources required for aerodynamic design using CFD via analysis and SA codes are greater than those required for design codes; thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines involved have different design points in the flight envelope; therefore, CFD analysis and SA codes are required at the aerodynamic 'off-design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis and SA codes and multipoint CFD design codes that perform suboptimizations.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynick, H. E.; Pomphrey, N.; Xanthopoulos, P.

    Recent progress in reducing turbulent transport in stellarators and tokamaks by 3D shaping, using a stellarator optimization code in conjunction with a gyrokinetic code, is presented. The original applications of the method focused on ion temperature gradient transport in a quasi-axisymmetric stellarator design. Here, an examination of both other turbulence channels and other starting configurations is initiated. It is found that the designs evolved for transport from ion temperature gradient turbulence also display reduced transport from other transport channels whose modes are also stabilized by improved curvature, such as electron temperature gradient and ballooning modes. The optimizer is also applied to evolving from a tokamak, finding appreciable turbulence reduction for these devices as well. From these studies, improved understanding is obtained of why the deformations found by the optimizer are beneficial, and these deformations are related to earlier theoretical work in both stellarators and tokamaks.

  20. Cold flow testing of the Space Shuttle Main Engine high pressure fuel turbine model

    NASA Technical Reports Server (NTRS)

    Hudson, Susan T.; Gaddis, Stephen W.; Johnson, P. D.; Boynton, James L.

    1991-01-01

    In order to experimentally determine the performance of the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP) turbine, a 'cold' air flow turbine test program was established at NASA's Marshall Space Flight Center. As part of this test program, a baseline test of Rocketdyne's HPFTP turbine has been completed. The turbine performance and turbine diagnostics, such as airfoil surface static pressure distributions, static pressure drops through the turbine, and exit swirl angles, were investigated at the turbine design point, over its operating range, and at extreme off-design points. The data were compared to pretest predictions with good results. The test data have been used to improve meanline prediction codes and are now being used to validate various three-dimensional codes. The data will also be scaled to engine conditions and used to improve the SSME steady-state performance model.

  1. National Cost-effectiveness of ASHRAE Standard 90.1-2010 Compared to ASHRAE Standard 90.1-2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thornton, Brian; Halverson, Mark A.; Myer, Michael

    Pacific Northwest National Laboratory (PNNL) completed this project for the U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP). DOE’s BECP supports upgrading building energy codes and standards, and the states’ adoption, implementation, and enforcement of upgraded codes and standards. Building energy codes and standards set minimum requirements for energy-efficient design and construction for new and renovated buildings, and impact energy use and greenhouse gas emissions for the life of buildings. Continuous improvement of building energy efficiency is achieved by periodically upgrading energy codes and standards. Ensuring that changes in the code that may alter costs (for building components: initial purchase and installation, replacement, maintenance, and energy) are cost-effective encourages their acceptance and implementation. ANSI/ASHRAE/IESNA Standard 90.1 is the energy standard for commercial and multi-family residential buildings over three floors.

  2. Cost-effectiveness of ASHRAE Standard 90.1-2010 Compared to ASHRAE Standard 90.1-2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thornton, Brian A.; Halverson, Mark A.; Myer, Michael

    Pacific Northwest National Laboratory (PNNL) completed this project for the U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP). DOE’s BECP supports upgrading building energy codes and standards, and the states’ adoption, implementation, and enforcement of upgraded codes and standards. Building energy codes and standards set minimum requirements for energy-efficient design and construction for new and renovated buildings, and impact energy use and greenhouse gas emissions for the life of buildings. Continuous improvement of building energy efficiency is achieved by periodically upgrading energy codes and standards. Ensuring that changes in the code that may alter costs (for building components: initial purchase and installation, replacement, maintenance, and energy) are cost-effective encourages their acceptance and implementation. ANSI/ASHRAE/IESNA Standard 90.1 is the energy standard for commercial and multi-family residential buildings over three floors.

  3. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate, and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include applying the Glenn-HT code to configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code, which will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  4. High-fidelity plasma codes for burn physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooley, James; Graziani, Frank; Marinak, Marty

    Accurate predictions of equation of state (EOS) and of ionic and electronic transport properties are of critical importance for high-energy-density (HED) plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged-particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both experimental data and the theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they can have in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  5. On the validation of a code and a turbulence model appropriate to circulation control airfoils

    NASA Technical Reports Server (NTRS)

    Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.

    1988-01-01

    A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.

  6. Advanced helium purge seals for Liquid Oxygen (LOX) turbopumps

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur; Lee, Chester C.

    1989-01-01

    Program objectives were to determine three advanced configurations of helium buffer seals capable of providing improved performance in a space shuttle main engine (SSME), high-pressure liquid oxygen (LOX) turbopump environment, and to provide NASA with the analytical tools to determine performance of a variety of seal configurations. The three seal designs included solid-ring fluid-film seals often referred to as floating ring seals, back-to-back fluid-film face seals, and a circumferential sectored seal that incorporated inherent clearance adjustment capabilities. Of the three seals designed, the sectored seal is favored because the self-adjusting clearance features accommodate the variations in clearance that will occur because of thermal and centrifugal distortions without compromising performance. Moreover, leakage can be contained well below the maximum target values; minimizing leakage is important on the SSME since helium is provided by an external tank. A reduction in tank size translates to an increase in payload that can be carried on board the shuttle. The computer codes supplied under this program included a code for analyzing a variety of gas-lubricated, floating ring, and sector seals; a code for analyzing gas-lubricated face seals; a code for optimizing and analyzing gas-lubricated spiral-groove face seals; and a code for determining fluid-film face seal response to runner excitations in as many as five degrees of freedom. These codes proved invaluable for optimizing designs and estimating final performance of the seals described.

  7. Connecting American Manufacturers (CAM) Virtual Manufacturing Marketplace (VMM)

    DTIC Science & Technology

    2013-11-01

    88ABW-2013-5037 1.0 SUMMARY The Connecting American Manufacturing (CAM) initiative was designed to improve the sourcing capability for the Department of Defense... In addition, CAM was designed to increase the number of US companies bidding on DoD business. The team’s overall approach was based on three major... is to designate a North American Industry Classification System (NAICS) code for each opportunity. Every manufacturing opportunity is posted to...

  8. An application of interactive computer graphics technology to the design of dispersal mechanisms

    NASA Technical Reports Server (NTRS)

    Richter, B. J.; Welch, B. H.

    1977-01-01

    Interactive computer graphics technology is combined with a general purpose mechanisms computer code to study the operational behavior of three guided bomb dispersal mechanism designs. These studies illustrate the use of computer graphics techniques to discover operational anomalies, to assess the effectiveness of design improvements, to reduce the time and cost of the modeling effort, and to provide the mechanism designer with a visual understanding of the physical operation of such systems.

  9. Boundary-layer stability and airfoil design

    NASA Technical Reports Server (NTRS)

    Viken, Jeffrey K.

    1986-01-01

    Several different natural laminar flow (NLF) airfoils have been analyzed for stability of the laminar boundary layer using linear stability codes. The NLF airfoils analyzed come from three different design conditions: incompressible; compressible with no sweep; and compressible with sweep. Some of the design problems are discussed, concentrating on those problems associated with keeping the boundary layer laminar. Also, there is a discussion on how a linear stability analysis was effectively used to improve the design for some of the airfoils.

  10. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay are described. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, of both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of coded cooperation employing the jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
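
    For QC-LDPC codes built from P x P circulant permutation blocks, a girth-4 cycle exists exactly when two rows and two columns of the exponent matrix satisfy a simple modular condition (Fossorier's condition). The Python sketch below illustrates that screening step under this assumption; it is a generic illustration, not the authors' joint source-relay design procedure, and the example matrix is hypothetical.

        import itertools

        def has_girth4_cycle(E, P):
            """Check a QC-LDPC exponent matrix E for length-4 cycles,
            assuming P x P circulant permutation blocks."""
            rows, cols = len(E), len(E[0])
            for i1, i2 in itertools.combinations(range(rows), 2):
                for j1, j2 in itertools.combinations(range(cols), 2):
                    block = (E[i1][j1], E[i1][j2], E[i2][j2], E[i2][j1])
                    if -1 in block:    # -1 marks an all-zero block: no cycle here
                        continue
                    if (block[0] - block[1] + block[2] - block[3]) % P == 0:
                        return True    # a length-4 cycle survives in H
            return False

        E = [[0, 1, 3],
             [0, 2, 6]]                # exponent matrix, circulant size P = 7
        print(has_girth4_cycle(E, 7))  # False: this small example is girth-4 free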

  11. Design improvement of a pump wear ring labyrinth seal

    NASA Technical Reports Server (NTRS)

    Rhode, David L.; Morrison, G. L.; Ko, S. H.; Waughtal, S. P.

    1987-01-01

    The investigation was successful in obtaining two improved designs for the impeller wear ring seal of the liquid hydrogen turbopump of interest. A finite difference computer code was used extensively in a parametric computational study to determine a cavity configuration with high flow resistance due to turbulence dissipation. These two designs, along with the current one, were fabricated and tested. The improved designs were denoted Type O and Type S. The measurements showed that Type O and Type S gave 67 and 30 percent reductions in leakage over the current design, respectively. It was found that the number of cavities, the step height, and the presence of a small stator groove are quite important design features. The tooth thickness is also of some significance. Finally, the tooth height and an additional large cavity cut out of the stator (upstream of the step) are of negligible importance.

  12. Highly efficient Cas9-mediated transcriptional programming

    DOE PAGES

    Chavez, Alejandro; Scheiman, Jonathan; Vora, Suhani; ...

    2015-03-02

    The RNA-guided nuclease Cas9 can be re-engineered as a programmable transcription factor. However, modest levels of gene activation have limited its potential applications. Here we describe an improved transcriptional regulator obtained through the rational design of a tripartite activator, VP64-p65-Rta (VPR), fused to nuclease-null Cas9. We demonstrate its utility in activating endogenous coding and non-coding genes, targeting several genes simultaneously, and stimulating neuronal differentiation of human induced pluripotent stem cells (iPSCs).

  13. System analysis with improved thermo-mechanical fuel rod models for modeling current and advanced LWR materials in accident scenarios

    NASA Astrophysics Data System (ADS)

    Porter, Ian Edward

    A nuclear reactor systems code has the ability to model the system response in an accident scenario based on known initial conditions at the onset of the transient. However, these codes have tended to lack the detailed thermo-mechanical fuel rod response models needed for accurate prediction of fuel rod failure. This proposed work will couple today's most widely used steady-state (FRAPCON) and transient (FRAPTRAN) fuel rod models with the systems code TRACE for best-estimate modeling of system response in accident scenarios such as a loss of coolant accident (LOCA). In doing so, code modifications will be made to model gamma heating in LWRs during steady-state and accident conditions and to improve fuel rod thermal/mechanical analysis by allowing axial nodalization of burnup-dependent phenomena such as swelling, cladding creep, and oxidation. With the ability to model both burnup-dependent parameters and transient fuel rod response, a fuel dispersal study will be conducted using a hypothetical accident scenario under both PWR and BWR conditions to determine the amount of fuel dispersed under varying conditions. Because fuel fragment size and internal rod pressure are both dependent on burnup, this analysis will be conducted at the beginning, middle, and end of cycle to examine the effect that cycle time can have on fuel rod failure and dispersal. Current fuel rod and system codes used by the Nuclear Regulatory Commission (NRC) are compilations of legacy codes covering only commonly used light water reactor materials: uranium dioxide (UO2), mixed oxide (U/PuO2), and zirconium alloys. However, the events at Fukushima Daiichi and the Three Mile Island accident have shown the need for exploration of advanced materials possessing improved accident tolerance. This work further modifies the NRC codes to include silicon carbide (SiC), an advanced cladding material proposed by current DOE-funded research on accident tolerant fuels (ATF). Several additional fuels will also be analyzed, including uranium nitride (UN), uranium carbide (UC), and uranium silicide (U3Si2). Focusing on the system response in an accident scenario, an emphasis is placed on the fracture mechanics of the ceramic cladding, with the fuel rods designed to eliminate pellet-cladding mechanical interaction (PCMI). The time to failure, and how much of the fuel in the reactor fails with an advanced fuel design, will be analyzed and compared to the current UO2/Zircaloy design using a full-scale reactor model.

  14. Laser pulse coded signal frequency measuring device based on DSP and CPLD

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-bo; Cao, Li-hua; Geng, Ai-hui; Li, Yan; Guo, Ru-hai; Wang, Ting-feng

    2011-06-01

    Laser pulse coding is an anti-jamming measure used in semi-active laser-guided weapons. Because the guidance signals adopt a pulse-coding mode and the received signals are weak, frequency measurement for optoelectronic countermeasures against semi-active laser-guided weapons requires complex calculations that exploit the time correlation of the coded pulse train. To complete the frequency measurement accurately in a short time, the device autocorrelates the series of pulse arrival times, calculates the signal repetition period, and then identifies the code type, decoding the signal by determining the time values, number, and rank of the pulses within a signal cycle. Using a CPLD and a DSP as the signal-processing chips, a frequency measurement device for laser-guided signals was designed, and its signal-processing capability was improved through appropriate software algorithms. This article introduces the frequency-measurement principle of the device; describes its hardware components, system operation, and software; and analyzes the impact of several system factors on measurement accuracy. The experimental results indicate that the system improves measurement accuracy while meeting the requirements on volume, real-time operation, anti-interference, and low power for a laser pulse frequency measuring device. The practicality and reliability of the design were demonstrated experimentally.
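
    A minimal Python sketch of the period-measurement step is given below, under stated assumptions: the pulse arrival times are binned into a time series whose autocorrelation peaks at the repetition period. The bin width and search range are illustrative parameters, not the device's values.

        import numpy as np

        def estimate_period(arrival_times, bin_s=1e-4, max_lag_s=0.02):
            """Estimate the repetition period of a pulse train from arrival
            times by autocorrelating a binned 0/1 time series."""
            t = np.asarray(arrival_times)
            bins = np.zeros(int((t.max() - t.min()) / bin_s) + 1)
            bins[((t - t.min()) / bin_s).astype(int)] = 1.0
            lags = range(1, int(max_lag_s / bin_s))
            # The first strong autocorrelation peak is the repetition period.
            ac = np.array([np.dot(bins[:-k], bins[k:]) for k in lags])
            return (int(np.argmax(ac)) + 1) * bin_s

        # Example: a pulse train with a 5 ms repetition period and slight jitter
        times = np.arange(0, 0.5, 0.005) + np.random.normal(0, 1e-6, 100)
        print(estimate_period(times))   # ~0.005 s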

  15. Benchmarking of neutron production of heavy-ion transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remec, I.; Ronningen, R. M.; Heilbronn, L.

    Document available in abstract form only, full text of document follows: Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in the design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  16. Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.

    PubMed

    Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S

    2015-05-01

    Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter, and facilitate the creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen, and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to the creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM.
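
    The shortlist-then-refine paradigm can be illustrated with a short Python sketch. The map entries, refinement text, and function names below are hypothetical stand-ins, not the PI-10 Services API.

        # Hypothetical SNOMED CT -> ICD-10-CM candidate map (illustration only).
        SNOMED_TO_ICD10CM = {
            "44054006": ["E11.9", "E11.65", "E11.8"],  # diabetes mellitus type 2
            "38341003": ["I10"],                       # hypertensive disorder
        }
        REFINEMENT_TEXT = {
            "E11.9": "without complications",
            "E11.65": "with hyperglycemia",
            "E11.8": "with unspecified complications",
        }

        def shortlist(snomed_code):
            """Return the initial shortlist of candidate ICD-10-CM codes."""
            return SNOMED_TO_ICD10CM.get(snomed_code, [])

        def refine(candidates, keyword):
            """Narrow the shortlist with clinician-supplied refinement text."""
            return [c for c in candidates if keyword in REFINEMENT_TEXT.get(c, "")]

        print(shortlist("44054006"))                           # three candidates
        print(refine(shortlist("44054006"), "hyperglycemia"))  # ['E11.65']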

  17. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  18. Evolvix BEST Names for semantic reproducibility across code2brain interfaces

    PubMed Central

    Scheuer, Katherine S.; Keel, Seth A.; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C.; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G.; Moog, Cecilia L.; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist‐Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda‐Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L.; Freiberg, Erika; Waters, Noah P.; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M.; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2016-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general‐purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long‐term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder‐brains to reader‐brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. PMID:27918836

  19. Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits.

    PubMed

    Ginde, Adit A; Blanc, Phillip G; Lieberman, Rebecca M; Camargo, Carlos A

    2008-04-01

    Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3), and confirmed hypoglycemia cases identified by the candidate ICD-9-CM codes during the study period by chart review. The case definition for hypoglycemia was a documented blood glucose of 3.9 mmol/l or lower, or an emergency physician-charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had an 89% positive predictive value (95% confidence interval, 86-92) for detecting hypoglycemia visits. The proposed algorithm improves on prior strategies to identify hypoglycemia visits in administrative data sets and will enhance the ability to study the epidemiology of, and design interventions for, this important complication of diabetes care.
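
    The screening logic lends itself to a compact illustration. The Python sketch below assumes a placeholder list of excluding co-diagnoses; the study's predetermined co-diagnosis codes are not reproduced here.

        # Candidate codes from the abstract; the exclusion set is a placeholder.
        CANDIDATE_CODES = {"250.3", "250.8", "251.0", "251.1", "251.2",
                           "270.3", "775.0", "775.6", "962.3"}
        CO_DX_EXCLUDING_2508 = {"250.1", "250.2"}   # assumed, for illustration

        def is_hypoglycemia_visit(visit_codes):
            """Screen all listed codes for a visit, not just the first-listed one."""
            codes = set(visit_codes)
            hits = codes & CANDIDATE_CODES
            # Count 250.8 only in the absence of the predetermined co-diagnoses.
            if hits == {"250.8"} and codes & CO_DX_EXCLUDING_2508:
                return False
            return bool(hits)

        print(is_hypoglycemia_visit(["250.8", "401.9"]))  # True
        print(is_hypoglycemia_visit(["250.8", "250.1"]))  # False under the placeholder rule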

  1. Numerical, analytical, experimental study of fluid dynamic forces in seals

    NASA Technical Reports Server (NTRS)

    Shapiro, William; Artiles, Antonio; Aggarwal, Bharat; Walowit, Jed; Athavale, Mahesh M.; Preskwas, Andrzej J.

    1992-01-01

    NASA/Lewis Research Center is sponsoring a program to provide computer codes for analyzing and designing turbomachinery seals for future aerospace and engine systems. The program is made up of three principal components: (1) development of advanced three-dimensional (3-D) computational fluid dynamics codes, (2) production of simpler two-dimensional (2-D) industrial codes, and (3) development of a knowledge-based system (KBS) that contains an expert system to assist in seal selection and design. The first task has concentrated on cylindrical geometries with straight, tapered, and stepped bores. Improvements have been made by adopting a colocated grid formulation and by incorporating higher-order, time-accurate schemes for transient analysis and high-order discretization schemes for spatial derivatives. This report describes the mathematical formulations and presents a variety of 2-D results, including labyrinth and brush seal flows. Extensions to 3-D are presently in progress.

  2. Statistical Analysis of CFD Solutions from the Third AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop, held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third Drag Prediction Workshop focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This work evaluated the effect of grid refinement on the code-to-code scatter for the clean attached flow test cases and the separated flow test cases.

  3. Radiation from advanced solid rocket motor plumes

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-01-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state of the art of SRM plume flowfield and radiation prediction methodology and the pertinent database, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry-standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were also developed.

  4. Effect of two doses of ginkgo biloba extract (EGb 761) on the dual-coding test in elderly subjects.

    PubMed

    Allain, H; Raoul, P; Lieury, A; LeCoz, F; Gandon, J M; d'Arbigny, P

    1993-01-01

    The subjects of this double-blind study were 18 elderly men and women (mean age, 69.3 years) with slight age-related memory impairment. In a crossover-study design, each subject received placebo or an extract of Ginkgo biloba (EGb 761) (320 mg or 600 mg) 1 hour before performing a dual-coding test that measures the speed of information processing; the test consists of several coding series of drawings and words presented at decreasing times of 1920, 960, 480, 240, and 120 ms. The dual-coding phenomenon (a break point between coding verbal material and images) was demonstrated in all the tests. After placebo, the break point was observed at 960 ms and dual coding beginning at 1920 ms. After each dose of the ginkgo extract, the break point (at 480 ms) and dual coding (at 960 ms) were significantly shifted toward a shorter presentation time, indicating an improvement in the speed of information processing.

  5. Improvement of absolute positioning of precision stage based on cooperation the zero position pulse signal and incremental displacement signal

    NASA Astrophysics Data System (ADS)

    Wang, H. H.; Shi, Y. P.; Li, X. H.; Ni, K.; Zhou, Q.; Wang, X. H.

    2018-03-01

    In this paper, a scheme to measure the position of precision stages with high precision is presented. The encoder is composed of a scale grating and a compact two-probe reading head, which reads the zero position pulse signal and the continuous incremental displacement signal. The scale grating contains different codes: multiple reference codes with different spacings superimposed onto incremental grooves with an equally spaced structure. The code of the reference mask in the reading head is the same as the reference codes on the scale grating, and it generates a pulse signal that coarsely locates the reference position as the reading head moves along the scale grating. After the reference position is located within a section by means of the pulse signal, it can be located precisely using the amplitude of the incremental displacement signal. Reference codes and a scale grating were designed accordingly, and experimental results show that the coarse positioning precision achieved is 1 μm. The period of the incremental signal is 1 μm, and a precision of 1000/N nm can be achieved by subdividing the incremental signal N times.
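
    The two-signal readout can be summarized in a short Python sketch, assuming quadrature (sin/cos) incremental outputs with the 1 μm period from the abstract; the signal names and interpolation factor are illustrative.

        import numpy as np

        PERIOD_UM = 1.0   # incremental signal period from the abstract

        def absolute_position(ref_um, whole_periods, sin_q, cos_q, N=1000):
            """Reference position + counted whole periods + N-fold phase
            subdivision; each step is 1000/N nm for the 1 um period."""
            phase = (np.arctan2(sin_q, cos_q) % (2 * np.pi)) / (2 * np.pi)
            fraction = round(phase * N) / N
            return ref_um + (whole_periods + fraction) * PERIOD_UM

        # One and a quarter periods past the reference mark:
        print(absolute_position(0.0, 1, np.sin(np.pi / 2), np.cos(np.pi / 2)))  # 1.25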

  6. Prediction of effects of wing contour modifications on low-speed maximum lift and transonic performance for the EA-6B aircraft

    NASA Technical Reports Server (NTRS)

    Allison, Dennis O.; Waggoner, E. G.

    1990-01-01

    Computational predictions of the effects of wing contour modifications on maximum lift and transonic performance were made and verified against low-speed and transonic wind tunnel data. This effort was part of a program to improve the maneuvering capability of the EA-6B electronic countermeasures aircraft, which evolved from the A-6 attack aircraft. The predictions were based on results from three computer codes, all of which include viscous effects: MCARF, a 2-D subsonic panel code; TAWFIVE, a transonic full potential code; and WBPPW, a transonic small disturbance potential flow code. The modifications had previously been designed with the aid of these and other codes. The wing modifications consist of contour changes to the leading edge slats and trailing edge flaps and were designed for increased maximum lift with minimum effect on transonic performance. Predictions of the effects of the modifications are presented, with emphasis on verification through comparisons with wind tunnel data from the National Transonic Facility. Attention is focused on increments in low-speed maximum lift and increments in transonic lift, pitching moment, and drag resulting from the contour modifications.

  7. Preliminary LOCA analysis of the westinghouse small modular reactor using the WCOBRA/TRAC-TF2 thermal-hydraulics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, J.; Kucukboyaci, V. N.; Nguyen, L.

    2012-07-01

    The Westinghouse Small Modular Reactor (SMR) is an 800 MWt (> 225 MWe) integral pressurized water reactor (iPWR) with all primary components, including the steam generator and the pressurizer, located inside the reactor vessel. The reactor core is based on a partial-height 17x17 fuel assembly design used in the AP1000(R) reactor core. The Westinghouse SMR utilizes passive safety systems and proven components from the AP1000 plant design with a compact containment that houses the integral reactor vessel and the passive safety systems. A preliminary loss of coolant accident (LOCA) analysis of the Westinghouse SMR has been performed using the WCOBRA/TRAC-TF2 code, simulating a transient caused by a double ended guillotine (DEG) break in the direct vessel injection (DVI) line. WCOBRA/TRAC-TF2 is a new generation Westinghouse LOCA thermal-hydraulics code evolving from the US NRC licensed WCOBRA/TRAC code. It is designed to simulate PWR LOCA events from the smallest break size to the largest break size (DEG cold leg). A significant number of fluid dynamics models and heat transfer models were developed or improved in WCOBRA/TRAC-TF2. A large number of separate effects and integral effects tests were performed for a rigorous code assessment and validation. WCOBRA/TRAC-TF2 was introduced into the Westinghouse SMR design phase to assist a quick and robust passive cooling system design and to identify thermal-hydraulic phenomena for the development of the SMR Phenomena Identification Ranking Table (PIRT). The LOCA analysis of the Westinghouse SMR demonstrates that the DEG DVI break LOCA is mitigated by the injection and venting from the Westinghouse SMR passive safety systems without core heat-up, achieving long-term core cooling. (authors)

  8. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1, and as a result the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve the coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.

  9. NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding

    PubMed Central

    Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo

    2016-01-01

    Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
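
    The network-coding idea that NetCoDer builds on can be illustrated in a few lines of Python (an assumed illustration, not the NetCoDer protocol itself): a relay XORs two overheard packets so that a single retransmission repairs different losses at different receivers.

        def xor_packets(a: bytes, b: bytes) -> bytes:
            """XOR two equal-length packets byte by byte."""
            return bytes(x ^ y for x, y in zip(a, b))

        p1, p2 = b"sensor-A", b"sensor-B"
        coded = xor_packets(p1, p2)             # relay broadcasts p1 XOR p2
        print(xor_packets(coded, p2) == p1)     # node holding p2 recovers p1
        print(xor_packets(coded, p1) == p2)     # node holding p1 recovers p2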

  10. Cross-Layer Design for Video Transmission over Wireless Rician Slow-Fading Channels Using an Adaptive Multiresolution Modulation and Coding Scheme

    NASA Astrophysics Data System (ADS)

    Pei, Yong; Modestino, James W.

    2007-12-01

    We describe a multilayered video transport scheme for wireless channels capable of adapting to channel conditions in order to maximize end-to-end quality of service (QoS). This scheme combines a scalable H.263+ video source coder with unequal error protection (UEP) across layers. The UEP is achieved by employing different channel codes together with a multiresolution modulation approach to transport the different priority layers. Adaptivity to channel conditions is provided through a joint source-channel coding (JSCC) approach which attempts to jointly optimize the source and channel coding rates, together with the modulation parameters, to obtain the maximum achievable end-to-end QoS for the prevailing channel conditions. In this work, we model the wireless links as slow-fading Rician channels whose conditions can be described in terms of the channel signal-to-noise ratio (SNR) and the ratio of specular-to-diffuse energy. The multiresolution modulation/coding scheme consists of binary rate-compatible punctured convolutional (RCPC) codes used together with nonuniform phase-shift keyed (PSK) signaling constellations. Results indicate that this adaptive JSCC scheme, employing scalable video encoding together with a multiresolution modulation/coding approach, leads to significant improvements in delivered video quality for specified channel conditions. In particular, the approach results in considerably improved graceful degradation properties for decreasing channel SNR.
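
    The flavor of the JSCC optimization can be conveyed with a toy Python sketch. The per-layer quality gains, the RCPC rate-to-loss table, and the bandwidth budget below are assumed values for illustration, not the paper's models.

        import itertools

        LAYER_GAIN = [30.0, 6.0, 3.0]               # quality gain if layer is usable
        LOSS_AT_SNR = {1/2: 1e-4, 2/3: 1e-2, 4/5: 1e-1}  # assumed RCPC behavior
        BANDWIDTH = 4.8                             # channel-use budget (assumed)

        best = None
        for rates in itertools.product(LOSS_AT_SNR, repeat=len(LAYER_GAIN)):
            if sum(1.0 / r for r in rates) > BANDWIDTH:   # channel uses consumed
                continue
            quality, p_ok = 0.0, 1.0
            for gain, r in zip(LAYER_GAIN, rates):
                p_ok *= 1.0 - LOSS_AT_SNR[r]   # a layer helps only if all below arrive
                quality += gain * p_ok
            if best is None or quality > best[0]:
                best = (quality, rates)
        print(best)   # the strongest protection lands on the base layer (UEP)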

  11. CTF Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avramova, Maria N.; Salko, Robert K.

    Coolant-Boiling in Rod Arrays-Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of nine conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident-scenario models contained in the code, as well as the numerical solution process utilized. Coding of the models is also discussed, with consideration for updates that have been made in transitioning from COBRA-TF to CTF. Further documentation, which focuses on code input deck generation and source code global variable and module listings, is also available from RDFMG.

  12. Aerodynamic tailoring of the Learjet Model 60 wing

    NASA Technical Reports Server (NTRS)

    Chandrasekharan, Reuben M.; Hawke, Veronica M.; Hinson, Michael L.; Kennelly, Robert A., Jr.; Madson, Michael D.

    1993-01-01

    The wing of the Learjet Model 60 was tailored for improved aerodynamic characteristics using the TRANAIR transonic full-potential computational fluid dynamics (CFD) code. A root leading edge glove and wing tip fairing were shaped to reduce shock strength, improve cruise drag and extend the buffet limit. The aerodynamic design was validated by wind tunnel test and flight test data.

  13. Improved Design/Reduction of Manufacturing Costs of Space-Traveling Wave Tube Amplifiers Final Report CRADA No. TC-0461-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C. C.; Drasco, M.

    The purpose of the CRADA was to develop new microwave codes for analyzing both slow-wave structures and beam-wave interactions of traveling wave tube amplifiers (TWTAs), the microwave power source for satellite and radar communication systems. The scope of work also included testing and improving power modules through measurements and simulation.

  14. Improvement of signal to noise ratio of time domain mutliplexing fiber Bragg grating sensor network with Golay complementary codes

    NASA Astrophysics Data System (ADS)

    Elgaud, M. M.; Zan, M. S. D.; Abushagur, A. G.; Bakar, A. Ashrif A.

    2017-07-01

    This paper reports the use of the autocorrelation properties of Golay complementary codes (GCC) to enhance the performance of a time domain multiplexing fiber Bragg grating (TDM-FBG) sensing network. By encoding the laser light with a stream of non-return-to-zero (NRZ) GCC and launching it into the sensing area containing the FBG sensors, we found that the FBG signals can be decoded correctly through autocorrelation calculations, confirming the successful demonstration of the coded TDM-FBG sensor network. The OptiGrating and OptiSystem simulators were used to design customized FBG sensors and to perform the coded TDM-FBG sensor simulations, respectively. The results substantiate the theoretical dependence of the SNR enhancement on the GCC code length: a maximum SNR improvement of about 9 dB is achievable with 256-bit GCC compared to the 4-bit case. Furthermore, the GCC also extended the strain exposure up to 30% beyond the maximum of the conventional single-pulse case. Overall, employing GCC in the TDM-FBG sensor system provides a performance enhancement over the conventional single-pulse case under the same conditions.
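
    The GCC property that the scheme exploits is easy to verify numerically: the autocorrelations of a complementary pair sum to a delta function, so correlation decoding cancels sidelobes. The Python sketch below illustrates this property; it is not the paper's simulation setup.

        import numpy as np

        def golay_pair(m):
            """Build a Golay complementary pair of length 2**m recursively."""
            a, b = np.array([1]), np.array([1])
            for _ in range(m):
                a, b = np.concatenate([a, b]), np.concatenate([a, -b])
            return a, b

        def acorr(x):
            return np.correlate(x, x, mode="full")

        a, b = golay_pair(8)                       # 256-bit codes, as in the abstract
        s = acorr(a) + acorr(b)
        print(int(s[len(a) - 1]))                  # 2N = 512 at zero lag
        print(int(np.abs(np.delete(s, len(a) - 1)).max()))  # 0 at every other lag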

  15. Surviving "Payment by Results": a simple method of improving clinical coding in burn specialised services in the United Kingdom.

    PubMed

    Wallis, Katy L; Malic, Claudia C; Littlewood, Sonia L; Judkins, Keith; Phipps, Alan R

    2009-03-01

    Coding inpatient episodes plays an important role in determining the financial remuneration of a clinical service, and insufficient or incomplete data may have very significant consequences for its viability. We created a document that improves the coding process in our Burns Centre. At the Yorkshire Regional Burns Centre, an inpatient summary sheet was designed to prospectively record and present essential information on a daily basis for use in the coding process. The level of care was also recorded. A 3-month audit was conducted to assess the efficacy of the new forms. Forty-nine patients were admitted to the Burns Centre, with a mean age of 27.6 years and TBSA ranging from 0.5% to 65%. The total stay in the Burns Centre was 758 days, of which 22% were at level B3-B5 and 39% at level B2. The use of the new discharge document identified potential income of about £500,000 at our local daily tariffs for high dependency and intensive care. The new form is able to ensure a high quality of coding, with a possible direct impact on the financial resources accrued for burn care.

  16. Building Energy Efficiency in Rural China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Yu, Sha; Song, Bo

    2014-04-01

    Rural buildings in China now account for more than half of China’s total building energy use. Forty percent of the floorspace in China is in rural villages and towns. Most of these buildings are very energy inefficient, and may struggle to meet basic needs. They are cold in the winter, and often experience indoor air pollution from fuel use. The Chinese government plans to adopt a voluntary building energy code, or design standard, for rural homes. The goal is to build on China’s success with codes in urban areas to improve efficiency and comfort in rural homes. The Chinese government recognizes rural buildings represent a major opportunity for improving national building energy efficiency. The challenges of rural China are also greater than those of urban areas in many ways because of the limited local capacity and low income levels. The Chinese government wants to expand on new programs to subsidize energy efficiency improvements in rural homes to build capacity for larger-scale improvement. This article summarizes the trends and status of rural building energy use in China. It then provides an overview of the new rural building design standard, and describes options and issues to move forward with implementation.

  17. Coded aperture solution for improving the performance of traffic enforcement cameras

    NASA Astrophysics Data System (ADS)

    Masoudifar, Mina; Pourreza, Hamid Reza

    2016-10-01

    A coded aperture camera is proposed for automatic license plate recognition (ALPR) systems. It captures images using a noncircular aperture. The aperture pattern is designed for the rapid acquisition of high-resolution images while preserving high spatial frequencies of defocused regions. It is obtained by minimizing an objective function, which computes the expected value of perceptual deblurring error. The imaging conditions and camera sensor specifications are also considered in the proposed function. The designed aperture improves the depth of field (DoF) and subsequently ALPR performance. The captured images can be directly analyzed by the ALPR software up to a specific depth, which is 13 m in our case, though it is 11 m for the circular aperture. Moreover, since the deblurring results of images captured by our aperture yield fewer artifacts than those captured by the circular aperture, images can be first deblurred and then analyzed by the ALPR software. In this way, the DoF and recognition rate can be improved at the same time. Our case study shows that the proposed camera can improve the DoF up to 17 m while it is limited to 11 m in the conventional aperture.
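
    The kind of objective described above can be caricatured in a few lines of Python: rate a candidate aperture pattern by a Wiener-style proxy for expected deblurring error, which penalizes deep nulls in the aperture's frequency response. This is an assumed toy score, not the paper's perceptual objective, and the patterns are hypothetical.

        import numpy as np

        def deblur_error_proxy(aperture, noise_var=1e-4):
            """Wiener-style proxy for expected deblurring error (lower is better)."""
            otf = np.fft.fft2(aperture / aperture.sum())   # normalized response
            return float(np.mean(noise_var / (np.abs(otf) ** 2 + noise_var)))

        open_aperture = np.zeros((16, 16)); open_aperture[6:10, 6:10] = 1.0
        coded = open_aperture.copy(); coded[7, 7] = 0.0    # hypothetical pattern
        print(deblur_error_proxy(open_aperture), deblur_error_proxy(coded))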

  18. Design and implementation of online automatic judging system

    NASA Astrophysics Data System (ADS)

    Liang, Haohui; Chen, Chaojie; Zhong, Xiuyu; Chen, Yuefeng

    2017-06-01

    To address the low efficiency and poor reliability of manual judging in programming training and competitions, an Online Automatic Judging (OAJ) system was designed. The OAJ system, comprising a sandboxed judging side and a Web side, automatically compiles and runs submitted code and generates evaluation scores and corresponding reports. To prevent malicious code from damaging the system, the OAJ system runs submissions in a sandbox. The OAJ system uses thread pools to run tests in parallel, and adopts database optimization mechanisms, such as horizontal table partitioning, to improve system performance and resource utilization. The test results show that the system has high performance, high reliability, high stability, and excellent extensibility.
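
    The judging core can be sketched in Python as follows, under stated assumptions: each submission is compiled and executed with a timeout in a worker thread. The file names are hypothetical, and the sandbox itself (namespaces, resource limits) is elided.

        import subprocess
        from concurrent.futures import ThreadPoolExecutor

        def judge(source_path, test_input, expected, timeout=2.0):
            """Compile a C submission, run it on one test, and grade the output."""
            exe = "./" + source_path.rsplit(".", 1)[0]
            if subprocess.run(["gcc", source_path, "-o", exe]).returncode != 0:
                return "Compile Error"
            try:
                run = subprocess.run([exe], input=test_input, capture_output=True,
                                     text=True, timeout=timeout)
            except subprocess.TimeoutExpired:
                return "Time Limit Exceeded"
            return "Accepted" if run.stdout.strip() == expected else "Wrong Answer"

        submissions = [("sub1.c", "2 3\n", "5"), ("sub2.c", "2 3\n", "5")]
        with ThreadPoolExecutor(max_workers=4) as pool:   # parallel testing
            print(list(pool.map(lambda s: judge(*s), submissions)))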

  19. Computer program optimizes design of nuclear radiation shields

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.

    1971-01-01

    Computer program, OPEX 2, determines minimum weight, volume, or cost for shields. Program incorporates improved coding, simplified data input, spherical geometry, and an expanded output. Method is capable of altering dose-thickness relationship when a shield layer has been removed.

  20. Quiet, Efficient Fans for Spaceflight: An Overview of NASA's Technology Development Plan

    NASA Technical Reports Server (NTRS)

    Koch, L. Danielle

    2010-01-01

    A Technology Development Plan to improve the aerodynamic and acoustic performance of spaceflight fans has been submitted to NASA's Exploration Technology Development Program. The plan describes a research program intended to make broader use of the technology developed at NASA Glenn to increase the efficiency and reduce the noise of aircraft engine fans. The goal is to develop a set of well-characterized government-owned fans nominally suited for spacecraft ventilation and cooling systems. NASA's Exploration Life Support community will identify design point conditions for the fans in this study. Computational Fluid Dynamics codes will be used in the design and analysis process. The fans will be built and used in a series of tests. Data from aerodynamic and acoustic performance tests will be used to validate performance predictions. These performance maps will also be entered into a database to help spaceflight fan system developers make informed design choices. Velocity measurements downstream of fan rotor blades and stator vanes will also be collected and used for code validation. Details of the fan design, analysis, and testing will be publicly reported. With access to fan geometry and test data, the small fan industry can independently evaluate design and analysis methods and work towards improvement.

  1. Information Environments

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia

    2003-01-01

    The objective of GRC CNIS/IE work is to build a plug-n-play infrastructure that provides the Grand Challenge Applications with a suite of tools for coupling codes together, numerical zooming between fidelity of codes and gaining deployment of these simulations onto the Information Power Grid. The GRC CNIS/IE work will streamline and improve this process by providing tighter integration of various tools through the use of object oriented design of component models and data objects and through the use of CORBA (Common Object Request Broker Architecture).

  2. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    NASA Astrophysics Data System (ADS)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against exploitation of the information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. Using a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, increases the efficiency of communication systems based on these codes. Employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of a multi-user synchronous and asynchronous DS-CDMA system is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate improved performance in comparison with conventional spreading codes such as Gold codes, as well as with similar complex chaotic spreading sequences. Like the one-dimensional Bernoulli chaotic sequences, the proposed sequences have negative auto-correlation. Besides, the proposed method makes it possible to construct complex sequences with lower average cross-correlation.
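    For illustration, the sketch below generates a chip sequence from a Bernoulli-type chaotic map with decreasing branches, which yields the negative lag-1 autocorrelation highlighted above; the map, skew parameter, and sequence length are assumptions for the sketch, and the paper's two-dimensional complex construction is not reproduced:

    ```python
    import numpy as np

    # Bernoulli-type map with two decreasing branches; it preserves the
    # uniform density on [0, 1) and anticorrelates successive samples.
    p = 0.3
    x = 0.1234567
    seq = np.empty(100_000)
    for n in range(seq.size):
        x = 1.0 - x / p if x < p else (1.0 - x) / (1.0 - p)
        x = min(max(x, 1e-12), 1.0 - 1e-12)   # guard against the endpoints
        seq[n] = 2.0 * x - 1.0                # centred chip amplitude

    c = seq - seq.mean()
    r1 = (c[:-1] @ c[1:]) / (c @ c)           # normalized lag-1 autocorrelation
    print(f"lag-1 autocorrelation: {r1:+.3f}")  # negative, about -0.6 for p=0.3
    ```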

  3. Optimal Codes for the Burst Erasure Channel

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure protection. As can be seen, the simple interleaved RS codes have substantially lower inefficiency over a wide range of transmission lengths.
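    The arithmetic behind the near-optimality claim can be made concrete: for a depth-D block-interleaved (n, k) MDS code, a contiguous burst of D·(n−k) symbol erasures touches each of the D codewords in at most n−k positions, so every codeword remains decodable, while no code of the same overall length and rate can recover more erasures than its total number of parity symbols. The parameters below are illustrative:

    ```python
    # Burst-erasure capability of a depth-D block-interleaved (n, k) MDS code.
    n, k, D = 255, 223, 16          # e.g. a Reed-Solomon code, interleaved
    N, K = D * n, D * k             # overall transmission length / payload
    guaranteed = D * (n - k)        # correctible burst length, in symbols
    upper = N - K                   # no code can recover more than N-K erasures
    print(f"guaranteed burst: {guaranteed} symbols")
    print(f"efficiency: {guaranteed / upper:.3f}")   # 1.000 for MDS rows
    ```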

  4. A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks

    PubMed Central

    Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang

    2017-01-01

    Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and the complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and jointly decodes packets using encoded packets received from multiple potential nodes in the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time route maintenance protocol to update the route when inefficient relay nodes are detected. Substantial simulations of the underwater environment with Network Simulator 3 (NS-3) show that NCRP significantly improves the network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs. PMID:28786915
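    The core network-coding idea that NCRP builds on can be shown in a few lines: a relay combines packets so that a single coded transmission serves multiple receivers. The toy XOR example below only illustrates that principle and is not the NCRP protocol itself:

    ```python
    import os

    # Relay node broadcasts a XOR b; a receiver that already holds one of
    # the two packets recovers the other from the single coded broadcast.
    a = bytearray(os.urandom(8))                 # packet from node A
    b = bytearray(os.urandom(8))                 # packet from node B
    coded = bytes(x ^ y for x, y in zip(a, b))   # relay's coded packet

    recovered_b = bytes(x ^ y for x, y in zip(coded, a))  # node A holds a
    assert recovered_b == bytes(b)
    print("node A recovered B's packet from the coded broadcast")
    ```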

  5. An improved design method for EPC middleware

    NASA Astrophysics Data System (ADS)

    Lou, Guohuan; Xu, Ran; Yang, Chunming

    2014-04-01

    To address the problems and difficulties that small and medium enterprises currently face when implementing middleware to the EPC (Electronic Product Code) ALE (Application Level Events) specification, an improved design method for EPC middleware is presented, based on an analysis of the principles of EPC middleware. This method leverages the powerful functionality of the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application programming interface, to achieve middleware with general functionality. This structure is simple and easy to implement and maintain. Under this structure, different types of reader-writers can be added and configured conveniently, and the expandability of the system is improved.
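    A minimal sketch of this database-centred middleware pattern follows: reader drivers insert raw tag reads into a table, and the upper application retrieves them with plain SQL instead of an ALE programming interface. SQLite stands in for MySQL here purely to keep the example self-contained, and all names are hypothetical:

    ```python
    import sqlite3, time

    # The database replaces the ALE interface: readers write, apps query.
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE tag_reads (
                    epc TEXT, reader_id TEXT, read_at REAL)""")

    def on_tag_read(epc, reader_id):
        """Called by a reader-writer driver for each observed tag."""
        db.execute("INSERT INTO tag_reads VALUES (?, ?, ?)",
                   (epc, reader_id, time.time()))

    on_tag_read("urn:epc:id:sgtin:0614141.107346.2017", "dock-door-1")

    # Upper application: de-duplicated view of what each reader has seen.
    for row in db.execute("""SELECT epc, reader_id, COUNT(*)
                             FROM tag_reads GROUP BY epc, reader_id"""):
        print(row)
    ```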

  6. 13 CFR 121.1103 - What are the procedures for appealing a NAICS code designation?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... appealing a NAICS code designation? 121.1103 Section 121.1103 Business Credit and Assistance SMALL BUSINESS... Determinations and Naics Code Designations § 121.1103 What are the procedures for appealing a NAICS code... code designation and applicable size standard must be served and filed within 10 calendar days after...

  7. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying wiki includes instructions and examples: http://code.google.com/p/primer-design.
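    PD5 itself is a C++ library, so the snippet below is not its API; it only illustrates, in Python, the kind of primer-analysis building blocks such a library bundles, here GC content and the Wallace-rule melting temperature for short primers:

    ```python
    def gc_content(seq: str) -> float:
        """Fraction of G and C bases in the primer sequence."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    def wallace_tm(seq: str) -> int:
        """Wallace-rule melting temperature 2(A+T) + 4(G+C), short primers."""
        seq = seq.upper()
        at = seq.count("A") + seq.count("T")
        gc = seq.count("G") + seq.count("C")
        return 2 * at + 4 * gc

    primer = "ATGGCTAGCTAGGTACCGT"
    print(f"GC = {gc_content(primer):.0%}, Tm ~ {wallace_tm(primer)} C")
    ```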

  8. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  9. Recent Improvements in Aerodynamic Design Optimization on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Anderson, W. Kyle

    2000-01-01

    Recent improvements in an unstructured-grid method for large-scale aerodynamic design are presented. Previous work had shown such computations to be prohibitively long in a sequential processing environment. Also, robust adjoint solutions and mesh movement procedures were difficult to realize, particularly for viscous flows. To overcome these limiting factors, a set of design codes based on a discrete adjoint method is extended to a multiprocessor environment using a shared memory approach. A nearly linear speedup is demonstrated, and the consistency of the linearizations is shown to remain valid. The full linearization of the residual is used to precondition the adjoint system, and a significantly improved convergence rate is obtained. A new mesh movement algorithm is implemented and several advantages over an existing technique are presented. Several design cases are shown for turbulent flows in two and three dimensions.

  10. Comparison of Standards for Accessible Design between America and China

    NASA Astrophysics Data System (ADS)

    Han, Ying; Tao, Xu Feng

    2018-06-01

    Focusing on accessibility design and environmental construction in China, this paper compares the differences between domestic and foreign codes for accessible design in four respects: the intended audience of accessible design, the content of the facilities, quantitative indexes, and accessible design for vision and hearing. The analysis shows that China lagged in accessibility design and construction in its early stage, mainly because of differences in values and professional education. Finally, three suggestions are put forward for improving the construction of barrier-free environments in China.

  11. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1992-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX-11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode, with a level of flexibility suitable for practical application.

  12. TFaNS-Tone Fan Noise Design/Prediction System: Users' Manual TFaNS Version 1.5

    NASA Technical Reports Server (NTRS)

    Topol, David A.; Huff, Dennis L. (Technical Monitor)

    2003-01-01

    TFaNS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Glenn. The purpose of this system is to predict tone noise emanating from a fan stage, including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. The first version of this design system was developed under a previous NASA contract, and several improvements have since been made to TFaNS. This users' manual shows how to run the new system. TFaNS consists of: the codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files; the CUP3D Fan Noise Coupling Code, which reads these files, solves the coupling problem, and outputs the desired noise predictions; and the AWAKEN CFD/Measured Wake Postprocessor, which reformats CFD wake predictions and/or measured wake data so they can be used by the system. This report provides information on code input and file structure essential for potential users of TFaNS.

  13. Code Lavender: Cultivating Intentional Acts of Kindness in Response to Stressful Work Situations.

    PubMed

    Davidson, Judy E; Graham, Patricia; Montross-Thomas, Lori; Norcross, William; Zerbi, Giovanna

    Providing healthcare can be stressful. Left unchecked, clinicians may experience decreased compassion and increased burnout or secondary traumatic stress. Code Lavender is designed to increase acts of kindness after stressful workplace events occur. The objective was to test the feasibility of providing Code Lavender. The hypotheses were that, after stressful events in the workplace, staff would provide, receive, and recommend Code Lavender to others, and that the provision of Code Lavender would improve Professional Quality of Life Scale (ProQoL) scores, general job satisfaction, and feeling cared for in the workplace. The design was pilot program testing and evaluation. Staff and physicians on four hospital units were informed of the availability of the Code Lavender kit, which includes words of comfort, chocolate, lavender essential oil, and employee health referral information. Feasibility data and ProQoL scores were collected at baseline and at three months. At baseline, 48% (n = 164) reported a stressful event at work in the last three months. Post-intervention, 51% reported experiencing a stressful workplace event, with 32% receiving a Code Lavender kit from their co-workers as a result (n = 83). Of those who received the Code Lavender intervention, 100% found it helpful and 84% would recommend it to others. No significant changes were demonstrated before and after the intervention in ProQoL scores or job satisfaction; however, the emotion of feeling cared for improved. The results warrant continuation and further dissemination of Code Lavender. The investigators have received requests to expand the program, implying positive reception of the intervention. Additional interventions are needed to overcome workplace stressors, and a more intense peer support program is being tested. Copyright © 2017. Published by Elsevier Inc.

  14. Overview of Edge Simulation Laboratory (ESL)

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Dorr, M.; Hittinger, J.; Rognlien, T.; Umansky, M.; Xiong, A.; Xu, X.; Belli, E.; Candy, J.; Snyder, P.; Colella, P.; Martin, D.; Sternberg, T.; van Straalen, B.; Bodi, K.; Krasheninnikov, S.

    2006-10-01

    The ESL is a new collaboration to build a full-f electromagnetic gyrokinetic code for tokamak edge plasmas using continuum methods. Target applications are edge turbulence and transport (neoclassical and anomalous), and edge-localized modes. Initially the project has three major threads: (i) verification and validation of TEMPEST, the project's initial (electrostatic) edge code, which can be run in 4D (neoclassical and transport-timescale applications) or 5D (turbulence); (ii) design of the next-generation code, which will include more complete physics (electromagnetics, fluid equation option, improved collisions) and advanced numerics (fully conservative, high-order discretization, mapped multiblock grids, adaptivity); and (iii) rapid-prototype codes to explore the issues attached to solving fully nonlinear gyrokinetics with steep radial gradients. We present a brief summary of the status of each of these activities.

  15. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    PubMed

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g., two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discrete random arrangement of micro coding units. By combining an optimization algorithm with commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase differences for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation in specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic software designs. The proposed method provides a smart tool to realize various functional devices and systems automatically.
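    The beam-shaping effect of such coding sequences can be previewed with a simple array-factor calculation: each unit cell reflects with phase 0 or π according to its coding bit, and the periodicity of the sequence sets the anomalous reflection angles. This sketch uses illustrative assumptions (frequency, cell pitch, a 1-bit "0011" super-cell sequence), not the paper's optimized designs:

    ```python
    import numpy as np

    c0, f = 3e8, 10e9                 # assumed 10 GHz operating frequency
    lam = c0 / f
    d = lam / 2                       # unit-cell pitch
    codes = np.tile([0, 0, 1, 1], 8)  # 1-bit coding sequence, 32 cells
    phases = np.pi * codes            # bit 0 -> 0 rad, bit 1 -> pi rad

    theta = np.radians(np.linspace(-90, 90, 1801))
    n = np.arange(codes.size)
    af = np.exp(1j * (2 * np.pi / lam * d * np.outer(np.sin(theta), n)
                      + phases)).sum(axis=1)
    peak = np.degrees(theta[np.argmax(np.abs(af))])
    # Period 4d = 2*lam gives symmetric lobes near sin(theta) = +/-0.5.
    print(f"strongest reflected lobe near {peak:+.1f} deg")
    ```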

  16. Spatial correlation-based side information refinement for distributed video coding

    NASA Astrophysics Data System (ADS)

    Taieb, Mohamed Haj; Chouinard, Jean-Yves; Wang, Demin

    2013-12-01

    Distributed video coding (DVC) architecture designs, based on distributed source coding principles, have benefitted from significant progresses lately, notably in terms of achievable rate-distortion performances. However, a significant performance gap still remains when compared to prediction-based video coding schemes such as H.264/AVC. This is mainly due to the non-ideal exploitation of the video sequence temporal correlation properties during the generation of side information (SI). In fact, the decoder side motion estimation provides only an approximation of the true motion. In this paper, a progressive DVC architecture is proposed, which exploits the spatial correlation of the video frames to improve the motion-compensated temporal interpolation (MCTI). Specifically, Wyner-Ziv (WZ) frames are divided into several spatially correlated groups that are then sent progressively to the receiver. SI refinement (SIR) is performed as long as these groups are being decoded, thus providing more accurate SI for the next groups. It is shown that the proposed progressive SIR method leads to significant improvements over the Discover DVC codec as well as other SIR schemes recently introduced in the literature.

  17. MCNP Version 6.2 Release Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werner, Christopher John; Bull, Jeffrey S.; Solomon, C. J.

    Monte Carlo N-Particle or MCNP® is a general-purpose Monte Carlo radiation-transport code designed to track many particle types over broad ranges of energies. This MCNP Version 6.2 follows the MCNP6.1.1 beta version and has been released in order to provide the radiation transport community with the latest feature developments and bug fixes for MCNP. Since the last release of MCNP, major work has been conducted to improve the code base, add features, and provide tools to facilitate ease of use of MCNP version 6.2 as well as the analysis of results. These release notes serve as a general guide for the new/improved physics, source, data, tallies, unstructured mesh, code enhancements and tools. For more detailed information on each of the topics, please refer to the appropriate references or the user manual which can be found at http://mcnp.lanl.gov. This release of MCNP version 6.2 contains 39 new features in addition to 172 bug fixes and code enhancements. There are still some 33 known issues the user should familiarize themselves with (see Appendix).

  18. Reusability of coded data in the primary care electronic medical record: A dynamic cohort study concerning cancer diagnoses.

    PubMed

    Sollie, Annet; Sijmons, Rolf H; Helsper, Charles; Numans, Mattijs E

    2017-03-01

    To assess the quality and reusability of coded cancer diagnoses in routine primary care data, and to identify factors that influence data quality and areas for improvement, a dynamic cohort study was performed in a Dutch network database containing 250,000 anonymized electronic medical records (EMRs) from 52 general practices. Coded data from 2000 to 2011 for the three most common cancer types (breast, colon and prostate cancer) were compared to the Netherlands Cancer Registry. Data quality is expressed in Standard Incidence Ratios (SIRs): the ratio between the number of coded cases observed in the primary care network database and the expected number of cases based on the Netherlands Cancer Registry. Ratios were multiplied by 100% for readability. The overall SIR was 91.5% (95% CI 88.5-94.5) and showed improvement over the years. SIRs differ between cancer types: from 71.5% for colon cancer in males to 103.9% for breast cancer. There are differences in data quality (SIRs 76.2%-99.7%) depending on the EMR system used, with SIRs up to 232.9% for breast cancer. Frequently observed errors in routine healthcare data can be classified as: lack of integrity checks, inaccurate use and/or lack of codes, and lack of EMR system functionality. Re-users of coded routine primary care EMR data should be aware that 30% of cancer cases can be missed, and that up to 130% of the cancer cases found in the EMR data can be false positives. The type of EMR system and the type of cancer influence the quality of the coded diagnosis registry. While data quality can be improved (e.g., through improving system design and by training EMR system users), re-use should be undertaken only by appropriately trained experts. Copyright © 2016. Published by Elsevier B.V.
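    For reference, the SIR used here is simply the observed count divided by the registry-based expected count, multiplied by 100%. A sketch with made-up counts and an approximate Poisson confidence interval (one common choice; not necessarily the paper's exact CI method):

    ```python
    import math

    observed, expected = 915, 1000                 # fabricated counts
    sir = 100.0 * observed / expected              # SIR in percent
    se = 100.0 * math.sqrt(observed) / expected    # Poisson SE, % points
    lo, hi = sir - 1.96 * se, sir + 1.96 * se      # approximate 95% CI
    print(f"SIR = {sir:.1f}% (95% CI {lo:.1f}-{hi:.1f})")
    ```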

  19. Evaluation of Cueing Innovation for Pressure Ulcer Prevention Using Staff Focus Groups.

    PubMed

    Yap, Tracey L; Kennerly, Susan; Corazzini, Kirsten; Porter, Kristie; Toles, Mark; Anderson, Ruth A

    2014-07-25

    The purpose of the manuscript is to describe long-term care (LTC) staff perceptions of a music cueing intervention designed to improve staff integration of pressure ulcer (PrU) prevention guidelines regarding consistent and regular movement of LTC residents a minimum of every two hours. The Diffusion of Innovation (DOI) model guided staff interviews about their perceptions of the intervention's characteristics, outcomes, and sustainability. This was a qualitative, observational study of staff perceptions of the PrU prevention intervention conducted in Midwestern U.S. LTC facilities (N = 45 staff members). One focus group was held in each of eight intervention facilities using a semi-structured interview protocol. Transcripts were analyzed using thematic content analysis, and summaries for each category were compared across groups. The a priori codes (observability, trialability, compatibility, relative advantage and complexity) described the innovation characteristics, and the sixth code, sustainability, was identified in the data. Within each code, two themes emerged as a positive or negative response regarding characteristics of the innovation. Moreover, within the sustainability code, a third theme emerged that was labeled "brainstormed ideas", focusing on strategies for improving the innovation. Cueing LTC staff using music offers a sustainable potential to improve PrU prevention practices, to increase resident movement, which can subsequently lead to a reduction in PrUs.

  20. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 1. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  1. Experimental investigations, modeling, and analyses of high-temperature devices for space applications: Part 2. Final report, June 1996--December 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.; El-Genk, M.S.; Huang, L.

    1999-01-01

    The Institute of Space and Nuclear Power Studies at the University of New Mexico has developed a computer simulation of cylindrical geometry alkali metal thermal-to-electric converter cells using a standard Fortran 77 computer code. The objective and use of this code was to compare the experimental measurements with computer simulations, upgrade the model as appropriate, and conduct investigations of various methods to improve the design and performance of the devices for improved efficiency, durability, and longer operational lifetime. The Institute of Space and Nuclear Power Studies participated in vacuum testing of PX series alkali metal thermal-to-electric converter cells and developed the alkali metal thermal-to-electric converter Performance Evaluation and Analysis Model. This computer model consisted of a sodium pressure loss model, a cell electrochemical and electric model, and a radiation/conduction heat transfer model. The code closely predicted the operation and performance of a wide variety of PX series cells, which led to suggestions for improvements to both lifetime and performance. The code provides valuable insight into the operation of the cell, predicts parameters of components within the cell, and is a useful tool for predicting both the transient and steady state performance of systems of cells.

  2. Universal Design as a Booster for Housing Quality and Architectural Practice.

    PubMed

    Denizou, Karine

    2016-01-01

    Norwegian central government has for the last decade increasingly focused on universal design. Fundamental changes in the Norwegian building code and corresponding regulations in 2010 give an apparently clear framework for the implementation of accessibility and universal design. However, it seems that neither increased awareness of accessibility requirements and universal design, nor compliance with the building code guarantees improvement of housing quality and usability. The Norwegian regulations have gone further in the direction of performance requirements than most other countries. This applies to all types of requirements, including requirements for usability, functionality and accessibility. Hardly any specifications are to be found in the regulations. Ideally, this lack of specifications should give designers the opportunity to develop innovative answers and hence to respond to different contexts and needs. Still, many architects and builders ask for clear specifications, in order to simplify and speed up design processes and make control of solutions easier. Many architects understand guidelines as minimum requirements, and are thus reproducing the identical solutions without considering the context and the needs of the users. They see accessibility as another regulatory pressure and requirements as restrictions rather than positive incentives. However, there are examples of designers who have internalised the regulatory framework and thus are able to create and integrate inclusive design in their daily work. Based on recent research conducted by SINTEF Building and Infrastructure and financed by the Norwegian State Housing Bank, this paper presents examples of practice where dwellings have been developed within a framework of universal design. Focus of the research has been on the approach of the design team and their understanding and use of the regulatory framework in order to create better homes in dialogue with the building authorities. Main objectives are to: - Contribute to better understanding of universal design as a tool and a method to improve housing quality and usability - Investigate the conditions for developing dwellings with innovative and functional solutions in compliance with the building code - Discuss challenges in interpreting the requirements and in taking the needs of various resident groups into account.

  3. Improving the accuracy of operation coding in surgical discharge summaries

    PubMed Central

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  4. Improved Helicopter Rotor Performance Prediction through Loose and Tight CFD/CSD Coupling

    NASA Astrophysics Data System (ADS)

    Ickes, Jacob C.

    Helicopters and other Vertical Take-Off or Landing (VTOL) vehicles exhibit an interesting combination of structural dynamic and aerodynamic phenomena which together drive the rotor performance. The combination of factors involved makes simulating the rotor a challenging and multidisciplinary effort, and one which is still an active area of interest in the industry because of the money and time it could save during design. Modern tools allow the prediction of rotorcraft physics from first principles. Analysis of the rotor system with this level of accuracy provides the understanding necessary to improve its performance. There has historically been a divide between the comprehensive codes, which perform aeroelastic rotor simulations using simplified aerodynamic models, and the very computationally intensive Navier-Stokes Computational Fluid Dynamics (CFD) solvers. As computer resources become more available, efforts have been made to replace the simplified aerodynamics of the comprehensive codes with the more accurate results from a CFD code. The objective of this work is to perform aeroelastic rotorcraft analysis using first-principles simulations for both fluids and structural predictions using tools available at the University of Toledo. Two separate codes are coupled together in both loose coupling (data exchange on a periodic interval) and tight coupling (data exchange each time step) schemes. To allow the coupling to be carried out in a reliable and efficient way, a Fluid-Structure Interaction code was developed which automatically performs the primary functions of the loose and tight coupling procedures. Flow phenomena such as transonics, dynamic stall, locally reversed flow on a blade, and Blade-Vortex Interaction (BVI) were simulated in this work. Results of the analysis show aerodynamic load improvement due to the inclusion of the CFD-based airloads in the structural dynamics analysis of the Computational Structural Dynamics (CSD) code. Improvements came in the form of improved peak/trough magnitude prediction, better phase prediction of these locations, and a predicted signal with a frequency content more like the flight test data than that of the CSD code acting alone. Additionally, a tight coupling analysis was performed as a demonstration of the capability and unique aspects of such an analysis. This work shows that away from the center of the flight envelope, the aerodynamic modeling of the CSD code can be replaced with a more accurate set of predictions from a CFD code with an improvement in the aerodynamic results. The better predictions come at substantially increased computational costs, between 1,000 and 10,000 processor-hours.

  5. GPU Optimizations for a Production Molecular Docking Code*

    PubMed Central

    Landaverde, Raphael; Herbordt, Martin C.

    2015-01-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users. PMID:26594667

  6. GPU Optimizations for a Production Molecular Docking Code.

    PubMed

    Landaverde, Raphael; Herbordt, Martin C

    2014-09-01

    Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER) which achieved a roughly 5× speed-up over a contemporaneous 4 core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server which has over 4000 active users.
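    Since the run time is dominated by 3D FFT library calls, the computational heart of FFT-based docking codes of this kind is a grid correlation like the one sketched below, which scores all translations of one grid against another at once. The random grids are stand-ins, and this is not PIPER's actual scoring function:

    ```python
    import numpy as np

    # FFT-based circular cross-correlation of two 3D grids: the score of
    # every translational offset is obtained in O(N^3 log N).
    N = 32
    receptor = np.random.rand(N, N, N)
    ligand = np.random.rand(N, N, N)

    R = np.fft.rfftn(receptor)
    L = np.fft.rfftn(ligand)
    scores = np.fft.irfftn(R * np.conj(L), s=receptor.shape)

    best = np.unravel_index(np.argmax(scores), scores.shape)
    print(f"best translational offset: {best}")
    ```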

  7. Reliability of SNOMED-CT Coding by Three Physicians using Two Terminology Browsers

    PubMed Central

    Chiang, Michael F.; Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Cimino, James J.; Starren, Justin

    2006-01-01

    SNOMED-CT has been promoted as a reference terminology for electronic health record (EHR) systems. Many important EHR functions are based on the assumption that medical concepts will be coded consistently by different users. This study is designed to measure agreement among three physicians using two SNOMED-CT terminology browsers to encode 242 concepts from five ophthalmology case presentations in a publicly-available clinical journal. Inter-coder reliability, based on exact coding match by each physician, was 44% using one browser and 53% using the other. Intra-coder reliability testing revealed that a different SNOMED-CT code was obtained up to 55% of the time when the two browsers were used by one user to encode the same concept. These results suggest that the reliability of SNOMED-CT coding is imperfect, and may be a function of browsing methodology. A combination of physician training, terminology refinement, and browser improvement may help increase the reproducibility of SNOMED-CT coding. PMID:17238317
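    The exact-match agreement statistic reported above is straightforward to compute; the sketch below uses fabricated code assignments purely to show the calculation:

    ```python
    # Fraction of concepts that all three physicians coded identically
    # (the code strings below are made-up stand-ins, not real assignments).
    coder_a = ["44054006", "73211009", "44054006", "38341003"]
    coder_b = ["44054006", "73211009", "22298006", "38341003"]
    coder_c = ["44054006", "90708001", "44054006", "38341003"]

    matches = sum(a == b == c for a, b, c in zip(coder_a, coder_b, coder_c))
    print(f"exact-match agreement: {matches / len(coder_a):.0%}")
    ```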

  8. The development of a thermal hydraulic feedback mechanism with a quasi-fixed point iteration scheme for control rod position modeling for the TRIGSIMS-TH application

    NASA Astrophysics Data System (ADS)

    Karriem, Veronica V.

    Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool which incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. It manages the data and fuel isotopic content and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability of the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures in the cross-section modeling for the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying the control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performs its own function, and outputs a set of data. TRIGSIMS-TH provides effective data manipulation and transfer between the different codes. With the implementation of the feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways and there are no "off-the-shelf" codes which can model this design in its entirety. In particular, the PSBR has an open core design, which is cooled by natural convection. Combining several codes into a unique system brings many challenges, and it also requires substantial knowledge of both the operation and the core design of the PSBR. This reactor has been in operation for decades, and there is a fair amount of prior study and development in both PSBR thermal hydraulics and neutronics. Measured data are also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide for assessing the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to assure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code. This was needed because the previous data had not been generated with thermal hydraulic feedback, and the all-rods-out (ARO) position had been used as the critical rod position. The B4C data were re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model is given flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed. This allows the new code to be used as a modeling tool for the design and analysis of future core loadings.

  9. Application of advanced computational codes in the design of an experiment for a supersonic throughflow fan rotor

    NASA Technical Reports Server (NTRS)

    Wood, Jerry R.; Schmidt, James F.; Steinke, Ronald J.; Chima, Rodrick V.; Kunik, William G.

    1987-01-01

    Increased emphasis on sustained supersonic or hypersonic cruise has revived interest in the supersonic throughflow fan as a possible component in advanced propulsion systems. Use of a fan that can operate with a supersonic inlet axial Mach number is attractive from the standpoint of reducing the inlet losses incurred in diffusing the flow from a supersonic flight Mach number to a subsonic one at the fan face. The design of the experiment using advanced computational codes to calculate the components required is described. The rotor was designed using existing turbomachinery design and analysis codes modified to handle fully supersonic axial flow through the rotor. A two-dimensional axisymmetric throughflow design code plus a blade element code were used to generate fan rotor velocity diagrams and blade shapes. A quasi-three-dimensional, thin shear layer Navier-Stokes code was used to assess the performance of the fan rotor blade shapes. The final design was stacked and checked for three-dimensional effects using a three-dimensional Euler code interactively coupled with a two-dimensional boundary layer code. The nozzle design in the expansion region was analyzed with a three-dimensional parabolized viscous code which corroborated the results from the Euler code. A translating supersonic diffuser was designed using these same codes.

  10. National Underground Mines Inventory

    DTIC Science & Technology

    1983-10-01

    system is well designed to minimize water accumulation on the drift levels. In many areas, sufficient water has accumulated to make the use of boots a...four characters designate field office. 17-18 State Code PIC 99 FIPS code for state in which mine is located. 19-21 County Code PIC 999 FIPS code for... Designates a general product class based on SIC code. 28-29 Mine Type PIC 99 Metal/Nonmetal mine type code. Based on subunit operations code and canvass code

  11. Support the Design of Improved IUE NEWSIPS High Dispersion Extraction Algorithms: Improved IUE High Dispersion Extraction Algorithms

    NASA Technical Reports Server (NTRS)

    Lawton, Pat

    2004-01-01

    The objective of this work was to support the design of improved IUE NEWSIPS high dispersion extraction algorithms. The purpose of this work was to evaluate the use of the Linearized Image (LIHI) file versus the Re-Sampled Image (SIHI) file, evaluate various extraction methods, and design algorithms for the evaluation of IUE high-dispersion spectra. It was concluded that the use of the Re-Sampled Image (SIHI) file was acceptable. Since the Gaussian profile worked well for the core and the Lorentzian profile worked well for the wings, the Voigt profile was chosen for use in the extraction algorithm. It was found that the gamma and sigma parameters varied significantly across the detector, so gamma and sigma masks for the SWP detector were developed. Extraction code was written.
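    For reference, the Voigt profile combines a Gaussian core with Lorentzian wings and is commonly evaluated through the Faddeeva function; the sigma and gamma values below are illustrative, not the fitted SWP mask values:

    ```python
    import numpy as np
    from scipy.special import wofz

    def voigt(x, sigma, gamma):
        """Voigt profile: convolution of Gaussian (sigma) and Lorentzian
        (gamma), via the Faddeeva function wofz."""
        z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
        return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

    x = np.linspace(-5, 5, 11)
    print(np.round(voigt(x, sigma=1.0, gamma=0.5), 4))
    ```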

  12. A general panel sizing computer code and its application to composite structural panels

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.

    1978-01-01

    A computer code for obtaining the dimensions of optimum (least mass) stiffened composite structural panels is described. The procedure, which is based on nonlinear mathematical programming and a rigorous buckling analysis, is applicable to general cross sections under general loading conditions causing buckling. A simplified method of accounting for bow-type imperfections is also included. Design studies in the form of structural efficiency charts for axial compression loading are made with the code for blade and hat stiffened panels. The effects on panel mass of imperfections, material strength limitations, and panel stiffness requirements are also examined. Comparisons with previously published experimental data show that accounting for imperfections improves correlation between theory and experiment.
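    A toy version of the code's core idea, nonlinear programming of panel dimensions under a buckling constraint, is sketched below for a bare simply supported plate with a single thickness variable; all material data and loading are illustrative assumptions, whereas the real code handles general stiffened cross sections and load conditions:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    E, nu, rho = 70e9, 0.33, 2700.0      # assumed aluminium properties
    b, k = 0.5, 4.0                      # panel width [m], buckling coeff.
    N_applied = 2.0e5                    # axial load intensity [N/m]

    def mass_per_area(x):                # objective: x[0] = thickness [m]
        return rho * x[0]

    def buckling_margin(x):              # constraint: N_cr - N_applied >= 0
        t = x[0]
        D = E * t**3 / (12 * (1 - nu**2))        # plate bending stiffness
        return k * np.pi**2 * D / b**2 - N_applied

    res = minimize(mass_per_area, x0=[5e-3],
                   constraints=[{"type": "ineq", "fun": buckling_margin}],
                   bounds=[(1e-4, 0.05)])
    print(f"optimum thickness: {res.x[0] * 1e3:.2f} mm")
    ```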

  13. Contracting to improve your revenue cycle performance.

    PubMed

    Welter, Terri L; Semko, George A; Miller, Tony; Lauer, Roberta

    2007-09-01

    The following key drivers of commercial contract variability can have a material effect on your hospital's revenue cycle: claim form variance, benefit design, contract complexity, coding variance, medical necessity, precertification/authorization, claim adjudication/appeal requirements, additional documentation requirements, timeliness of payment, and third-party payer activity.

  14. Dissemination and support of ARGUS for accelerator applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  15. NASA F-16XL supersonic laminar flow control program overview

    NASA Technical Reports Server (NTRS)

    Fischer, Michael C.

    1992-01-01

    The viewgraphs and discussion of the NASA supersonic laminar flow control program are provided. Successful application of laminar flow control to a High Speed Civil Transport (HSCT) offers significant benefits in reductions of take-off gross weight, mission fuel burn, cruise drag, structural temperatures, engine size, emissions, and sonic boom. The ultimate economic success of the proposed HSCT may depend on the successful adaptation of laminar flow control, which offers the single most significant potential improvement in lift-to-drag ratio (L/D) of all the aerodynamic technologies under consideration. The F-16XL Supersonic Laminar Flow Control (SLFC) Experiment was conceived based on the encouraging results of in-house and NASA-supported industry studies to determine whether laminar flow control is feasible for the HSCT. The primary objective is to achieve extensive laminar flow (50-60 percent chord) on a highly swept supersonic wing. Data obtained from the flight test will be used to validate existing Euler and Navier-Stokes aerodynamic codes and transition-prediction boundary layer stability codes. These validated codes and the developed design methodology will be delivered to industry for use in designing supersonic laminar flow control wings. Results from this experiment will establish preliminary suction system design criteria, enabling industry to better size the suction system and develop improved estimates of system weight, fuel volume loss due to wing ducting, turbocompressor power requirements, etc., so that benefits and penalties can be more accurately assessed.

  16. Seismic Safety Of Simple Masonry Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled; it is assumed that fulfilling them ensures a suitable seismic behaviour of the buildings and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Obviously, a wide percentage of the buildings deemed simple by the codes should satisfy the numerical safety verification, so that no confusion or uncertainty arises for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings having different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications of the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the results can contribute to improving the seismic code requirements.

  17. Design of Phoneme MIDI Codes Using the MIDI Encoding Tool “Auto-F” and Realizing Voice Synthesizing Functions Based on Musical Sounds

    NASA Astrophysics Data System (ADS)

    Modegi, Toshio

    Using our previously developed audio-to-MIDI code converter tool “Auto-F”, we can create MIDI data from given vocal acoustic signals, enabling playback of voice-like signals with a standard MIDI synthesizer. Applying this tool, we are constructing a MIDI database consisting of simple harmonic-structured MIDI codes converted from a set of 71 recorded Japanese male and female syllable signals. We are also developing a novel voice synthesizing system based on harmonically synthesizing musical sounds, which can generate MIDI data and play back voice signals with a MIDI synthesizer from given Japanese plain (kana) texts by referring to the syllable MIDI code database. In this paper, we propose an improved MIDI converter tool, which can produce temporally higher-resolution MIDI codes. We then propose an algorithm for separating a set of 20 consonant and vowel phoneme MIDI codes from the 71 converted syllable MIDI codes in order to construct a voice synthesizing system. Finally, we present evaluation results comparing the voice synthesizing quality of these separated phoneme MIDI codes and their original syllable MIDI codes using our 4-syllable word listening tests.

  18. Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe

    Treesearch

    James Wacker; James (Scott) Groenier

    2010-01-01

    The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. For an international perspective on LRFD-based bridge codes, a comparative analysis is presented: the study addressed the national codes of the United States, Canada, and...

  19. A novel design of optical CDMA system based on TCM and FFH

    NASA Astrophysics Data System (ADS)

    Fang, Jun-Bin; Xu, Zhi-Hai; Huang, Hong-bin; Zheng, Liming; Chen, Shun-er; Liu, Wei-ping

    2005-02-01

    For application in a Passive Optical Network (PON), a novel OCDMA system design is proposed in this paper. The scheme includes two key components: a new kind of OCDMA encoder/decoder system based on TCM and FFH, and an improved Optical Line Terminal (OLT) receiving system whose anti-interference performance is improved by the use of a Long Period Fiber Grating (LPFG). In the encoder/decoder system, a Trellis Coded Modulation (TCM) encoder is applied in front of the FFH modulator. The original signal is first encoded by the TCM encoder, and the redundant code output by the TCM encoder is then mapped into one of the FFH modulation signal subsets for transmission. On the receiver (decoder) side, the transmitted signal is demodulated through FFH and decoded by a trellis decoder. Because TCM can acquire high coding gain without increasing the transmission bandwidth or reducing the transmission speed, it is utilized to improve bit error performance and reduce multi-user interference. In the OLT receiving system, an EDFA and the LPFG are placed in front of the decoder to obtain excellent gain flatness over a large bandwidth, and an Optical Hard Limiter (OHL) is also deployed to improve detection performance, so that the anti-interference performance of the receiving system can be greatly enhanced. Simulation software is used to evaluate the system performance for further analysis and verification. The work in this paper provides a valuable reference for further research.

  20. Patient safety principles in family medicine residency accreditation standards and curriculum objectives

    PubMed Central

    Kassam, Aliya; Sharma, Nishan; Harvie, Margot; O’Beirne, Maeve; Topps, Maureen

    2016-01-01

    Abstract Objective To conduct a thematic analysis of the College of Family Physicians of Canada’s (CFPC’s) Red Book accreditation standards and the Triple C Competency-based Curriculum objectives with respect to patient safety principles. Design Thematic content analysis of the CFPC’s Red Book accreditation standards and the Triple C curriculum. Setting Canada. Main outcome measures Coding frequency of the patient safety principles (ie, patient engagement; respectful, transparent relationships; complex systems; a just and trusting culture; responsibility and accountability for actions; and continuous learning and improvement) found in the analyzed CFPC documents. Results Within the analyzed CFPC documents, the most commonly found patient safety principle was patient engagement (n = 51 coding references); the least commonly found patient safety principles were a just and trusting culture (n = 5 coding references) and complex systems (n = 5 coding references). Other patient safety principles that were uncommon included responsibility and accountability for actions (n = 7 coding references) and continuous learning and improvement (n = 12 coding references). Conclusion Explicit inclusion of patient safety content such as the use of patient safety principles is needed for residency training programs across Canada to ensure the full spectrum of care is addressed, from community-based care to acute hospital-based care. This will ensure a patient safety culture can be cultivated from residency and sustained into primary care practice. PMID:27965349

  1. Aeroheating Design Issues for Reusable Launch Vehicles: A Perspective

    NASA Technical Reports Server (NTRS)

    Zoby, E. Vincent; Thompson, Richard A.; Wurster, Kathryn E.

    2004-01-01

    An overview of basic aeroheating design issues for Reusable Launch Vehicles (RLV), which addresses the application of hypersonic ground-based testing, and computational fluid dynamic (CFD) and engineering codes, is presented. Challenges inherent to the prediction of aeroheating environments required for the successful design of the RLV Thermal Protection System (TPS) are discussed in conjunction with the importance of employing appropriate experimental/computational tools. The impact of the information garnered by using these tools in the resulting analyses, ultimately enhancing the RLV TPS design is illustrated. A wide range of topics is presented in this overview; e.g. the impact of flow physics issues such as boundary-layer transition, including effects of distributed and discrete roughness, shock-shock interactions, and flow separation/reattachment. Also, the benefit of integrating experimental and computational studies to gain an improved understanding of flow phenomena is illustrated. From computational studies, the effect of low-density conditions and of uncertainties in material surface properties on the computed heating rates are highlighted as well as the significant role of CFD in improving the Outer Mold Line (OML) definition to reduce aeroheating while maintaining aerodynamic performance. Appropriate selection of the TPS design trajectories and trajectory shaping to mitigate aeroheating levels and loads are discussed. Lastly, an illustration of an aeroheating design process is presented whereby data from hypersonic wind-tunnel tests are integrated with predictions from CFD codes and engineering methods to provide heating environments along an entry trajectory as required for TPS design.

  3. Improved lossless intra coding for H.264/MPEG-4 AVC.

    PubMed

    Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J

    2006-09-01

    A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
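
    To make the idea concrete, the sketch below illustrates sample-by-sample DPCM with a purely horizontal left-neighbor predictor, which is only one of the directional modes the standard supports; the function names and the NumPy setting are illustrative, not the reference implementation.

    ```python
    import numpy as np

    def samplewise_dpcm_residual(block: np.ndarray) -> np.ndarray:
        """Horizontal sample-by-sample DPCM: each sample is predicted from
        its immediate left neighbor (illustrative sketch only; the actual
        H.264/AVC method also has a vertical mode and keeps the block-based
        entropy coding for the residual)."""
        residual = block.astype(np.int16).copy()
        residual[:, 1:] -= block[:, :-1].astype(np.int16)  # e = s(x,y) - s(x-1,y)
        return residual

    def samplewise_dpcm_reconstruct(residual: np.ndarray) -> np.ndarray:
        """Invert the predictor by cumulative summation along each row."""
        return np.cumsum(residual, axis=1).astype(np.uint8)

    block = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
    res = samplewise_dpcm_residual(block)
    assert np.array_equal(samplewise_dpcm_reconstruct(res), block)  # lossless
    ```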

  4. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  5. A Degree Distribution Optimization Algorithm for Image Transmission

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Yang, Junjie

    2016-09-01

    Luby Transform (LT) codes are the first practical implementation of digital fountain codes. The coding behavior of an LT code is mainly decided by the degree distribution, which determines the relationship between source data and codewords. Two degree distributions were suggested by Luby. They work well in typical situations but not optimally in the case of finite encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT codes. First, a selection scheme of sparse degrees for LT codes is introduced; the probability distribution is then optimized according to the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder. The proposed algorithm is therefore designed for the image transmission setting. Moreover, optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising: compared with an LT code using the robust soliton distribution, the proposed algorithm noticeably improves the final quality of the recovered images at the same overhead.
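
    As background for the distributions being optimized, the sketch below implements Luby's standard robust soliton distribution, the baseline the paper compares against; the parameter choices c = 0.1 and delta = 0.5 are arbitrary illustrative values.

    ```python
    import numpy as np

    def robust_soliton(k: int, c: float = 0.1, delta: float = 0.5) -> np.ndarray:
        """Luby's robust soliton degree distribution for k input symbols.
        Returns P(degree = d) for d = 1..k."""
        rho = np.zeros(k + 1)                 # ideal soliton component
        rho[1] = 1.0 / k
        d = np.arange(2, k + 1)
        rho[2:] = 1.0 / (d * (d - 1))
        R = c * np.log(k / delta) * np.sqrt(k)   # expected ripple size
        tau = np.zeros(k + 1)                 # spike/tail component
        pivot = int(round(k / R))
        for i in range(1, min(pivot, k + 1)):
            tau[i] = R / (i * k)
        if 1 <= pivot <= k:
            tau[pivot] = R * np.log(R / delta) / k
        mu = rho + tau
        return (mu / mu.sum())[1:]            # drop unused degree 0

    # Sample encoding degrees for a k = 1000 LT encoder:
    p = robust_soliton(1000)
    rng = np.random.default_rng(0)
    print(rng.choice(np.arange(1, 1001), size=5, p=p))
    ```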

  6. Development of a Federal Standard for a Video Codec Operating at P X 64 Kbps

    DTIC Science & Technology

    1991-02-01

    defined. Which blocks will be encoded, with what type of code, and with what accuracy is under control of the designer. While there is less freedom for the...decoder, post processing, such as filtering or interpolation, can be used to improve the appearance of the image and is under control of the designer...the only one, or, in the case that more video sources are to be distinguished, it is that designated "video #1". VWA2 Equivalent to VIA, but designating

  7. A CFD-based aerodynamic design procedure for hypersonic wind-tunnel nozzles

    NASA Technical Reports Server (NTRS)

    Korte, John J.

    1993-01-01

    A new procedure which unifies the best of current classical design practices, computational fluid dynamics (CFD), and optimization procedures is demonstrated for designing the aerodynamic lines of hypersonic wind-tunnel nozzles. The new procedure can be used to design hypersonic wind tunnel nozzles with thick boundary layers where the classical design procedure has been shown to break down. An efficient CFD code, which solves the parabolized Navier-Stokes (PNS) equations using an explicit upwind algorithm, is coupled to a least-squares (LS) optimization procedure. A LS problem is formulated to minimize the difference between the computed flow field and the objective function, consisting of the centerline Mach number distribution and the exit Mach number and flow angle profiles. The aerodynamic lines of the nozzle are defined using a cubic spline, the slopes of which are optimized with the design procedure. The advantages of the new procedure are that it allows full use of powerful CFD codes in the design process, solves an optimization problem to determine the new contour, can be used to design new nozzles or improve sections of existing nozzles, and automatically compensates the nozzle contour for viscous effects as part of the unified design procedure. The new procedure is demonstrated by designing two Mach 15, a Mach 12, and a Mach 18 helium nozzles. The flexibility of the procedure is demonstrated by designing the two Mach 15 nozzles using different constraints, the first nozzle for a fixed length and exit diameter and the second nozzle for a fixed length and throat diameter. The computed flow field for the Mach 15 least squares parabolized Navier-Stokes (LS/PNS) designed nozzle is compared with the classically designed nozzle and demonstrates a significant improvement in the flow expansion process and uniform core region.
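
    A minimal sketch of the least-squares formulation is given below, with the PNS flow solver replaced by a crude algebraic surrogate (hypothetical; a real design iteration would call the CFD code for each residual evaluation). The spline-slope design variables and the target centerline Mach distribution follow the structure described above.

    ```python
    import numpy as np
    from scipy.interpolate import CubicHermiteSpline
    from scipy.optimize import least_squares

    # Spline-slope design variables for the nozzle contour, as in the paper;
    # knot radii and the target distribution are invented for the example.
    x_knots = np.linspace(0.0, 1.0, 6)
    r_knots = np.linspace(0.1, 0.5, 6)            # fixed knot radii (illustrative)
    x_eval = np.linspace(0.0, 1.0, 50)
    target_mach = 1.0 + 14.0 * x_eval**0.7        # desired centerline distribution

    def surrogate_mach(slopes):
        """Crude stand-in for the PNS solver: Mach responds to area ratio."""
        contour = CubicHermiteSpline(x_knots, r_knots, slopes)
        area_ratio = (contour(x_eval) / r_knots[0]) ** 2
        return 1.0 + 4.0 * np.sqrt(np.maximum(area_ratio - 1.0, 0.0))

    def residuals(slopes):
        """Least-squares mismatch between computed and target Mach numbers."""
        return surrogate_mach(slopes) - target_mach

    fit = least_squares(residuals, x0=np.full(6, 0.4))
    print("optimized spline slopes:", fit.x.round(3))
    ```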

  8. Efficient coding and detection of ultra-long IDs for visible light positioning systems.

    PubMed

    Zhang, Hualong; Yang, Chuanchuan

    2018-05-14

    Visible light positioning (VLP) is a promising technique to complement Global Navigation Satellite Systems (GNSS) such as the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS), featuring low cost and high accuracy. The situation becomes even more crucial for indoor environments, where satellite signals are weak or even unavailable. For large-scale application of VLP, there would be a considerable number of light emitting diode (LED) IDs, which creates a demand for long LED ID detection. In particular, to provision indoor localization globally, a convenient way is to program a unique ID into each LED during manufacture. This poses a big challenge for image sensors, such as the CMOS camera in everybody's hands, since the long ID covers the span of multiple frames. In this paper, we investigate the detection of ultra-long IDs using rolling shutter cameras. By analyzing the pattern of data loss in each frame, we propose a novel coding technique to improve the efficiency of LED ID detection. We study the performance of the Reed-Solomon (RS) code in this system and design a new coding method that considers the trade-off between performance and decoding complexity. The coding technique decreases the number of frames needed in data processing, significantly reduces the detection time, and improves the accuracy of detection. Numerical and experimental results show that the detected LED ID can be much longer with the coding technique. Moreover, our proposed coding method is shown to achieve performance close to that of the RS code while the decoding complexity is much lower.
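
    The paper's RS-based design is not reproduced here, but the sketch below shows the underlying idea of coding an ID across camera frames so that an erased frame can be recovered. It uses a single XOR parity frame, which is a much weaker stand-in for Reed-Solomon; the chunk size and ID value are invented.

    ```python
    import numpy as np

    def encode_id_frames(led_id: bytes, chunk: int = 4) -> list:
        """Split a long LED ID into per-frame chunks and append one XOR
        parity frame so that any single lost frame can be rebuilt (a far
        weaker stand-in for the Reed-Solomon coding studied in the paper)."""
        pad = (-len(led_id)) % chunk
        data = np.frombuffer(led_id + b"\x00" * pad, dtype=np.uint8).reshape(-1, chunk)
        parity = np.bitwise_xor.reduce(data, axis=0)
        return [row.tobytes() for row in data] + [parity.tobytes()]

    def recover_lost_frame(frames: list, lost: int) -> bytes:
        """Rebuild one erased frame by XOR-ing all surviving frames."""
        survivors = np.array([np.frombuffer(f, dtype=np.uint8)
                              for i, f in enumerate(frames) if i != lost])
        return np.bitwise_xor.reduce(survivors, axis=0).tobytes()

    frames = encode_id_frames(b"LED-0001-GLOBAL-UID")   # invented ultra-long ID
    assert recover_lost_frame(frames, lost=2) == frames[2]
    print(len(frames), "frames per ID cycle (incl. parity)")
    ```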

  9. The Relationship Between Financial Incentives and Quality of Diabetes Care in Ontario, Canada

    PubMed Central

    Kiran, Tara; Victor, J. Charles; Kopp, Alexander; Shah, Baiju R.; Glazier, Richard H.

    2012-01-01

    OBJECTIVE We assessed the impact of a diabetes incentive code introduced for primary care physicians in Ontario, Canada, in 2002 on quality of diabetes care at the population and patient level. RESEARCH DESIGN AND METHODS We analyzed administrative data for 757,928 Ontarians with diabetes to examine the use of the code and receipt of three evidence-based monitoring tests from 2006 to 2008. We assessed testing rates over time and before and after billing of the incentive code. RESULTS One-quarter of Ontarians with diabetes had an incentive code billed by their physician. The proportion receiving the optimal number of all three monitoring tests (HbA1c, cholesterol, and eye tests) rose gradually from 16% in 2000 to 27% in 2008. Individuals who were younger, lived in rural areas, were not enrolled in a primary care model, or had a mental illness were less likely to receive all three recommended tests. Patients with higher numbers of incentive code billings in 2006–2008 were more likely to receive recommended testing but also were more likely to have received the highest level of recommended testing prior to introduction of the incentive code. Following the same patients over time, improvement in recommended testing was no greater after billing of the first incentive code than before. CONCLUSIONS The diabetes incentive code led to minimal improvement in quality of diabetes care at the population and patient level. Our findings suggest that physicians who provide the highest quality care prior to incentives may be those most likely to claim incentive payments. PMID:22456866

  10. Improved Convergence Rate of Multi-Group Scattering Moment Tallies for Monte Carlo Neutron Transport Codes

    NASA Astrophysics Data System (ADS)

    Nelson, Adam

    Multi-group scattering moment matrices are critical to the solution of the multi-group form of the neutron transport equation, as they are responsible for describing the change in direction and energy of neutrons. These matrices, however, are difficult to correctly calculate from the measured nuclear data with both deterministic and stochastic methods. Calculating these parameters when using deterministic methods requires a set of assumptions which do not hold true in all conditions. These quantities can be calculated accurately with stochastic methods; however, doing so is computationally expensive due to the poor efficiency of tallying scattering moment matrices. This work presents an improved method of obtaining multi-group scattering moment matrices from a Monte Carlo neutron transport code. This improved method of tallying the scattering moment matrices is based on recognizing that all of the outgoing particle information is known a priori and can be taken advantage of to increase the tallying efficiency (therefore reducing the uncertainty) of the stochastically integrated tallies. In this scheme, the complete outgoing probability distribution is tallied, supplying every element of the scattering moment matrices with its share of data. In addition to reducing the uncertainty, this method allows for the use of a track-length estimation process, potentially offering even further improvement to the tallying efficiency. Unfortunately, to produce the needed distributions, the probability functions themselves must undergo an integration over the outgoing energy and scattering angle dimensions. This integration is too costly to perform during the Monte Carlo simulation itself and therefore must be performed in advance by way of a pre-processing code. The new method increases the information obtained from tally events and therefore has a significantly higher efficiency than the currently used techniques. The improved method has been implemented in a code system containing a new pre-processor code, NDPP, and a Monte Carlo neutron transport code, OpenMC. This method is then tested in a pin cell problem and a larger problem designed to accentuate the importance of scattering moment matrices. These tests show that accuracy was retained while the figure-of-merit for generating scattering moment matrices and fission energy spectra was significantly improved.
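
    The sketch below contrasts the two tally styles for Legendre scattering moments using a hypothetical outgoing angular distribution f(mu) = 0.5 + 0.3 mu: an analog tally scores P_l at one sampled angle per event, while the improved expected-value style integrates the full known distribution, removing the variance in the angle dimension. This is a toy illustration, not the NDPP/OpenMC implementation.

    ```python
    import numpy as np
    from numpy.polynomial.legendre import Legendre

    rng = np.random.default_rng(7)
    L_MAX = 3
    pdf = lambda mu: 0.5 + 0.3 * mu        # hypothetical outgoing pdf on [-1, 1]

    def sample_mu():
        """Rejection-sample an outgoing cosine from pdf (bound: pdf <= 0.8)."""
        while True:
            mu = rng.uniform(-1.0, 1.0)
            if rng.uniform(0.0, 0.8) < pdf(mu):
                return mu

    # Conventional (analog) tally: score P_l at ONE sampled angle per event.
    def score_history():
        mu = sample_mu()
        return [Legendre.basis(l)(mu) for l in range(L_MAX + 1)]

    analog = np.array([score_history() for _ in range(20000)]).mean(axis=0)

    # Improved tally: score the EXPECTED value of P_l under the full known
    # outgoing distribution, eliminating the variance from sampling mu.
    mu_grid = np.linspace(-1.0, 1.0, 2001)
    dx = mu_grid[1] - mu_grid[0]
    expected = np.array([np.sum(Legendre.basis(l)(mu_grid) * pdf(mu_grid)) * dx
                         for l in range(L_MAX + 1)])

    print("analog tally:   ", analog.round(3))    # noisy: ~[1, 0.2, 0, 0]
    print("expected-value: ", expected.round(3))  # deterministic: [1, 0.2, 0, 0]
    ```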

  11. Finite element based electric motor design optimization

    NASA Technical Reports Server (NTRS)

    Campbell, C. Warren

    1993-01-01

    The purpose of this effort was to develop a finite element code for the analysis and design of permanent magnet electric motors. These motors would drive electromechanical actuators in advanced rocket engines. The actuators would control fuel valves and thrust vector control systems. Refurbishing the hydraulic systems of the Space Shuttle after each flight is costly and time consuming. Electromechanical actuators could replace hydraulics, improve system reliability, and reduce down time.

  12. Resolving Phase Ambiguities In OQPSK

    NASA Technical Reports Server (NTRS)

    Nguyen, Tien M.

    1991-01-01

    An improved design for the modulator and demodulator in an offset quadrature phase-shift keying (OQPSK) communication system enables the receiver to resolve the ambiguity in the estimated phase of the received signal. Features include unique-code-word modulation and detection and a digital implementation of the Costas loop in the carrier-recovery subsystem. The design enhances the performance of the carrier-recovery subsystem, reduces the complexity of the receiver by removing redundant circuits from the previous design, and eliminates the dependence of timing in the receiver upon the parallel-to-serial-conversion clock.
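
    The sketch below illustrates unique-word ambiguity resolution in the simpler unoffset QPSK setting (the offset case adds a half-symbol stagger that is omitted here): the receiver correlates the received preamble against the known word under each candidate 90-degree rotation. All signal parameters are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    QPSK = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))   # symbol set

    unique_word = QPSK[rng.integers(0, 4, 32)]   # known 32-symbol preamble
    true_k = 3                                   # unknown multiple of 90 degrees
    rx = unique_word * np.exp(1j * np.pi / 2 * true_k)           # ambiguous lock
    rx = rx + 0.1 * (rng.standard_normal(32) + 1j * rng.standard_normal(32))

    # Try all four candidate rotations; the correct one maximizes the real
    # correlation with the known unique word.
    scores = [np.vdot(unique_word, rx * np.exp(-1j * np.pi / 2 * k)).real
              for k in range(4)]
    k_hat = int(np.argmax(scores))
    corrected = rx * np.exp(-1j * np.pi / 2 * k_hat)             # ambiguity removed
    print("resolved rotation:", k_hat * 90, "degrees")           # -> 270
    ```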

  13. A Cockpit Display Designed to Enable Limited Flight Deck Separation Responsibility

    NASA Technical Reports Server (NTRS)

    Johnson, Walter W.; Battiste, Vernol; Bochow, Sheila Holland

    2003-01-01

    Cockpit displays need to be substantially improved to serve the goals of situational awareness, conflict detection, and path replanning in Free Flight. This paper describes the design of such an advanced cockpit display, along with an initial simulation-based usability evaluation. Flight crews were particularly enthusiastic about color coding for relative altitude, dynamically pulsing predictors, and the use of 3-D flight plans for alerting and situational awareness.

  14. The 2008 U.S. Geological Survey national seismic hazard models and maps for the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Boyd, Oliver S.; Luco, Nicolas; Wheeler, Russell L.; Rukstales, Kenneth S.; Haller, Kathleen M.

    2012-01-01

    In this paper, we describe the scientific basis for the source and ground-motion models applied in the 2008 National Seismic Hazard Maps, the development of new products that are used for building design and risk analyses, relationships between the hazard maps and design maps used in building codes, and potential future improvements to the hazard maps.

  15. A CFD analysis of blade row interactions within a high-speed axial compressor

    NASA Astrophysics Data System (ADS)

    Richman, Michael Scott

    Aircraft engine design presents many technical and financial hurdles. In an effort to streamline the design process, save money, and improve reliability and performance, many manufacturers are relying on computational fluid dynamic simulations. An overarching goal of the design process for military aircraft engines is to reduce size and weight while maintaining (or improving) reliability. Designers often turn to the compression system to accomplish this goal. As pressure ratios increase and the number of compression stages decreases, many problems arise; for example, stability and high cycle fatigue (HCF) become significant as individual stage loading is increased. CFD simulations have recently been employed to assist in the understanding of these aeroelastic problems. For accurate multistage blade row HCF prediction, it is imperative that advanced three-dimensional blade row unsteady aerodynamic interaction codes be validated with appropriate benchmark data. This research addresses this required validation process for TURBO, an advanced three-dimensional multi-blade row turbomachinery CFD code. The solution/prediction accuracy is characterized, identifying the key flow field parameters driving the inlet guide vane (IGV) and stator response to the rotor-generated forcing functions. The result is a quantified evaluation of the ability of TURBO to predict not only the fundamental flow field characteristics but also the three-dimensional blade loading.

  16. Apply network coding for H.264/SVC multicasting

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Kuo, C.-C. Jay

    2008-08-01

    In a packet erasure network environment, video streaming benefits from error control in two ways to achieve graceful degradation. The first approach is application-level (or link-level) forward error correction (FEC) to provide erasure protection. The second error control approach is error concealment at the decoder end to compensate for lost packets. A large amount of research work has been done in the above two areas. More recently, network coding (NC) techniques have been proposed for efficient data multicast over networks. It was shown in our previous work that multicast video streaming benefits from NC through its throughput improvement. In this work, an algebraic model is given to analyze the performance. By exploiting the linear combination of video packets at nodes in a network and the SVC video format, the system achieves path diversity automatically and enables efficient video delivery to heterogeneous receivers over packet erasure channels. The application of network coding can protect video packets against the erasure network environment. However, the rank deficiency problem of random linear network coding makes error concealment inefficient. It is shown by computer simulation that the proposed NC video multicast scheme enables heterogeneous receivers to receive video according to their capacity constraints, but special design is needed to improve video transmission performance when applying network coding.
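
    The rank deficiency problem mentioned above can be demonstrated directly: a receiver that has collected exactly k random linear combinations often cannot decode because the coefficient matrix is singular. The sketch below estimates that probability over GF(2) by Monte Carlo (illustrative only; practical RLNC works over a larger field such as GF(256)).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def gf2_rank(m: np.ndarray) -> int:
        """Rank over GF(2) by Gaussian elimination (XOR row reduction)."""
        m = m.copy()
        rows, cols = m.shape
        rank = 0
        for col in range(cols):
            piv = next((r for r in range(rank, rows) if m[r, col]), None)
            if piv is None:
                continue
            m[[rank, piv]] = m[[piv, rank]]      # move pivot row into place
            for r in range(rows):
                if r != rank and m[r, col]:
                    m[r] ^= m[rank]              # eliminate the column
            rank += 1
            if rank == rows:
                break
        return rank

    # A receiver needs k linearly independent combinations to decode k
    # source packets; random GF(2) coefficient vectors often fall short.
    k, trials = 4, 20000
    fails = sum(gf2_rank(rng.integers(0, 2, size=(k, k))) < k for _ in range(trials))
    print(f"P(not decodable after exactly k packets) ~ {fails / trials:.3f}")  # ~0.69
    # Over GF(256) the same probability drops to roughly 0.004.
    ```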

  17. 34 CFR 601.21 - Code of conduct.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... a brochure, a workshop, or training. (B) Food, refreshments, training, or informational material furnished to an agent as an integral part of a training session that is designed to improve the service of a... such training contributes to the professional development of the agent. (C) Favorable terms, conditions...

  18. 34 CFR 601.21 - Code of conduct.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... a brochure, a workshop, or training. (B) Food, refreshments, training, or informational material furnished to an agent as an integral part of a training session that is designed to improve the service of a... such training contributes to the professional development of the agent. (C) Favorable terms, conditions...

  19. Code qualification of structural materials for AFCI advanced recycling reactors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Natesan, K.; Li, M.; Majumdar, S.

    2012-05-31

    This report summarizes the further findings from the assessments of current status and future needs in code qualification and licensing of reference structural materials and new advanced alloys for advanced recycling reactors (ARRs) in support of the Advanced Fuel Cycle Initiative (AFCI). The work is a combined effort between Argonne National Laboratory (ANL) and Oak Ridge National Laboratory (ORNL), with ANL as the technical lead, as part of the Advanced Structural Materials Program for the AFCI Reactor Campaign. The report is the second deliverable in FY08 (M505011401) under the work package 'Advanced Materials Code Qualification'. The overall objective of the Advanced Materials Code Qualification project is to evaluate key requirements for the ASME Code qualification and the Nuclear Regulatory Commission (NRC) approval of structural materials in support of the design and licensing of the ARR. Advanced materials are a critical element in the development of sodium reactor technologies. Enhanced materials performance not only improves safety margins and provides design flexibility, but is also essential for the economics of future advanced sodium reactors. Code qualification and licensing of advanced materials are prominent needs for developing and implementing advanced sodium reactor technologies. Nuclear structural component design in the U.S. must comply with the ASME Boiler and Pressure Vessel Code Section III (Rules for Construction of Nuclear Facility Components), and the NRC grants the operational license. As the ARR will operate at higher temperatures than the current light water reactors (LWRs), the design of elevated-temperature components must comply with ASME Subsection NH (Class 1 Components in Elevated Temperature Service). However, the NRC has not approved the use of Subsection NH for reactor components, and this puts additional burdens on materials qualification of the ARR. In the past licensing reviews for the Clinch River Breeder Reactor Project (CRBRP) and the Power Reactor Innovative Small Module (PRISM), the NRC/Advisory Committee on Reactor Safeguards (ACRS) raised numerous safety-related issues regarding elevated-temperature structural integrity criteria. Most of these issues remain unresolved today. These critical licensing reviews provide a basis for the evaluation of underlying technical issues for future advanced sodium-cooled reactors. Major materials performance issues and high temperature design methodology issues pertinent to the ARR are addressed in the report. The report is organized as follows: the ARR reference design concepts proposed by Argonne National Laboratory and four industrial consortia are reviewed first, followed by a summary of the major code qualification and licensing issues for the ARR structural materials. The available database is presented for the ASME Code-qualified structural alloys (e.g., 304 and 316 stainless steels, 2.25Cr-1Mo, and mod. 9Cr-1Mo), including physical properties, tensile properties, impact properties and fracture toughness, creep, fatigue, creep-fatigue interaction, microstructural stability during long-term thermal aging, material degradation in sodium environments, and effects of neutron irradiation for both base metals and weld metals. An assessment of modified versions of Type 316 SS, i.e., Type 316LN and its Japanese version, 316FR, was conducted to provide a perspective for codification of 316LN or 316FR in Subsection NH. The current status and data availability of four new advanced alloys, i.e., NF616, NF616+TMT, NF709, and HT-UPS, are also addressed to identify the R&D needs for their code qualification for ARR applications. For both conventional and new alloys, issues related to high temperature design methodology are described to address the needs for improvements for the ARR design and licensing. Assessments have shown that there are significant data gaps for the full qualification and licensing of the ARR structural materials. Development and evaluation of structural materials require a variety of experimental facilities that have been seriously degraded in the past. The availability of and additional needs for the key experimental facilities are summarized at the end of the report. Detailed information is given in each chapter.

  20. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  1. CFD propels NASP propulsion progress

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.; Dwoyer, Douglas L.; Green, Michael J.

    1990-01-01

    The most complex aerothermodynamics encountered in the National Aerospace Plane (NASP) propulsion system are associated with the fuel-mixing and combustion-reaction flows of its combustor section; adequate CFD tools must be developed to model shock-wave systems, turbulent hydrogen/air mixing, flow separation, and combustion. Improvements to existing CFD codes have involved extension from two dimensions to three, as well as the addition of finite-rate hydrogen-air chemistry. A novel CFD code for the treatment of reacting flows throughout the NASP, designated GASP, uses the most advanced upwind-differencing technology.

  2. CFD propels NASP propulsion progress

    NASA Astrophysics Data System (ADS)

    Povinelli, Louis A.; Dwoyer, Douglas L.; Green, Michael J.

    1990-07-01

    The most complex aerothermodynamics encountered in the National Aerospace Plane (NASP) propulsion system are associated with the fuel-mixing and combustion-reaction flows of its combustor section; adequate CFD tools must be developed to model shock-wave systems, turbulent hydrogen/air mixing, flow separation, and combustion. Improvements to existing CFD codes have involved extension from two dimensions to three, as well as the addition of finite-rate hydrogen-air chemistry. A novel CFD code for the treatment of reacting flows throughout the NASP, designated GASP, uses the most advanced upwind-differencing technology.

  3. Ultrasonic Array for Obstacle Detection Based on CDMA with Kasami Codes

    PubMed Central

    Diego, Cristina; Hernández, Álvaro; Jiménez, Ana; Álvarez, Fernando J.; Sanz, Rebeca; Aparicio, Joaquín

    2011-01-01

    This paper presents the design of an ultrasonic array for obstacle detection based on Phased Array (PA) techniques, which steer the acoustic beam through the environment electronically rather than by mechanical means. The transmission of every element in the array has been encoded according to Code Division Multiple Access (CDMA), which allows multiple beams to be transmitted simultaneously. Together, these features enable a parallel scanning system which not only improves the image rate but also achieves longer inspection distances in comparison with conventional PA techniques. PMID:22247675
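
    As an illustration of why Kasami codes suit simultaneous transmission, the sketch below builds a length-63 small-set Kasami family from an m-sequence and separates two overlapping encoded echoes by matched correlation; the delays, amplitudes, and noise-free channel are invented for the example.

    ```python
    import numpy as np

    def m_sequence(taps, n):
        """Fibonacci LFSR bit stream of period 2**n - 1; the taps below give
        the recurrence b[t] = b[t-1] XOR b[t-6], which is primitive for n = 6."""
        state = [1] * n
        out = []
        for _ in range(2**n - 1):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[-t]
            state = [fb] + state[:-1]
        return np.array(out)

    # Small-set Kasami family of length 63: XOR an m-sequence with cyclic
    # shifts of its decimation by 2**(n//2) + 1 = 9.
    n = 6
    u = m_sequence([6, 1], n)
    v = u[(9 * np.arange(63)) % 63]               # period-7 decimated sequence
    kasami = [u] + [u ^ np.roll(v, j) for j in range(7)]
    chips = [1.0 - 2.0 * k for k in kasami]       # 0/1 bits -> +1/-1 chips

    # Two elements fire simultaneously with codes 0 and 3 at different
    # delays; matched correlation separates the overlapping echoes.
    echo = np.roll(chips[0], 11) + 0.8 * np.roll(chips[3], 30)
    for code in (0, 3):
        corr = [np.dot(echo, np.roll(chips[code], d)) for d in range(63)]
        print(f"code {code}: peak at delay {int(np.argmax(corr))}")  # 11 and 30
    ```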

  4. Improved Therapeutic Regimens for Treatment of Post-Traumatic Ocular Infections

    DTIC Science & Technology

    2011-05-01

    cystoid macular oedema in uveitis. Clin. Exp. Ophthalmol. 29, 2–6 (2001). Campochiaro PA, Lim JI. Aminoglycoside toxicity in the treatment of... Improved Therapeutic Regimens for Treatment of... injury and adequate treatment. This proposal was designed to analyze the effectiveness of antibiotics, anti-inflammatory drugs, and non-conventional

  5. Competitiveness Improvement Project Informational Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinclair, Karin C; Preus, Robert W; Dana, Scott

    This presentation was given at the Competitiveness Improvement Project (CIP) Informational Workshop on December 6, 2017. Topics covered during the workshop include an overview of the CIP, past projects, scoring criteria, technical support opportunities, certification body requirements, standards applicable to distributed wind generators, information on the National Electric Code, certification testing requirements, test site requirements, National Environmental Policy Act, design review, levelized cost of energy, procurement/contracting, project management/deliverables, and outreach materials.

  6. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to advancements in neuroimaging, brain-machine interfaces (BMI), and the design of training devices for rehabilitation purposes. In this review, we summarize the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also review the achievements in electrocorticography (ECoG) coding with both animal models and human subjects for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation of the progress of rehabilitation programs. Future rehabilitation is expected to be more home-based, automatic, and self-administered by patients. Further investigations and breakthroughs are mainly needed to improve computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, to validate the effectiveness of BMI-guided rehabilitation programs, and to simplify system operation in training devices. PMID:25258708

  7. A Computer Code for Gas Turbine Engine Weight And Disk Life Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Ghosn, Louis J.; Halliwell, Ian; Wickenheiser, Tim (Technical Monitor)

    2002-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. In this paper, the major enhancements to NASA's engine-weight estimate computer code (WATE) are described. These enhancements include the incorporation of improved weight-calculation routines for the compressor and turbine disks using the finite-difference technique. Furthermore, the stress distribution for various disk geometries was also incorporated, for a life-prediction module to calculate disk life. A material database, consisting of the material data of most of the commonly-used aerospace materials, has also been incorporated into WATE. Collectively, these enhancements provide a more realistic and systematic way to calculate the engine weight. They also provide additional insight into the design trade-off between engine life and engine weight. To demonstrate the new capabilities, the enhanced WATE code is used to perform an engine weight/life trade-off assessment on a production aircraft engine.

  8. Understanding Engineers' Responsibilities: A Prerequisite to Designing Engineering Education : Commentary on "Educating Engineers for the Public Good Through International Internships: Evidence from a Case Study at Universitat Politècnica de València".

    PubMed

    Murphy, Colleen; Gardoni, Paolo

    2017-07-18

    The development of the curriculum for engineering education (course requirements as well as extra-curricular activities like study abroad and internships) should be based on a comprehensive understanding of engineers' responsibilities. The responsibilities that are constitutive of being an engineer include striving to fulfill the standards of excellence set by technical codes; to improve the idealized models that engineers use to predict, for example, the behavior of alternative designs; and to achieve the internal goods such as safety and sustainability as they are reflected in the design codes. Globalization has implications for these responsibilities and, in turn, for engineering education, by, for example, modifying the collection of possible solutions recognized for existing problems. In addition, international internships can play an important role in fostering the requisite moral imagination of engineering students.

  9. Improving data sharing in research with context-free encoded missing data.

    PubMed

    Hoevenaar-Blom, Marieke P; Guillemont, Juliette; Ngandu, Tiia; Beishuizen, Cathrien R L; Coley, Nicola; Moll van Charante, Eric P; Andrieu, Sandrine; Kivipelto, Miia; Soininen, Hilkka; Brayne, Carol; Meiller, Yannick; Richard, Edo

    2017-01-01

    Lack of attention to missing data in research may result in biased results, loss of power, and reduced generalizability. Registering reasons for missing values at the time of data collection, or, in the case of sharing existing data, before making data available to other teams, can save time and effort, improve scientific value, and help to prevent erroneous assumptions and biased results. To ensure that encoding of missing data is sufficient to understand the reason why data are missing, it should ideally be context-free. Therefore, 11 context-free codes for missing data were carefully designed based on three completed randomized controlled clinical trials and tested in a new randomized controlled clinical trial by an international team consisting of clinical researchers and epidemiologists with extensive experience in designing and conducting trials, together with an Information System expert. These codes can be divided into missing due to participant and/or participation characteristics (n = 6), missing by design (n = 4), and missing due to a procedural error (n = 1). Broad implementation of context-free missing data encoding may enhance the possibilities of data sharing and pooling, thus allowing more powerful analyses using existing data.
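
    The abstract does not list the 11 codes themselves, so the enumeration below is a hypothetical illustration of what a context-free coding grouped into the three stated categories (six participant-related, four by design, one procedural) might look like.

    ```python
    from enum import Enum

    class MissingCode(Enum):
        """Hypothetical context-free missing-data codes, grouped into the
        three categories stated in the abstract (the paper defines its own
        11 codes; these names are illustrative only)."""
        # Missing due to participant and/or participation characteristics (6)
        REFUSED = "P1"
        PHYSICALLY_UNABLE = "P2"
        COGNITIVELY_UNABLE = "P3"
        NOT_REACHED = "P4"
        DROPPED_OUT = "P5"
        DECEASED = "P6"
        # Missing by design (4)
        NOT_IN_PROTOCOL_ARM = "D1"
        SKIPPED_BY_BRANCHING = "D2"
        VISIT_NOT_SCHEDULED = "D3"
        MEASURE_ADDED_LATER = "D4"
        # Procedural error (1)
        PROCEDURAL_ERROR = "E1"

    # A shared record can then carry the reason, not just an empty cell:
    record = {"sbp_mmHg": None, "sbp_missing": MissingCode.PROCEDURAL_ERROR.value}
    print(record)
    ```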

  10. Implicit time-integration method for simultaneous solution of a coupled non-linear system

    NASA Astrophysics Data System (ADS)

    Watson, Justin Kyle

    Historically, large physical problems have been divided into smaller problems based on the physics involved. This is no different in reactor safety analysis. The problem of analyzing a nuclear reactor for design basis accidents is performed by a handful of computer codes, each solving a portion of the problem. The reactor thermal hydraulic response to an event is determined using a system code like the TRAC RELAP Advanced Computational Engine (TRACE). The core power response to the same accident scenario is determined using a core physics code like the Purdue Advanced Core Simulator (PARCS). Containment response to the reactor depressurization in a Loss Of Coolant Accident (LOCA) type event is calculated by a separate code. Sub-channel analysis is performed with yet another computer code. This is just a sample of the computer codes used to solve the overall problems of nuclear reactor design basis accidents. Traditionally, each of these codes operates independently from the others, using only the global results from one calculation as boundary conditions to another. Industry's drive to uprate power for reactors has motivated analysts to move from a conservative approach to design basis accidents towards a best estimate method. To achieve a best estimate calculation, efforts have been aimed at coupling the individual physics models to improve the accuracy of the analysis and reduce margins. The current coupling techniques are sequential in nature. During a calculation time-step, data are passed between the two codes. The individual codes solve their portion of the calculation and converge to a solution before the calculation is allowed to proceed to the next time-step. This thesis presents a fully implicit method of simultaneously solving the neutron balance equations, heat conduction equations, and the constitutive fluid dynamics equations. It discusses the problems involved in coupling different physics phenomena within multi-physics codes and presents a solution to these problems. The thesis also outlines the basic concepts behind the nodal balance equations, heat transfer equations, and the thermal hydraulic equations, which are coupled to form a fully implicit nonlinear system of equations. The coupling of separate physics models to solve a larger problem and improve the accuracy and efficiency of a calculation is not a new idea; however, implementing them in an implicit manner and solving the system simultaneously is. The application to reactor safety codes is also new and has not been done with thermal hydraulics and neutronics codes on realistic applications in the past. The coupling technique described in this thesis is applicable to other similar coupled thermal hydraulic and core physics reactor safety codes. This technique is demonstrated using coupled input decks to show that the system is solved correctly and is then verified using two derivative test problems based on international benchmark problems: the OECD/NRC Three Mile Island (TMI) Main Steam Line Break (MSLB) problem (representative of pressurized water reactor analysis) and the OECD/NRC Peach Bottom (PB) Turbine Trip (TT) benchmark (representative of boiling water reactor analysis).
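
    The sketch below shows the thesis's central idea, a single Newton iteration on the combined residual of all coupled equations, using a two-equation toy stand-in (a hypothetical power-temperature feedback model, not the TRACE/PARCS equation set): both backward-Euler residuals are assembled into one vector and solved simultaneously.

    ```python
    import numpy as np

    # Toy coupled "multi-physics" system (hypothetical stand-in for the
    # neutronics + heat conduction + fluid equations in the thesis):
    #   dP/dt = a * (1 - b*T) * P      (power with temperature feedback)
    #   dT/dt = c * P - d * T          (lumped heat balance)
    a, b, c, d, dt = 1.0, 0.002, 0.05, 0.1, 0.01

    def residual(u_new, u_old):
        """Backward-Euler residual of BOTH equations assembled together, so
        Newton converges on the fully coupled solution instead of iterating
        between separate codes."""
        P, T = u_new
        P0, T0 = u_old
        return np.array([P - P0 - dt * a * (1.0 - b * T) * P,
                         T - T0 - dt * (c * P - d * T)])

    def implicit_step(u_old, tol=1e-12, eps=1e-7):
        u = u_old.copy()
        for _ in range(20):                    # Newton iteration
            r = residual(u, u_old)
            if np.linalg.norm(r) < tol:
                break
            J = np.empty((2, 2))               # finite-difference Jacobian
            for j in range(2):
                du = np.zeros(2)
                du[j] = eps
                J[:, j] = (residual(u + du, u_old) - r) / eps
            u = u - np.linalg.solve(J, r)
        return u

    u = np.array([100.0, 300.0])               # initial power and temperature
    for _ in range(5):
        u = implicit_step(u)
    print("P, T after 5 fully implicit steps:", u.round(3))
    ```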

  11. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.

  12. Energy spectrum of 208Pb(n,x) reactions

    NASA Astrophysics Data System (ADS)

    Tel, E.; Kavun, Y.; Özdoǧan, H.; Kaplan, A.

    2018-02-01

    Fission and fusion reactor technologies have been investigated worldwide since the 1950s. In reactor technology, fission and fusion reaction investigations play an important role in improving new-generation technologies. In particular, neutron reaction studies have an important place in the development of nuclear materials, so neutron effects on materials should be studied both theoretically and experimentally to improve reactor design. For this reason, nuclear reaction codes are very useful tools when experimental data are unavailable. For such circumstances, scientists have created many nuclear reaction codes, such as ALICE/ASH, CEM95, PCROSS, TALYS, GEANT, and FLUKA. In this study we used the ALICE/ASH, PCROSS, and CEM95 codes to calculate the energy spectrum of outgoing particles from Pb bombarded by neutrons. While the Weisskopf-Ewing model has been used for the equilibrium process in the calculations, the full exciton, hybrid, and geometry-dependent hybrid nuclear reaction models have been used for the pre-equilibrium process. The calculated results are discussed and compared with the experimental data taken from EXFOR.
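
    For orientation, the equilibrium (Weisskopf-Ewing-type) emission spectrum has the characteristic shape N(E) proportional to E sigma_inv(E) exp(-E/T). The sketch below evaluates that shape with a constant inverse cross section and an assumed nuclear temperature, which is a strong simplification of what ALICE/ASH, PCROSS, and CEM95 actually compute from level densities and nuclear models.

    ```python
    import numpy as np

    # Weisskopf-Ewing-type evaporation spectrum for emitted neutrons:
    # N(E) ~ E * sigma_inv(E) * exp(-E / T), with sigma_inv taken constant
    # here and T an assumed nuclear temperature (illustrative values only).
    T = 1.2                                    # nuclear temperature, MeV
    E = np.linspace(0.01, 15.0, 500)           # outgoing neutron energy, MeV
    spectrum = E * np.exp(-E / T)
    spectrum /= np.sum(spectrum) * (E[1] - E[0])    # normalize to unit area

    mean_E = np.sum(E * spectrum) * (E[1] - E[0])
    print(f"mean emission energy ~ {mean_E:.2f} MeV")   # ~2T for this shape
    # Pre-equilibrium models (exciton, hybrid, GDH) add the high-energy
    # tail that this equilibrium term alone cannot reproduce.
    ```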

  13. Low thrust chemical rocket technology

    NASA Technical Reports Server (NTRS)

    Schneider, Steven J.

    1992-01-01

    An on-going technology program to improve the performance of low thrust chemical rockets for spacecraft on-board propulsion applications is reviewed. Improved performance and lifetime is sought by the development of new predictive tools to understand the combustion and flow physics, introduction of high temperature materials and improved component designs to optimize performance, and use of higher performance propellants. Improved predictive technology is sought through the comparison of both local and global predictions with experimental data. Predictions are based on both the RPLUS Navier-Stokes code with finite rate kinetics and the JANNAF methodology. Data were obtained with laser-based diagnostics along with global performance measurements. Results indicate that the modeling of the injector and the combustion process needs improvement in these codes, and that flow visualization with a technique such as two-dimensional laser-induced fluorescence (LIF) would aid in resolving issues of flow symmetry and shear layer combustion processes. High temperature material fabrication processes are under development, and small rockets are being designed, fabricated, and tested using these new materials. Rhenium coated with iridium for oxidation protection was produced by the chemical vapor deposition (CVD) process and enabled an 800 K increase in rocket operating temperature. Performance gains with this material in rockets using Earth storable propellants (nitrogen tetroxide and monomethylhydrazine or hydrazine) were obtained through component redesign to eliminate fuel film cooling and its associated combustion inefficiency while managing head end thermal soakback. Material interdiffusion and oxidation characteristics indicated that the requisite lifetimes of tens of hours were available for thruster applications. Rockets were designed, fabricated, and tested with thrusts of 22, 62, 440, and 550 N. Performance improvements of 10 to 20 seconds specific impulse were demonstrated. Higher performance propellants were evaluated: space storable propellants, including liquid oxygen (LOX) as the oxidizer with nitrogen hydrides or hydrocarbons as fuels. Specifically, a LOX/hydrazine engine was designed, fabricated, and shown to have 95 percent of theoretical c-star, which translates into a projected vacuum specific impulse of 345 seconds at an area ratio of 204:1. Further performance improvement can be obtained by the use of LOX/hydrogen propellants, especially for manned spacecraft applications, and specific designs must be developed and advanced through flight qualification.

  14. Rolling-refresher simulation improves performance and retention of paediatric intensive care unit nurse code cart management.

    PubMed

    Singleton, Marcy N; Allen, Kimberly F; Li, Zhongze; McNerney, Kevin; Naber, Urs H; Braga, Matthew S

    2018-04-01

    Paediatric Intensive Care Unit Nurses (PICU RNs) manage the code cart during paediatric emergencies at the Children's Hospital at Dartmouth-Hitchcock. These are low-frequency, high-stakes events. An uncontrolled intervention study with 6-month follow-up was conducted. A collaboration of physician and nursing experts developed a rolling-refresher training programme consisting of five simulated scenarios, including 22 code cart skills, to establish nursing code cart competency. The cohort of PICU RNs underwent a competency assessment in training 1. To achieve competence, the participating RN received immediate feedback and instruction and repeated each task until mastery during training 1. The competencies were repeated 6 months later, designated training 2. Thirty-two RNs participated in training 1. Sixteen RNs (50%) completed the second training. Our rolling-refresher training programme resulted in a 43% reduction in the odds of first-attempt failures between training 1 and training 2 (p=0.01). Multivariate linear regression evaluating the difference in first-attempt failure between training 1 and training 2 revealed that the following covariates were not significantly associated with this improvement: interval Paediatric Advanced Life Support training, interval use of the code cart or defibrillator (either real or simulated), and time between training sessions. Univariate analysis between the two trainings revealed a statistically significant reduction in first-attempt failures for preparing an epinephrine infusion (72% vs 41%, p=0.04) and providing bag-mask ventilation (28% vs 0%, p=0.02). Our rolling-refresher training programme demonstrated significant improvement in performance for low-frequency, high-risk skills required to manage a paediatric code cart, with retention after initial training.

  15. Influence of Shock Wave on the Flutter Behavior of Fan Blades Investigated

    NASA Technical Reports Server (NTRS)

    Srivastava, Rakesh; Bakhle, Milind A.; Stefko, George L.

    2003-01-01

    Modern fan designs have blades with forward sweep; a lean, thin cross section; and a wide chord to improve performance and reduce noise. These geometric features coupled with the presence of a shock wave can lead to flutter instability. Flutter is a self-excited dynamic instability arising because of fluid-structure interaction, which causes the energy from the surrounding fluid to be extracted by the vibrating structure. An in-flight occurrence of flutter could be catastrophic and is a significant design issue for rotor blades in gas turbines. Understanding the flutter behavior and the influence of flow features on flutter will lead to a better and safer design. An aeroelastic analysis code, TURBO, has been developed and validated for flutter calculations at the NASA Glenn Research Center. The code has been used to understand the occurrence of flutter in a forward-swept fan design. The forward-swept fan, which consists of 22 inserted blades, encountered flutter during wind tunnel tests at part speed conditions.

  16. Guidance, navigation, and control subsystem equipment selection algorithm using expert system methods

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1991-01-01

    Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable through the use of a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.

  17. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and models for the physical phenomena. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.

  18. Aeroelastic Tailoring Study of N+2 Low Boom Supersonic Commerical Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi

    2015-01-01

    The Lockheed Martin N+2 Low-Boom Supersonic Commercial Transport (LSCT) aircraft was optimized in this study through the use of a multidisciplinary design optimization tool developed at the National Aeronautics and Space Administration Armstrong Flight Research Center. A total of 111 design variables were used in the first optimization run. Total structural weight was the objective function in this optimization run. Design requirements for strength, buckling, and flutter were selected as constraint functions during the first optimization run. The MSC Nastran code was used to obtain the modal, strength, and buckling characteristics. Flutter and trim analyses were based on the ZAERO code, and landing and ground control loads were computed using an in-house code. The weight penalty to satisfy all the design requirements during the first optimization run was 31,367 lb, a 9.4% increase from the baseline configuration. The second optimization run was based on the big-bang big-crunch algorithm. Six composite ply angles for the second and fourth composite layers were selected as discrete design variables for the second optimization run. Composite ply angle changes could not improve the weight configuration of the N+2 LSCT aircraft; however, this second optimization run creates more tolerance for the active and near-active strength constraint values for future weight optimization runs.
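
    For illustration of the second run's optimizer: below is a minimal Python sketch of the big-bang big-crunch idea, scattering candidate designs around a fitness-weighted centre of mass with a shrinking radius and snapping each variable to a discrete ply angle. The angle set, population sizes, and placeholder objective are hypothetical stand-ins, not the study's actual FE-based weight function.

        import numpy as np

        # Hypothetical discrete ply-angle choices (degrees); the study used six
        # ply angles of two composite layers as its discrete design variables.
        ANGLES = np.array([-45.0, 0.0, 45.0, 90.0])

        def weight_penalty(design):
            # Placeholder objective standing in for the FE-based structural weight.
            return np.sum((design - 45.0) ** 2)

        def big_bang_big_crunch(n_vars=6, pop=40, iters=50, seed=0):
            rng = np.random.default_rng(seed)
            center = rng.choice(ANGLES, size=n_vars)
            for k in range(1, iters + 1):
                # Big bang: scatter candidates around the center, radius ~ 1/k.
                cands = center + rng.normal(0.0, 45.0 / k, size=(pop, n_vars))
                # Snap each variable to the nearest admissible discrete angle.
                cands = ANGLES[np.abs(cands[..., None] - ANGLES).argmin(axis=-1)]
                f = np.array([weight_penalty(c) for c in cands])
                # Big crunch: fitness-weighted center of mass (lower f = heavier).
                w = 1.0 / (f + 1e-9)
                center = (w[:, None] * cands).sum(axis=0) / w.sum()
            return cands[f.argmin()], f.min()   # best of the final population

        print(big_bang_big_crunch())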

  19. Special issue on network coding

    NASA Astrophysics Data System (ADS)

    Monteiro, Francisco A.; Burr, Alister; Chatzigeorgiou, Ioannis; Hollanti, Camilla; Krikidis, Ioannis; Seferoglu, Hulya; Skachek, Vitaly

    2017-12-01

    Future networks are expected to depart from traditional routing schemes in order to embrace network coding (NC)-based schemes. These have created a lot of interest both in academia and industry in recent years. Under the NC paradigm, symbols are transported through the network by combining several information streams originating from the same or different sources. This special issue contains thirteen papers, some dealing with design aspects of NC and related concepts (e.g., fountain codes) and some showcasing the application of NC to new services and technologies, such as multi-view video streaming or underwater sensor networks. One can find papers that show how NC makes data transmission more robust to packet losses, faster to decode, and more resilient to network changes, such as dynamic topologies and different user options, and how NC can improve the overall throughput. This issue also includes papers showing that NC principles can be used at different layers of the networks (including the physical layer) and how the same fundamental principles can lead to new distributed storage systems. Some of the papers in this issue have a theoretical nature, including code design, while others describe hardware testbeds and prototypes.

  20. Quality Indicators for California Community College Job Placement Programs.

    ERIC Educational Resources Information Center

    Mount San Antonio Community Coll. District, Walnut, CA.

    Designed to help California community colleges in assessing their job placement services, identifying strengths and needs for improvement, and establishing priorities for the future, this color-coded guide lists specific tasks and responsibilities within the four essential functional areas of job placement programs and includes quality indicators…

  1. Improving Decisions with Data

    ERIC Educational Resources Information Center

    Johnson, Doug

    2004-01-01

    Schools gather, store and use an increasingly large amount of data. Keeping track of everything from bus routes to building access codes to test scores to sports equipment is done with the help of electronic database programs. Large databases designed for budgeting and student record keeping have long been an integral part of the educational…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atamturktur, Sez; Unal, Cetin; Hemez, Francois

    The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide decision makers in the allocation of Nuclear Energy’s resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework developed takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model. Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models. CPI is implemented to effectively improve the predictive capability of the coupled model by increasing the sophistication of constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of ‘information gain’ through the design domain has been investigated in order to identify the experiment settings where maximum information gain occurs and thus guide experimenters in the selection of experiment settings. This idea was extended to show how the information gain from each experiment can be improved by intelligently selecting the experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models. This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team’s work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters. This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field and the other is a post-doctoral fellow at the Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.

  3. User Interface Design in Medical Distributed Web Applications.

    PubMed

    Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara

    2016-01-01

    User interfaces are important to facilitate easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and tailored to the user's needs and mode of operation. The technology in the background is an important tool to accomplish this. The present work aims to create a web interface using specific technology (HTML table design combined with CSS3) to provide an optimized responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed, and its structure is designed using specific tools, on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed (using the HTML table design (TD) and CSS3 method) that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at once. The technique has the advantage of a simplified CSS code and better adaptability to different media resolutions compared with the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, using HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.

  4. Corrugated and Composite Nozzle-Inlets for Thrust and Noise Benefits

    NASA Technical Reports Server (NTRS)

    Gilinsky, M.; Blankson, I. M.; Gromov, V. G.; Sakharov, V. I.

    2004-01-01

    The following research results are based on the development of an approach previously proposed and investigated for optimum nozzle design to obtain maximum thrust. The design was denoted a Telescope nozzle. A Telescope nozzle contains one or several internal designs, which are inserted at certain locations into a divergent conical or planar main nozzle near its exit. Such a design provides additional thrust augmentation of over 20% by comparison with the optimum single nozzle of equivalent lateral area. What is more, experimental acoustic tests have discovered an essential noise reduction due to application of Telescope nozzles. In this paper, some additional theoretical results are presented for Telescope nozzles, and a similar approach is applied for aero-performance improvement of a supersonic inlet. Numerical simulations were conducted for supersonic flow into the divergent portion of a 2D or axisymmetric nozzle with several plane or conical designs, as well as into a 2D or axisymmetric supersonic inlet with a forebody. The Kryko-Godunov marching numerical scheme for inviscid supersonic flows was used. Several cases were tested using the NASA CFL3D and IM/MSU Russian codes based on the full Navier-Stokes equations. Numerical simulations were conducted for non-reacting flows (both codes) as well as for real high-temperature gas flows with non-equilibrium chemical reactions (the latter code). In general, these simulations have confirmed essential benefits of Telescope design applications in propulsion systems. Some preliminary numerical simulations of several typical inlet designs were conducted with the goal of inlet design optimization for maneuvering flight conditions.

  5. Simulation Enabled Safeguards Assessment Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future energy supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe virtual engineering applied to facility design, as well as to safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  6. Toward Wireless Health Monitoring via an Analog Signal Compression-Based Biosensing Platform.

    PubMed

    Zhao, Xueyuan; Sadhu, Vidyasagar; Le, Tuan; Pompili, Dario; Javanmard, Mehdi

    2018-06-01

    Wireless all-analog biosensor design for concurrent microfluidic and physiological signal monitoring is presented in this paper. The key component is an all-analog circuit capable of compressing two analog sources into one analog signal by analog joint source-channel coding (AJSCC). Two circuit designs are discussed: the stacked voltage-controlled voltage source (VCVS) design with a fixed number of levels, and an improved design that supports a flexible number of AJSCC levels. Experimental results are presented on the wireless biosensor prototype, composed of printed-circuit-board realizations of the stacked-VCVS design. Furthermore, circuit simulation and wireless link simulation results are presented for the improved design. Results indicate that the proposed wireless biosensor is well suited to sensing two biological signals simultaneously with high accuracy, and can be applied to a wide variety of low-power, low-cost wireless continuous health monitoring applications.
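
    To make the compression idea concrete: the sketch below is a minimal Python illustration of a rectangular-type AJSCC mapping, in which one source picks one of L discrete levels and the other is carried continuously within the level, with the scan direction alternating so the mapping stays continuous. It is a toy numerical model under those assumptions, not the paper's VCVS circuit.

        L = 8  # number of AJSCC levels (the improved design supports a flexible L)

        def ajscc_encode(x, y):
            # y (slow source) picks one of L discrete levels; x (fast source) is
            # carried continuously within the level, direction alternating so the
            # two-source-to-one-signal mapping stays continuous.
            i = round(y * (L - 1))
            frac = x if i % 2 == 0 else 1.0 - x
            return (i + frac) / L            # the single analog channel value

        def ajscc_decode(s):
            i = min(int(s * L), L - 1)
            frac = s * L - i
            x = frac if i % 2 == 0 else 1.0 - frac
            return x, i / (L - 1)            # x near-exact, y quantized to L levels

        s = ajscc_encode(0.73, 0.42)
        print(s, ajscc_decode(s))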

  7. Coupling procedure for TRANSURANUS and KTF codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, J.; Alglave, S.; Avramova, M.

    2012-07-01

    The nuclear industry aims to ensure safe and economic operation of each single fuel rod introduced in the reactor core. This goal is even more challenging nowadays due to the current strategy of going for higher burn-up (fuel cycles of 18 or 24 months) and longer residence times. In order to achieve that goal, fuel modeling is the key to predicting fuel rod behavior and lifetime under thermal and pressure loads, corrosion, and irradiation. In this context, fuel performance codes, such as TRANSURANUS, are used to improve the fuel rod design. The modeling capabilities of the above-mentioned tools can be significantly improved if they are coupled with a thermal-hydraulic code in order to have a better description of the flow conditions within the rod bundle. For LWR applications, a good representation of the two-phase flow within the fuel assembly is necessary in order to have a best-estimate calculation of the heat transfer inside the bundle. In this paper we present the coupling methodology of TRANSURANUS with KTF (Karlsruhe Two-phase Flow subchannel code) as well as selected results of the coupling proof of principle.

  8. System Design for FEC in Aeronautical Telemetry

    DTIC Science & Technology

    2012-03-12

    rate punctured convolutional codes for soft-decision Viterbi...below follows that given in [8]. The final coding rate of exactly 2/3 is achieved by puncturing the rate-1/2 code as follows. We begin with the buffer c1...concatenated convolutional code (SCCC). The contributions of this paper are on the system-design level. One major contribution is to design a SCCC code

  9. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care.

    PubMed

    Al-Hablani, Bader

    2017-01-01

    The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. The use of SNOMED CT in CDSS can be considered to provide an answer to the problem of medical errors as well as for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSS. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique to search SNOMED CT data and, consequently, to help improve preventive health services.

  10. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care

    PubMed Central

    Al-Hablani, Bader

    2017-01-01

    Objective: The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. Method: PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome Measures: Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. Results: The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. Conclusion: The use of SNOMED CT in CDSS can be considered to provide an answer to the problem of medical errors as well as for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSS. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique to search SNOMED CT data and, consequently, to help improve preventive health services. PMID:28566995

  11. Research on laser marking speed optimization by using genetic algorithm.

    PubMed

    Wang, Dongyun; Yu, Qiwei; Zhang, Yu

    2015-01-01

    Laser Marking Machine is the most common coding equipment on product packaging lines. However, the speed of laser marking has become a bottleneck of production. In order to remove this bottleneck, a new method based on a genetic algorithm is designed. On the basis of this algorithm, a controller was designed and simulations and experiments were performed. The results show that using this algorithm could effectively improve laser marking efficiency by 25%.
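
    As an illustration of the kind of search involved: below is a minimal Python sketch of a genetic algorithm applied to a marking-order problem (minimizing idle travel between marks), with order crossover and swap mutation. The point set, objective, and GA parameters are hypothetical; the paper's exact formulation is not given here.

        import random

        # Hypothetical mark points on the package (x, y); marking-order
        # optimization (minimizing idle galvo travel) is the toy objective.
        random.seed(1)
        points = [(random.random() * 100, random.random() * 100) for _ in range(20)]

        def travel(order):
            return sum(((points[a][0] - points[b][0]) ** 2 +
                        (points[a][1] - points[b][1]) ** 2) ** 0.5
                       for a, b in zip(order, order[1:]))

        def crossover(p1, p2):
            # Order crossover (OX): keep a slice of p1, fill the rest in p2's order.
            a, b = sorted(random.sample(range(len(p1)), 2))
            hole = set(p1[a:b])
            rest = [g for g in p2 if g not in hole]
            return rest[:a] + p1[a:b] + rest[a:]

        def mutate(tour):
            a, b = random.sample(range(len(tour)), 2)
            tour[a], tour[b] = tour[b], tour[a]

        pop = [random.sample(range(len(points)), len(points)) for _ in range(60)]
        for gen in range(200):
            pop.sort(key=travel)
            pop = pop[:20]                      # elitist selection
            while len(pop) < 60:
                child = crossover(*random.sample(pop[:10], 2))
                if random.random() < 0.2:
                    mutate(child)
                pop.append(child)
        print(round(travel(min(pop, key=travel)), 1))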

  12. GAME: GAlaxy Machine learning for Emission lines

    NASA Astrophysics Data System (ADS)

    Ucci, G.; Ferrara, A.; Pallottini, A.; Gallerani, S.

    2018-06-01

    We present an updated, optimized version of GAME (GAlaxy Machine learning for Emission lines), a code designed to infer key interstellar medium physical properties from emission line intensities of ultraviolet/optical/far-infrared galaxy spectra. The improvements concern (a) an enlarged spectral library including Pop III stars, (b) the inclusion of spectral noise in the training procedure, and (c) an accurate evaluation of uncertainties. We extensively validate the optimized code and compare its performance against empirical methods and other available emission line codes (PYQZ and HII-CHI-MISTRY) on a sample of 62 SDSS stacked galaxy spectra and 75 observed HII regions. Very good agreement is found for metallicity. However, ionization parameters derived by GAME tend to be higher. We show that this is due to the use of too-limited libraries in the other codes. The main advantages of GAME are the simultaneous use of all the measured spectral lines and the extremely short computational times. We finally discuss the code's potential and limitations.
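
    To show the flavour of this approach: the Python sketch below trains a random-forest regressor on a synthetic stand-in for a photoionization-model library, injecting noise during training as in improvement (b). The library, target relation, and noise level are placeholders for illustration and bear no relation to GAME's actual models.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Toy stand-in for a photoionization-model library: each row is a set of
        # emission-line intensities, each target a metallicity (arbitrary units).
        n_models, n_lines = 5000, 12
        library = rng.lognormal(0.0, 1.0, size=(n_models, n_lines))
        metallicity = (np.log10(library[:, 0] / library[:, 3])
                       + 0.1 * rng.normal(size=n_models))

        # Improvement (b) in the abstract: inject spectral noise during training
        # so the learned mapping is robust to observational errors.
        noisy_library = library * (1.0 + 0.05 * rng.normal(size=library.shape))

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(noisy_library, metallicity)

        obs = library[:5] * (1.0 + 0.05 * rng.normal(size=(5, n_lines)))
        print(model.predict(obs))  # inferred metallicities for "observed" spectra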

  13. Soft-Decision-Data Reshuffle to Mitigate Pulsed Radio Frequency Interference Impact on Low-Density-Parity-Check Code Performance

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David

    2011-01-01

    This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a Low-Density Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle which might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code in the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in terms of codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision-data reshuffle scheme. This study reveals that an error floor in LDPC decoding performance appears around CWER=1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact. The mechanism causing this error floor remains unknown; further investigation is necessary.
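
    One plausible form such a reshuffle could take (the presentation does not spell out its scheme, so the zeroing strategy below is an assumption): neutralize the soft decisions that coincide with a known RFI pulse so the decoder treats them as erasures rather than confident errors. A minimal Python sketch:

        import numpy as np

        def channel_llrs(y, noise_var):
            # BPSK soft-decision LLRs for an AWGN channel.
            return 2.0 * y / noise_var

        def reshuffle_llrs(llrs, rfi_mask):
            # Symbols flagged by rfi_mask carry misleadingly large (wrong-sign)
            # LLRs; setting them to zero tells the LDPC decoder "no channel
            # information", which it handles far better than confident errors.
            cleaned = llrs.copy()
            cleaned[rfi_mask] = 0.0
            return cleaned

        rng = np.random.default_rng(2)
        bits = rng.integers(0, 2, 2048)
        y = (1 - 2.0 * bits) + 0.5 * rng.normal(size=bits.size)  # BPSK + AWGN
        rfi_mask = np.zeros(bits.size, dtype=bool)
        rfi_mask[400:460] = True                                  # pulsed RFI burst
        y[rfi_mask] += 4.0 * rng.normal(size=rfi_mask.sum())     # strong interference

        llrs = reshuffle_llrs(channel_llrs(y, 0.25), rfi_mask)
        # llrs would now feed the AR4JA (2048, 1024) LDPC decoder's soft input.
        print(np.count_nonzero(llrs == 0.0))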

  14. Dissemination and support of ARGUS for accelerator applications. Technical progress report, April 24, 1991--January 20, 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The ARGUS code is a three-dimensional code system for simulating interactions between charged particles, electric and magnetic fields, and complex structures. It is a system of modules that share common utilities for grid and structure input, data handling, memory management, diagnostics, and other specialized functions. The code includes the fields due to the space charge and current density of the particles to achieve a self-consistent treatment of the particle dynamics. The physics modules in ARGUS include three-dimensional field solvers for electrostatics and electromagnetics, a three-dimensional electromagnetic frequency-domain module, a full particle-in-cell (PIC) simulation module, and a steady-state PIC model. These are described in the Appendix to this report. This project has a primary mission of developing the capabilities of ARGUS in accelerator modeling for release to the accelerator design community. Five major activities are being pursued in parallel during the first year of the project: to improve the code and/or add new modules that provide capabilities needed for accelerator design; to produce a User's Guide that documents the use of the code for all users; to release the code and the User's Guide to accelerator laboratories for their own use, and to obtain feedback from them; to build an interactive user interface for setting up ARGUS calculations; and to explore the use of ARGUS on high-power workstation platforms.

  15. Interactive Exploration for Continuously Expanding Neuron Databases.

    PubMed

    Li, Zhongyu; Metaxas, Dimitris N; Lu, Aidong; Zhang, Shaoting

    2017-02-15

    This paper proposes a novel framework to help biologists explore and analyze neurons based on retrieval of data from neuron morphological databases. In recent years, continuously expanding neuron databases have provided a rich source of information for associating neuronal morphologies with their functional properties. We design a coarse-to-fine framework for efficient and effective data retrieval from large-scale neuron databases. At the coarse level, for efficiency at large scale, we employ a binary coding method to compress morphological features into binary codes of tens of bits. Short binary codes allow for real-time similarity searching in Hamming space. Because the neuron databases are continuously expanding, it is inefficient to re-train the binary coding model from scratch when adding new neurons. To solve this problem, we extend binary coding with online updating schemes, which consider only the newly added neurons and update the model on-the-fly, without accessing the whole neuron database. At the fine-grained level, we introduce domain experts/users into the framework, who can give relevance feedback on the binary-coding-based retrieval results. This interactive strategy can improve retrieval performance by re-ranking the above coarse results, where we design a new similarity measure that takes the feedback into account. Our framework is validated on more than 17,000 neuron cells, showing promising retrieval accuracy and efficiency. Moreover, we demonstrate its use case in assisting biologists to identify and explore unknown neurons. Copyright © 2017 Elsevier Inc. All rights reserved.
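
    To illustrate the coarse retrieval stage: the Python sketch below compresses feature vectors into short binary codes and ranks the database by Hamming distance. Random hyperplane hashing stands in for the paper's learned (and online-updated) coding model; the feature dimensions and code length are arbitrary.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy morphological feature vectors for a growing neuron database.
        features = rng.normal(size=(17000, 64))

        # Stand-in for learned binary coding: random hyperplane hashing.
        hyperplanes = rng.normal(size=(64, 32))          # 32-bit codes

        def encode(x):
            bits = (x @ hyperplanes) > 0.0
            return np.packbits(bits, axis=-1)            # compact uint8 storage

        db_codes = encode(features)

        def hamming_search(query, codes, k=5):
            # XOR then popcount via an 8-bit lookup table: fast search in
            # Hamming space, as the short binary codes are meant to allow.
            table = np.unpackbits(np.arange(256, dtype=np.uint8)[:, None],
                                  axis=1).sum(1)
            dists = table[codes ^ query].sum(axis=1)
            return np.argsort(dists)[:k]

        q = encode(features[42:43])[0]
        print(hamming_search(q, db_codes))  # neuron 42 should rank first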

  16. Grid Generation Techniques Utilizing the Volume Grid Manipulator

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1998-01-01

    This paper presents grid generation techniques available in the Volume Grid Manipulation (VGM) code. The VGM code is designed to manipulate existing line, surface, and volume grids to improve the quality of the data. It embodies an easy-to-read, rich command language that enables such alterations as topology changes, grid adaptation, and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections, which are common curves used in the generation and manipulation of points, lines, surfaces, and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamics simulations. By comparison with previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretching, as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be performed efficiently to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow-field domain will be appended to an existing X33 Venturestar volume grid; negative volumes resulting from grid expansions to enable flow-field capture on a simple geometry will be corrected; and geometrical changes to a vehicle component of the X33 Venturestar will be shown.

  17. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    PubMed

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  18. Integrated Performance of Next Generation High Data Rate Receiver and AR4JA LDPC Codec for Space Communications

    NASA Technical Reports Server (NTRS)

    Cheng, Michael K.; Lyubarev, Mark; Nakashima, Michael A.; Andrews, Kenneth S.; Lee, Dennis

    2008-01-01

    Low-density parity-check (LDPC) codes are the state of the art in forward error correction (FEC) technology, exhibiting capacity-approaching performance. The Jet Propulsion Laboratory (JPL) has designed a family of LDPC codes that are similar in structure and therefore lead to a single decoder implementation. The Accumulate-Repeat-by-4-Jagged-Accumulate (AR4JA) code design offers a family of codes with rates 1/2, 2/3, 4/5 and lengths 1024, 4096, 16384 information bits. Performance is less than one dB from capacity for all combinations. Integrating a stand-alone LDPC decoder with a commercial-off-the-shelf (COTS) receiver faces additional challenges compared with building a single receiver-decoder unit from scratch. In this work, we outline the issues and show that these additional challenges can be overcome by simple solutions. To demonstrate that an LDPC decoder can be made to work seamlessly with a COTS receiver, we interface an AR4JA LDPC decoder developed on a field-programmable gate array (FPGA) with a modern high data rate receiver and measure the combined receiver-decoder performance. Through optimizations that include an improved frame synchronizer and different soft-symbol scaling algorithms, we show that a combined implementation loss of less than one dB is possible and, therefore, most of the coding gain evident in theory can also be obtained in practice. Our techniques can benefit any modem that utilizes an advanced FEC code.
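
    On the soft-symbol scaling point: the decoder expects true log-likelihood ratios, so the receiver's soft outputs must be scaled by the estimated channel reliability and quantized. The Python sketch below shows one standard BPSK/AWGN version of this step; the SNR-to-variance mapping and 6-bit fixed-point format are illustrative assumptions, not the paper's implementation.

        import numpy as np

        def scale_soft_symbols(soft_syms, est_snr_db, n_bits=6):
            # A mismatch between the receiver's soft-output scale and the LLR
            # scale the decoder expects costs implementation loss; scaling by
            # the channel reliability (2/sigma^2 for BPSK) recovers most of it.
            sigma2 = 10.0 ** (-est_snr_db / 10.0)   # unit-energy symbols assumed
            llrs = 2.0 * soft_syms / sigma2
            # Saturating fixed-point quantization, as an FPGA decoder would use.
            limit = 2 ** (n_bits - 1) - 1
            return np.clip(np.round(llrs), -limit, limit).astype(np.int8)

        rng = np.random.default_rng(4)
        syms = (1 - 2.0 * rng.integers(0, 2, 16384)) + 0.7 * rng.normal(size=16384)
        print(scale_soft_symbols(syms, est_snr_db=3.0)[:8])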

  19. Ultra-wideband communication system prototype using orthogonal frequency coded SAW correlators.

    PubMed

    Gallagher, Daniel R; Kozlovski, Nikolai Y; Malocha, Donald C

    2013-03-01

    This paper presents preliminary ultra-wideband (UWB) communication system results utilizing orthogonal frequency coded SAW correlators. Orthogonal frequency coding (OFC) and pseudo-noise (PN) coding provide a means for spread-spectrum UWB. The use of OFC spectrally spreads a PN sequence beyond that of CDMA, allowing for improved correlation gain. The transceiver approach is still very similar to that of the CDMA approach, but provides greater code diversity. Use of SAW correlators eliminates many of the costly components that are typically needed in the intermediate-frequency (IF) section of the transmitter and receiver, and greatly reduces the signal processing requirements. Development and results of an experimental prototype system with a center frequency of 250 MHz are presented. The prototype system is configured using modular RF components and a benchtop pulse generator and frequency source. The SAW correlation filters used in the test setup were designed using seven chip frequencies within the transducer. A fractional bandwidth of approximately 29% was implemented to exceed the defined UWB specification. Discussion of the filter design and results are presented and compared with packaged-device measurements. A prototype UWB system using OFC SAW correlators is demonstrated in wired and wireless configurations. OFC-coded SAW filters are used for generation of a transmitted spread-spectrum UWB signal and matched-filter correlated reception. Autocorrelation and cross-correlation system outputs are compared. The results demonstrate the feasibility of UWB SAW correlators for use in UWB communication transceivers.

  20. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.
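
    For a sense of what HCN weighting means in practice: the CMM-style calculation below sums concentration-to-limit ratios and damps each chemical's contribution by how closely its HCNs match the health effect of concern. All names, numbers, and weights in this Python sketch are hypothetical.

        # Hypothetical inputs: per-chemical air concentration, its protective
        # limit, and a weighting factor for how directly its Health Code
        # Numbers (HCNs) match the target health effect (1.0 = primary effect).
        chemicals = [
            # (name, concentration mg/m3, limit mg/m3, HCN weight)
            ("ammonia",  12.0,  140.0, 1.0),
            ("chlorine",  0.5,    2.8, 1.0),
            ("acetone",  80.0, 3240.0, 0.3),
        ]

        def hazard_index(chems):
            # Sum-of-ratios form; weights damp the contribution of chemicals
            # whose HCNs only weakly match the health effect of concern.
            return sum(w * conc / limit for _, conc, limit, w in chems)

        hi = hazard_index(chemicals)
        print(f"weighted hazard index = {hi:.3f}",
              "(> 1 would flag a potential exceedance)")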

  1. Aerodynamic design of axisymmetric hypersonic wind-tunnel nozzles using least-squares/parabolized Navier-Stokes procedure

    NASA Technical Reports Server (NTRS)

    Korte, John J.

    1992-01-01

    A new procedure unifying the best of present classical design practices, CFD, and optimization procedures is demonstrated for designing the aerodynamic lines of hypersonic wind tunnel nozzles. This procedure can be employed to design hypersonic wind tunnel nozzles with thick boundary layers, where the classical design procedure has been demonstrated to break down. The procedure allows full utilization of powerful CFD codes in the design process, solves an optimization problem to determine the new contour, may be used to design new nozzles or improve sections of existing nozzles, and automatically compensates the nozzle contour for viscous effects as part of the unified design procedure.

  2. Research on compression performance of ultrahigh-definition videos

    NASA Astrophysics Data System (ADS)

    Li, Xiangqun; He, Xiaohai; Qing, Linbo; Tao, Qingchuan; Wu, Di

    2017-11-01

    With the popularization of high-definition (HD) images and videos (1920×1080 pixels and above), there are now even 4K (3840×2160) television signals and 8K (8192×4320) ultrahigh-definition videos. The demand for HD images and videos is increasing continuously, along with the increasing data volume. The storage and transmission problems cannot be properly solved only by expanding hard-disk capacity and upgrading transmission devices. Based on full use of the high-efficiency video coding (HEVC) standard, super-resolution reconstruction technology, and the correlation between intra- and inter-prediction, we first put forward a "division-compensation"-based strategy to further improve the compression performance of a single image and frame I. Then, using this idea together with the HEVC encoder and decoder, a video compression coding framework is designed; HEVC is used inside the framework. Finally, with super-resolution reconstruction technology, the reconstructed video quality is further improved. The experiments show that the performance of the proposed compression method for a single image (frame I) and for video sequences is superior to that of HEVC in a low bit rate environment.

  3. Space Radiation Transport Code Development: 3DHZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The space radiation transport code HZETRN has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension, allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z ≤ 2), for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. Computed neutron, proton, and alpha fluences for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum show that the neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and PHITS are provided for a variety of boundary conditions and geometries. Improvements provided by the 3D corrections are made clear in the comparisons. Developments needed to connect 3DHZETRN to vehicle design and optimization studies are discussed. Future theoretical development will relax the forward-plus-isotropic interaction assumption to allow more general angular dependence.

  4. Multidisciplinary Modeling Software for Analysis, Design, and Optimization of HRRLS Vehicles

    NASA Technical Reports Server (NTRS)

    Spradley, Lawrence W.; Lohner, Rainald; Hunt, James L.

    2011-01-01

    The concept for Highly Reliable Reusable Launch Systems (HRRLS) under the NASA Hypersonics project is a two-stage-to-orbit, horizontal-takeoff/horizontal-landing (HTHL) architecture with an air-breathing first stage. The first-stage vehicle is a slender body with an air-breathing propulsion system that is highly integrated with the airframe. The lightweight slender body will deflect significantly during flight. This global deflection affects the flow over the vehicle and into the engine, and thus the loads and moments on the vehicle. High-fidelity multi-disciplinary analyses that account for these fluid-structure-thermal interactions are required to accurately predict the vehicle loads and the resultant response. These predictions of vehicle response to multiphysics loads, calculated with fluid-structural-thermal interaction, are required in order to optimize the vehicle design over its full operating range. This contract with ResearchSouth addresses one of the primary objectives of the Vehicle Technology Integration (VTI) discipline: the development of high-fidelity multi-disciplinary analysis and optimization methods and tools for HRRLS vehicles. The primary goal of this effort is the development of an integrated software system that can be used for full-vehicle optimization. This goal was accomplished by: 1) integrating the master code, FEMAP, into the multidiscipline software network to direct the coupling and assure accurate fluid-structure-thermal interaction solutions; 2) loosely coupling the Euler flow solver FEFLO to the available and proven aeroelasticity and large-deformation (FEAP) code; 3) providing a coupled Euler-boundary-layer capability for rapid viscous flow simulation; 4) developing and implementing improved Euler/RANS algorithms in the FEFLO CFD code to provide accurate shock capturing, skin friction, and heat-transfer predictions for HRRLS vehicles in hypersonic flow; 5) performing a Reynolds-averaged Navier-Stokes computation on an HRRLS configuration; 6) integrating the RANS solver with the FEAP code for coupled fluid-structure-thermal capability; 7) integrating the existing NASA SRGULL propulsion flow-path prediction software with the FEFLO software for quasi-3D propulsion flow-path predictions; and 8) improving, and integrating into the network, an existing adjoint-based design optimization code.

  5. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    PubMed

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance video presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards, which were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., a relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely the background reference prediction (BRP), which uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP), which predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher-quality background as the reference, whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only slightly higher encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
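
    To make the block-level logic concrete: the Python sketch below keeps a running-average background, classifies each block by its deviation from the background, and forms the BDP residual in the background-difference domain. The thresholds, block size, and running-average model are simplified stand-ins for the paper's actual background modeling.

        import numpy as np

        def update_background(bg, frame, alpha=0.02):
            # Simple running average standing in for the background modeling.
            return (1.0 - alpha) * bg + alpha * frame

        def classify_block(block, bg_block, t_lo=2.0, t_hi=12.0):
            err = np.abs(block - bg_block).mean()
            if err < t_lo:
                return "background"   # -> background reference prediction (BRP)
            if err < t_hi:
                return "hybrid"       # -> background difference prediction (BDP)
            return "foreground"       # -> conventional inter/intra prediction

        def bdp_residual(block, bg_block, ref_block, ref_bg_block):
            # BDP: predict in the background-difference domain, so the static
            # part cancels and only the moving content remains to be coded.
            return (block - bg_block) - (ref_block - ref_bg_block)

        rng = np.random.default_rng(5)
        bg = rng.uniform(0, 255, (16, 16))
        frame = bg + rng.normal(0, 1, (16, 16))   # a mostly static block
        print(classify_block(frame, bg))          # -> background
        bg = update_background(bg, frame)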

  6. Cardinality enhancement utilizing Sequential Algorithm (SeQ) code in OCDMA system

    NASA Astrophysics Data System (ADS)

    Fazlina, C. A. S.; Rashidi, C. B. M.; Rahman, A. K.; Aljunid, S. A.

    2017-11-01

    Optical Code Division Multiple Access (OCDMA) has become important with the increasing demand for high capacity and speed in optical communication networks, because of the high efficiency the OCDMA technique can achieve; hence the fibre bandwidth is fully used. In this paper we focus on the Sequential Algorithm (SeQ) code with the AND detection technique, using the Optisystem design tool. The results reveal that the SeQ code is capable of eliminating Multiple Access Interference (MAI) and improving Bit Error Rate (BER), Phase Induced Intensity Noise (PIIN), and orthogonality between users in the system. From the results, SeQ shows good BER performance and is capable of accommodating 190 simultaneous users, in contrast with existing codes. Thus, the SeQ code enhances the system by about 36% and 111% relative to the FCC and DCS codes, respectively. In addition, SeQ has good BER performance of 10^-25 at 155 Mbps, in comparison with the 622 Mbps, 1 Gbps, and 2 Gbps bit rates. From the plotted graphs, a 155 Mbps bit rate is fast enough for FTTH and LAN networks. These conclusions rest on the superior performance of the SeQ code; thus, these codes offer an opportunity in OCDMA systems for better quality of service in optical access networks for future generations' usage.

  7. Enhanced decoding for the Galileo low-gain antenna mission: Viterbi redecoding with four decoding stages

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Belongie, M.

    1995-01-01

    The Galileo low-gain antenna mission will be supported by a coding system that uses a (14,1/4) inner convolutional code concatenated with Reed-Solomon codes of four different redundancies. Decoding for this code is designed to proceed in four distinct stages of Viterbi decoding followed by Reed-Solomon decoding. In each successive stage, the Reed-Solomon decoder only tries to decode the highest redundancy codewords not yet decoded in previous stages, and the Viterbi decoder redecodes its data utilizing the known symbols from previously decoded Reed-Solomon codewords. A previous article analyzed a two-stage decoding option that was not selected by Galileo. The present article analyzes the four-stage decoding scheme and derives the near-optimum set of redundancies selected for use by Galileo. The performance improvements relative to one- and two-stage decoding systems are evaluated.
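
    As a control-flow illustration only: the Python sketch below uses toy stand-in decoders (none of Galileo's actual (14,1/4) Viterbi or Reed-Solomon machinery) to show how staged redecoding works, pinning symbols from already-decoded codewords so the next Viterbi pass and the remaining, less redundant codewords benefit.

        def viterbi_redecode(received, known):
            # Toy stand-in for the Viterbi pass: symbols pinned from previously
            # decoded Reed-Solomon words override the raw decisions.
            return [known.get(i, s) for i, s in enumerate(received)]

        def rs_decode(symbols, word):
            # Toy stand-in: 'word' lists one RS codeword's symbol indices and
            # how many errors it tolerates; a negative symbol marks an error.
            idx, t = word
            errors = sum(1 for i in idx if symbols[i] < 0)
            return {i: abs(symbols[i]) for i in idx} if errors <= t else None

        def staged_decode(received, words_by_stage):
            known = {}
            for words in words_by_stage:        # stages, most redundant first
                symbols = viterbi_redecode(received, known)
                for word in words:
                    out = rs_decode(symbols, word)
                    if out is not None:
                        known.update(out)       # pin for the next Viterbi pass
            return known

        # The stronger word decodes in stage 1; its pinned symbols then let the
        # weaker (zero-correction) word decode on the second pass.
        received = [1, -2, 3, -4, 5, 6, 7, 8]
        stages = [[([0, 1, 2, 3], 2)], [([1, 4, 5, 6], 0)]]
        print(staged_decode(received, stages))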

  8. 3D scene reconstruction based on multi-view distributed video coding in the Zernike domain for mobile applications

    NASA Astrophysics Data System (ADS)

    Palma, V.; Carli, M.; Neri, A.

    2011-02-01

    In this paper, a multi-view distributed video coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moment domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low-complexity video encoders compared with traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve its quality, a spatial view compensation/prediction in the Zernike moment domain is applied. Spatial and temporal motion activity have been fused together to obtain the overall side information. The proposed method has been evaluated by rate-distortion performance for different inter-view and temporal estimation quality conditions.

  9. TOOKUIL: A case study in user interface development for safety code application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL, which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post-processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  10. Codes That Support Smart Growth Development

    EPA Pesticide Factsheets

    Provides examples of local zoning codes that support smart growth development, categorized by: unified development code, form-based code, transit-oriented development, design guidelines, street design standards, and zoning overlay.

  11. Participatory design facilitates Person Centred Nursing in service improvement with older people: a secondary directed content analysis.

    PubMed

    Wolstenholme, Daniel; Ross, Helen; Cobb, Mark; Bowen, Simon

    2017-05-01

    To explore, using the example of a project working with older people in an outpatient setting in a large UK NHS teaching hospital, how the constructs of Person Centred Nursing are reflected in interviews with participants in a Co-design-led service improvement project. Person Centred Care and Person Centred Nursing are recognised terms in healthcare. Co-design (sometimes called participatory design) is an approach that seeks to involve all stakeholders in a creative process to deliver the best result, be this a product, technology or, in this case, a service. Co-design practice shares some of the underpinning philosophy of Person Centred Nursing and potentially has methods to aid in Person Centred Nursing implementation. The research design was a qualitative secondary directed content analysis. Seven interviews with nurses and older people who had participated in a Co-design-led improvement project in a large teaching hospital were transcribed and analysed. Two researchers analysed the transcripts for codes derived from McCormack & McCance's Person Centred Nursing Framework. The four most expressed codes were as follows: from the pre-requisites, knowing self; from care processes, engagement, working with patients' beliefs and values, and shared decision-making; and from expected outcomes, involvement in care. This study describes the Co-design theory and practice that the participants responded to in the interviews and looks at how the Co-design activity facilitated elements of the Person Centred Nursing framework. This study adds to the rich literature on using emancipatory and transformational approaches to Person Centred Nursing development, and is the first study explicitly exploring the potential contribution of Co-design to this area. Methods from Co-design allow older people to contribute as equals in a practice development project; Co-design methods can facilitate nursing staff to engage meaningfully with older participants and develop a shared understanding and goals. The co-produced outputs of Co-design projects embody and value the expressed beliefs and values of staff and older people. © 2016 The Authors. Journal of Clinical Nursing Published by John Wiley & Sons Ltd.

  12. The Influence of Building Codes on Recreation Facility Design.

    ERIC Educational Resources Information Center

    Morrison, Thomas A.

    1989-01-01

    Implications of building codes upon design and construction of recreation facilities are investigated (national building codes, recreation facility standards, and misperceptions of design requirements). Recreation professionals can influence architectural designers to correct past deficiencies, but they must understand architectural and…

  13. CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor

    NASA Technical Reports Server (NTRS)

    Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin

    2005-01-01

    This paper puts forth the case for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted, along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-ε turbulence model and by integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in the overall character and level of agreement with the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produced a degraded solution when compared to the wall function solution with the k-ε model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near-injector region and the peak heat flux level very well. However, it moderately overpredicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.

  14. Error Correcting Codes and Related Designs

    DTIC Science & Technology

    1990-09-30

    Theory, IT-37 (1991), 1222-1224. 6. Codes and designs, existence and uniqueness, Discrete Math., to appear. 7. (with R. Brualdi and N. Cai), Orphan...structure of the first order Reed-Muller codes, Discrete Math., to appear. 8. (with J. H. Conway and N. J. A. Sloane), The binary self-dual codes of length up...18, 1988. 4. "Codes and Designs," Mathematics Colloquium, Technion, Haifa, Israel, March 6, 1989. 5. "On the Covering Radius of Codes," Discrete Math. Group

  15. SBIR PHASE I FINAL REPORT: Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brust, Frederick W.; Punch, Edward F.; Kurth, Elizabeth A.

    2013-12-02

    Many US manufacturing companies have moved fabrication and production facilities offshore because of cheaper labor costs. A key aspect of bringing these jobs back to the US is the use of technology to render US-made fabrications more efficient overall, with higher quality. A new initiative of the current administration has the goal of enhancing competitiveness to retain manufacturing jobs in the US. One significant competitive advantage that has emerged in the US over the last two decades is the use of virtual design for fabrication of large structures in the light and heavy materials industries. Industries that have used virtual design and analysis tools have reduced material parts size, developed environmentally friendly fabrication processes, improved product quality and performance, and reduced manufacturing costs. Indeed, Caterpillar Inc. (CAT), one of the partners in this effort, continues to have a large fabrication presence in the US because of the use of weld fabrication modeling to optimize fabrications by controlling weld residual stresses and distortions and improving fatigue, corrosion, and fracture performance. This report describes the DOE SBIR Phase I results of Engineering Mechanics Corporation of Columbus (Emc2), which extended an existing, state-of-the-art software code, VFT (currently used to design and model large welded structures prior to fabrication), to a broader range of products with widespread applications for small and medium-sized enterprises (SMEs). VFT helps control distortion, can minimize and/or control residual stresses, control welding microstructure, and pre-determine welding parameters such as weld-sequencing, pre-bending, thermal-tensioning, etc. VFT uses material properties, consumable properties, etc. as inputs. Through VFT, manufacturing companies can avoid costly design changes after fabrication. This leads to the concept of joint design/fabrication, where these important disciplines are intimately linked to minimize fabrication costs. VFT currently is tied to a commercial solver, which makes it prohibitively expensive for use by SMEs, as there is a significant licensing cost for the solver, over and above the relatively minimal cost of VFT. Emc2 developed this software code over a number of years in close cooperation with CAT (Peoria, IL), who currently uses this code exclusively for worldwide fabrication, product design and development activities. The use of VFT has allowed CAT to move directly from design to product fabrication and helped eliminate (to a large extent) new product prototyping and subsequent testing. Additionally, CAT has been able to eliminate or reduce costly one-of-a-kind appliances used to reduce distortion effects due to fabrication. In this context, SMEs can realize the same kind of improved product quality and reduced cost through adoption of the adapted version of VFT for design and subsequent manufacture of new products. Emc2's DOE SBIR Phase I effort successfully adapted VFT so that SMEs have access to this sophisticated and proven methodology that is quick, accurate and cost effective and available on demand to address weld-simulation and fabrication problems prior to manufacture. The open source code WARP3D, a high performance finite element code mainly used in fracture and damage assessment of structures, was modified so that computational weld problems can be solved efficiently on multiple processors and threads with VFT.
The thermal solver for VFT, based on a series of closed form solution approximations, was enhanced for solution on multiple processors, greatly increasing overall speed. In addition, the graphical user interface (GUI) has been tailored to integrate these solutions with WARP3D. The GUI is used to define all the weld pass descriptions, number of passes, material properties, consumable properties, weld speed, etc. for the structure to be modeled. The GUI was improved to make it user-friendly for engineers who are not experts in finite element modeling. Finally, a plan for porting VFT onto the Ohio Supercomputer Center (OSC) through its hosted Manufacturing and Polymer Portal has been developed. This access route will permit SMEs to perform weld modeling to improve their competitiveness at a reasonable cost. All of these improvements are detailed in this report.

  16. Integrated design and manufacturing for the high speed civil transport (a combined aerodynamics/propulsion optimization study)

    NASA Technical Reports Server (NTRS)

    Baecher, Juergen; Bandte, Oliver; DeLaurentis, Dan; Lewis, Kemper; Sicilia, Jose; Soboleski, Craig

    1995-01-01

    This report documents the efforts of a Georgia Tech High Speed Civil Transport (HSCT) aerospace student design team in completing a design methodology demonstration under NASA's Advanced Design Program (ADP). Aerodynamic and propulsion analyses are integrated into the synthesis code FLOPS in order to improve its prediction accuracy. Executing the integrated product and process development (IPPD) methodology proposed at the Aerospace Systems Design Laboratory (ASDL), an improved sizing process is described, followed by a combined aero-propulsion optimization, where the objective function, average yield per revenue passenger mile ($/RPM), is constrained by flight stability, noise, approach speed, and field length restrictions. Primary goals include successful demonstration of the application of the response surface methodology (RSM) to parameter design, introduction of higher fidelity disciplinary analysis than is normally feasible at the conceptual and early preliminary level, and investigation of relationships between aerodynamic and propulsion design parameters and their effect on the objective function, $/RPM. A unique approach to aircraft synthesis is developed in which statistical methods, specifically design of experiments and the RSM, are used to more efficiently search the design space for optimum configurations. In particular, two uses of these techniques are demonstrated. First, response model equations are formed which represent complex analysis in the form of a regression polynomial. Next, a second regression equation is constructed, not for modeling purposes, but instead for the purpose of optimization at the system level. Such an optimization problem with the given tools normally would be difficult due to the need for hard connections between the various complex codes involved. The statistical methodology presents an alternative and is demonstrated via an example of aerodynamic modeling and planform optimization for a HSCT.
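
    A minimal sketch of the response-surface idea, assuming a two-variable normalized design space and an invented stand-in for the expensive analysis codes: run a small design of experiments, fit a quadratic regression polynomial, then optimize the cheap surrogate instead of the full analysis.

      # Response-surface methodology sketch; "expensive_analysis" is a
      # hypothetical placeholder for codes such as FLOPS, not the real tool.
      import numpy as np
      from scipy.optimize import minimize

      def expensive_analysis(x):
          return (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.1) ** 2 + 1.0

      # Full-factorial design of experiments on two normalized variables.
      pts = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], float)
      y = np.array([expensive_analysis(p) for p in pts])

      # Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2.
      def basis(x):
          return np.array([1.0, x[0], x[1], x[0] ** 2, x[1] ** 2, x[0] * x[1]])

      coeffs, *_ = np.linalg.lstsq(np.array([basis(p) for p in pts]), y, rcond=None)

      surrogate = lambda x: basis(x) @ coeffs          # cheap regression model
      best = minimize(surrogate, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
      print("surrogate optimum:", best.x, "predicted objective:", best.fun)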

  17. Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Marquart, Jed E.

    2005-01-01

    The development and analysis of turbomachinery components for industrial and aerospace applications has been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi-3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced user's manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reducing the dependency of new users on the code authors. In order to achieve the objectives listed, the following tasks were accomplished: 1) Parametric Study of Preconditioning Parameters and Other Code Inputs; 2) Code Modifications to Reduce Runtimes; 3) Investigation of Compiler Options to Reduce Code Runtime; and 4) Development/Enhancement of User's Manuals for Aardvark and Phantom.
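
    The kind of parametric convergence study described might be harnessed as below; this is an illustrative sketch only, with a hypothetical run_case wrapper standing in for an actual Aardvark or Phantom execution.

      # Illustrative parametric-sweep harness (not part of Aardvark/Phantom):
      # sweep candidate inputs and record iterations to a residual target.
      import itertools

      def run_case(cfl, smoothing):
          """Hypothetical stand-in: convergence cost as a function of inputs."""
          return int(4000 / cfl + 500 * smoothing)

      results = []
      for cfl, smoothing in itertools.product((1.0, 2.5, 5.0), (0.0, 0.5, 1.0)):
          results.append((cfl, smoothing, run_case(cfl, smoothing)))

      # Rank the candidate settings by iterations to convergence.
      for cfl, sm, iters in sorted(results, key=lambda r: r[2]):
          print(f"CFL={cfl:<4} smoothing={sm:<4} iterations={iters}")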

  18. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods and physical models over the last decades. Recently, INL has also been working to establish a code assessment plan, which aims to ensure improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including software design, user interface, and technical requirements. To this end, we first survey the literature (e.g., international and domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
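
    The RTM concept lends itself to a very small data-structure sketch; the requirement entries below are invented examples for illustration, not actual RELAP-7 requirements.

      # Toy Requirement Traceability Matrix: each requirement links to design
      # artifacts and assessment cases; unassessed requirements are flagged.
      from dataclasses import dataclass, field

      @dataclass
      class Requirement:
          req_id: str
          text: str
          design_refs: list = field(default_factory=list)
          test_cases: list = field(default_factory=list)

      rtm = [
          Requirement("SR-001", "Solve 1-D single-phase flow", ["SDD-3.1"], ["V&V-12"]),
          Requirement("SR-002", "Restart from checkpoint", ["SDD-7.4"], []),
      ]

      for req in rtm:
          status = "covered" if req.test_cases else "NOT ASSESSED"
          print(f"{req.req_id}: {status}")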

  19. Validation of the SINDA/FLUINT code using several analytical solutions

    NASA Technical Reports Server (NTRS)

    Keller, John R.

    1995-01-01

    The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. Although this code is an often-used design and analysis tool, its validation has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, the SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform-temperature object (lumped capacitance) was allowed to radiate to a fixed-temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined the filling of a tank by an ideal gas, in which there is both control volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite pressure reservoirs. The results of all these studies showed that, for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
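
    The first thermal benchmark has a classical closed form: the temperature in a semi-infinite slab with a sinusoidal surface temperature decays exponentially with depth and lags in phase. A sketch of that exact solution, with illustrative property values, is given below; a SINDA-type network solution could be compared against it.

      # Exact solution for a semi-infinite slab with periodic surface
      # temperature; property values are illustrative, not from the study.
      import math

      alpha = 1.0e-5          # thermal diffusivity (m^2/s), assumed
      period = 3600.0         # surface oscillation period (s)
      omega = 2 * math.pi / period
      delta = math.sqrt(2 * alpha / omega)   # thermal penetration depth (m)

      def slab_temperature(x, t, t_mean=300.0, amp=50.0):
          """Damped, phase-lagged response at depth x and time t."""
          return t_mean + amp * math.exp(-x / delta) * math.sin(omega * t - x / delta)

      for x in (0.0, 0.5 * delta, delta, 2 * delta):
          print(f"x = {x:.4f} m -> T(x, t=900 s) = {slab_temperature(x, 900.0):.2f} K")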

  20. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand alone analysis codes that result in the streamlining of the exchange of data between programs, reducing errors and improving efficiency.

  1. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    NASA Astrophysics Data System (ADS)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from the 2003 standard) have been exploited in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard-compliant Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also targeted maximum computational efficiency; the code is designed to run on shared-memory multi-core workstations and on distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: the documentation has been carefully taken into account; it is built upon comprehensive comments placed directly in the source files (no external documentation files are needed), which are parsed by the doxygen free software to produce high-quality html and latex documentation pages; the distributed version-control system git has been adopted in order to facilitate collaborative maintenance and improvement of the code. Copyright: OFF is free software that anyone can use, copy, distribute, study, change and improve under the GNU General Public License version 3. The present paper is a manifesto of the OFF code and presents the currently implemented features and ongoing developments. This work is focused on the computational techniques adopted, and a detailed description of the main API characteristics is reported. OFF's capabilities are demonstrated by means of one- and two-dimensional examples and a three-dimensional real application.
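
    The encapsulation idea can be suggested by a short sketch; it is written in Python for brevity (OFF itself uses Fortran 2003 derived types), and the member names are illustrative rather than OFF's actual API.

      # Sketch of the "one entity = one object" idea: a finite volume bundles
      # its geometry with its conservative variables behind one interface.
      from dataclasses import dataclass

      @dataclass
      class FiniteVolume:
          volume: float          # cell volume (geometry)
          density: float         # conservative variables follow
          momentum: tuple
          total_energy: float

          def conservative_vector(self):
              """Expose the conserved state as a single tuple."""
              return (self.density, *self.momentum, self.total_energy)

      cell = FiniteVolume(volume=1e-3, density=1.2,
                          momentum=(0.5, 0.0, 0.0), total_energy=2.5e5)
      print(cell.conservative_vector())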

  2. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation of and reasoning about program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  3. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code, for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation; it allows field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling the development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing, which could be utilized in the final design to verify the design optimized by the deterministic method.

  4. Using a data entry clerk to improve data quality in primary care electronic medical records: a pilot study.

    PubMed

    Greiver, Michelle; Barnsley, Jan; Aliarzadeh, Babak; Krueger, Paul; Moineddin, Rahim; Butt, Debra A; Dolabchian, Edita; Jaakkimainen, Liisa; Keshavjee, Karim; White, David; Kaplan, David

    2011-01-01

    The quality of electronic medical record (EMR) data is known to be problematic; research on improving these data is needed. The primary objective was to explore the impact of using a data entry clerk to improve data quality in primary care EMRs. The secondary objective was to evaluate the feasibility of implementing this intervention. We used a before-and-after design for this pilot study. The participants were 13 community-based family physicians and four allied health professionals in Toronto, Canada. Using queries programmed by a data manager, a data clerk was tasked with re-entering EMR information as coded or structured data for chronic obstructive pulmonary disease (COPD), smoking, specialist designations and interprofessional encounter headers. We measured data quality before and three to six months after the intervention. We evaluated feasibility by measuring acceptability to clinicians and workload for the clerk. After the intervention, coded COPD entries increased by 38% (P = 0.0001, 95% CI 23 to 51%); identifiable data on smoking categories increased by 27% (P = 0.0001, 95% CI 26 to 29%); referrals with specialist designations increased by 20% (P = 0.0001, 95% CI 16 to 22%); and identifiable interprofessional headers increased by 10% (P = 0.45, 95% CI -3 to 23%). Overall, the intervention was rated as being at least moderately useful and moderately usable. The data entry clerk spent 127 hours restructuring data for 11 729 patients. Utilising a data manager for queries and a data clerk to re-enter data led to improvements in EMR data quality. Clinicians found this approach to be acceptable.

  5. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1993-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX-11/780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update '92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update '93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode, with a level of flexibility suitable for practical application. An overview of the GRASP/Ada project with an emphasis on the current update is provided.

  6. 24 CFR 941.203 - Design and construction standards.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...

  7. 24 CFR 941.203 - Design and construction standards.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...

  8. NASIS data base management system: IBM 360 TSS implementation. Volume 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASIS data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  9. Evaluation of Passive Vents in New Construction Multifamily Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maxwell, Sean; Berger, David; Zuluaga, Marc

    Exhaust ventilation and corresponding outdoor air strategies are being implemented in high-performance, new-construction multifamily buildings to meet program or code requirements for improved indoor air quality. However, a lack of clear design guidance results in poor performance of these systems, despite the best intentions of the programs or standards.

  10. Rescuing Tomorrow Today: Fixing Training and Development for DHS Leaders

    DTIC Science & Technology

    2016-09-01

    [Report documentation page and table-of-contents fragments; recoverable keywords: training, personnel development, mentorship, e-learning, blended learning solutions, Hurricane Katrina. Section headings include "Hypotheses and Assumptions," "Research Design," "There Is Always Room for Improvement," and "Katrina Lessons Learned."]

  11. Research on Laser Marking Speed Optimization by Using Genetic Algorithm

    PubMed Central

    Wang, Dongyun; Yu, Qiwei; Zhang, Yu

    2015-01-01

    The laser marking machine is the most common coding equipment on product packaging lines. However, the speed of laser marking has become a production bottleneck. In order to remove this bottleneck, a new method based on a genetic algorithm is designed. On the basis of this algorithm, a controller was designed, and simulations and experiments were performed. The results show that using this algorithm could effectively improve laser marking efficiency by 25%. PMID:25955831
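
    As a hedged sketch of a genetic algorithm of the general kind described (the fitness model below is invented for illustration and is not the paper's controller model), one might evolve a marking-speed setting toward the fastest value that still satisfies a quality constraint:

      # Toy genetic algorithm: selection, arithmetic crossover, and Gaussian
      # mutation over a scalar marking speed; the quality model is invented.
      import random

      def fitness(speed):
          quality = max(0.0, 1.0 - (speed / 3000.0) ** 2)   # assumed quality model
          return speed if quality >= 0.8 else 0.0            # infeasible -> zero

      pop = [random.uniform(500, 3000) for _ in range(20)]
      for _ in range(50):                                    # generations
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]
          children = []
          for _ in range(10):
              a, b = random.sample(parents, 2)
              child = 0.5 * (a + b) + random.gauss(0, 25.0)  # crossover + mutation
              children.append(min(3000.0, max(500.0, child)))
          pop = parents + children

      print(f"best marking speed: {max(pop, key=fitness):.0f} mm/s")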

  12. Sage Simulation Model for Technology Demonstration Convertor by a Step-by-Step Approach

    NASA Technical Reports Server (NTRS)

    Demko, Rikako; Penswick, L. Barry

    2006-01-01

    The development of a Stirling model using the 1-D Sage design code was completed using a step-by-step approach. This is a method of gradually increasing the complexity of the Sage model while observing the energy balance and energy losses at each step of the development. This step-by-step model development and energy-flow analysis can clarify where the losses occur and their impact, and suggest possible opportunities for design improvement.

  13. A Novel Design of Reconfigurable Wavelength-Time Optical Codes to Enhance Security in Optical CDMA Networks

    NASA Astrophysics Data System (ADS)

    Nasaruddin; Tsujioka, Tetsuo

    An optical CDMA (OCDMA) system is a flexible technology for future broadband multiple-access networks. Security in OCDMA networks for broadband optical access is also becoming an issue of great importance. In this paper, we propose novel reconfigurable wavelength-time (W-T) optical codes that lead to secure transmission in OCDMA networks. The proposed W-T optical codes are constructed by using quasigroups (QGs) for wavelength hopping and one-dimensional optical orthogonal codes (OOCs) for time spreading; we call them QGs/OOCs. Both QGs and OOCs are randomly generated by a computer search to ensure that an eavesdropper could not improve its interception performance by making use of the coding structure. The proposed reconfigurable QGs/OOCs can then provide more codewords, and many different code-set patterns, which differ in both wavelength and time positions for given code parameters. Moreover, the bit error probability of the proposed codes is analyzed numerically. To realize the proposed codes, a secure system is proposed employing reconfigurable encoders/decoders based on arrayed waveguide gratings (AWGs), which allow the users to change their codeword patterns to protect against eavesdropping. Finally, the probability of breaking a certain codeword in the proposed system is evaluated analytically. The results show that the proposed codes and system can provide a large codeword pattern space and decrease the probability of breaking a certain codeword, enhancing OCDMA network security.
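
    A much-simplified illustration of the wavelength-time coding idea (not the paper's QG/OOC construction): give each user a few pulses at distinct time chips, each on a randomly chosen wavelength, and screen candidate code sets by their pairwise zero-shift cross-correlation.

      # Toy W-T codewords as sets of (time chip, wavelength) pulses; the
      # construction and screening criterion here are illustrative only.
      import random

      def random_codeword(n_time=32, n_wave=8, weight=4):
          times = random.sample(range(n_time), weight)       # distinct time chips
          return {(t, random.randrange(n_wave)) for t in times}

      def cross_correlation(a, b):
          return len(a & b)          # shared (time, wavelength) chips at zero shift

      codes = [random_codeword() for _ in range(10)]
      worst = max(cross_correlation(a, b)
                  for i, a in enumerate(codes) for b in codes[i + 1:])
      print("worst pairwise cross-correlation:", worst)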

  14. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling the turbopumps of rocket engines. Based on this method, a mean line pump flow code, PUMPA, has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps, as well as multistage pumps in series. The code features rapid input setup and short computer run times, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric exploration of the design space of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code, and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.
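
    A bare-bones mean-line relation of the kind such codes build upon is Euler's turbomachinery equation, sketched below with illustrative impeller values; PUMPA's actual models additionally account for losses, slip, and multistage effects.

      # Ideal Euler head for a centrifugal impeller with no inlet swirl;
      # all geometry and speed values are illustrative placeholders.
      import math

      g = 9.81
      def euler_head(rpm, q, r2=0.05, b2=0.008, beta2_deg=25.0):
          """Ideal head (m) versus flow rate q (m^3/s) for backswept blades."""
          u2 = 2 * math.pi * rpm / 60.0 * r2                   # tip speed (m/s)
          cm2 = q / (2 * math.pi * r2 * b2)                    # meridional velocity
          cu2 = u2 - cm2 / math.tan(math.radians(beta2_deg))   # exit swirl velocity
          return u2 * cu2 / g

      for q in (0.005, 0.010, 0.015, 0.020):                   # flow sweep -> map
          print(f"Q = {q:.3f} m^3/s -> H_ideal = {euler_head(3600, q):6.1f} m")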

  15. Final Scientific Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayes, Adam; Mayton, Mark; Rolland, Jannick

    2016-03-29

    Project 1: We have created a 3D optical research and design software platform for simulation and optimization, geared toward asymmetric, folded optical systems and new, enabling freeform surfaces. The software, Eikonal+, targets both institutional researchers and leading optical surface fabricators. With a modular design and the source code available to the development team at the University of Rochester, custom modules can be created for specific research interests, and this is accelerating the work on freeform optics currently being carried out at the Institute of Optics. With a research-based optical design environment, the fabrication, assembly, and testing industries can anticipate, innovate, and retool for the future of optical systems. Targeted proposals for science and innovation in freeform optics spanning design to fabrication, assembly, and testing can proceed with a level of technical transparency that has been unachievable in this field since the 1960s, when optics design code was commercialized and became unavailable to the research community for competitive reasons. Project 2: The University of Rochester Laboratory for Laser Energetics (LLE) collaborated with personnel from Flint Creek Resources (FCR) to develop technologies for the reclamation and reuse of cerium oxide based slurries intended for the polishing of optical components. The pilot process was evaluated and modifications were made to improve the collection of spent glass polish, to improve the efficiency and capacity of the recycling equipment, and to expand the customer base. A portable, self-contained system was developed and fabricated to recycle glass polishing compounds where the spent materials are produced.

  16. Numerical Design of Megawatt Gyrotron with 120 GHz Frequency and 50% Efficiency for Plasma Fusion Application

    NASA Astrophysics Data System (ADS)

    Kumar, Nitin; Singh, Udaybir; Kumar, Anil; Bhattacharya, Ranajoy; Singh, T. P.; Sinha, A. K.

    2013-02-01

    The design of a 120 GHz, 1 MW gyrotron for plasma fusion applications is presented in this paper. The mode selection is carried out with the aims of minimum mode competition, minimum cavity wall heating, etc. On the basis of the selected operating mode, the interaction cavity design and beam-wave interaction computation are carried out using a PIC code. The design of a triode-type Magnetron Injection Gun (MIG) is also presented. The trajectory code EGUN, the synthesis code MIGSYN and the data analysis code MIGANS are used in the MIG design. Further, the MIG design is also validated using another trajectory code, TRAK. The design results for the beam dumping system (collector) and the RF window are also presented. A depressed collector is designed to enhance the overall tube efficiency. The design study confirms >1 MW output power with a tube efficiency of around 50% (including collector efficiency).

  17. 49 CFR 1312.7 - STB tariff designation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... tariff designation consisting of: (1) The characters “STB”; (2) The assigned alpha code of the carrier or... tariff 1000-A could be designated 1000-B, etc. (b) Alpha codes. Alpha codes are assigned to carriers and...

  18. 49 CFR 1312.7 - STB tariff designation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... tariff designation consisting of: (1) The characters “STB”; (2) The assigned alpha code of the carrier or... tariff 1000-A could be designated 1000-B, etc. (b) Alpha codes. Alpha codes are assigned to carriers and...

  19. 49 CFR 1312.7 - STB tariff designation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... tariff designation consisting of: (1) The characters “STB”; (2) The assigned alpha code of the carrier or... tariff 1000-A could be designated 1000-B, etc. (b) Alpha codes. Alpha codes are assigned to carriers and...

  20. 49 CFR 1312.7 - STB tariff designation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... tariff designation consisting of: (1) The characters “STB”; (2) The assigned alpha code of the carrier or... tariff 1000-A could be designated 1000-B, etc. (b) Alpha codes. Alpha codes are assigned to carriers and...

  1. 49 CFR 1312.7 - STB tariff designation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... tariff designation consisting of: (1) The characters “STB”; (2) The assigned alpha code of the carrier or... tariff 1000-A could be designated 1000-B, etc. (b) Alpha codes. Alpha codes are assigned to carriers and...

  2. MILCOM '85 - Military Communications Conference, Boston, MA, October 20-23, 1985, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    The present conference on the development status of communications systems in the context of electronic warfare gives attention to topics in spread spectrum code acquisition, digital speech technology, fiber-optic communications, free-space optical communications, the networking of HF systems, and applications and evaluation methods for digital speech. Also treated are issues in local area network system design, coding techniques and applications, technology applications for HF systems, receiver technologies, software development status, channel simulation/prediction methods, C3 networking, spread spectrum networks, the improvement of communication efficiency and reliability through technical control methods, mobile radio systems, and adaptive antenna arrays. Finally, communications system cost analyses, spread spectrum performance, voice and image coding, switched networks, and microwave GaAs ICs are considered.

  3. Running an ethical trial 60 years after the Nuremberg Code.

    PubMed

    Markman, Jonathan R; Markman, Maurie

    2007-12-01

    The Nuremberg Code has served as a foundation for ethical clinical research since its publication 60 years ago. This landmark document, developed in response to the horrors of human experimentation done by Nazi physicians and investigators, focused crucial attention on the fundamental rights of research participants and on the responsibilities of investigators. Although the Nuremberg Code has provided an important framework for discussions on the requirements of ethical clinical research, and has resulted in the development of other initiatives, such as the Declaration of Helsinki and the Belmont Report, designed to ensure the rights and safety of human beings taking part in medical research, knowledge of both past events and the current complexity of research suggests further improvements are necessary in the existing approaches to human clinical research.

  4. HSR combustion analytical research

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1992-01-01

    Increasing the pressure and temperature of the engines of a new generation of supersonic airliners increases the emissions of nitrogen oxides (NO(x)) to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of evolving and implementing low emissions combustor technologies, NASA LeRC has pursued a combustion analysis code program to guide combustor design processes, to identify potential concepts of the greatest promise, and to optimize them at low cost, with short turnaround time. The computational analyses are evaluated at actual engine operating conditions. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts were made in further improving the code capabilities for modeling the physics and the numerical methods of solution. Then test cases and measurements from experiments are used for code validation.

  5. COMMIX-PPC: A three-dimensional transient multicomponent computer program for analyzing performance of power plant condensers. Volume 1, Equations and numerics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chien, T.H.; Domanus, H.M.; Sha, W.T.

    1993-02-01

    The COMMIX-PPC computer program is an extended and improved version of earlier COMMIX codes and is specifically designed for evaluating the thermal performance of power plant condensers. The COMMIX codes are general-purpose computer programs for the analysis of fluid flow and heat transfer in complex industrial systems. In COMMIX-PPC, two major features have been added to previously published COMMIX codes. One feature is the incorporation of one-dimensional equations of conservation of mass, momentum, and energy on the tube side and the proper accounting for the thermal interaction between shell and tube side through the porous-medium approach. The other added feature is the extension of the three-dimensional conservation equations for shell-side flow to treat the flow of a multicomponent medium. COMMIX-PPC is designed to perform steady-state and transient, three-dimensional analysis of fluid flow with heat transfer in a power plant condenser. However, the code is designed in a generalized fashion so that, with some modification, it can be used to analyze processes in any heat exchanger or other single-phase engineering applications. Volume I (Equations and Numerics) of this report describes in detail the basic equations, formulation, solution procedures, and models for the phenomena. Volume II (User's Guide and Manual) contains the input instructions, flow charts, sample problems, and descriptions of available options and boundary conditions.

  6. ICAN Computer Code Adapted for Building Materials

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties, including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that are characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as part of the cooperative activity.

  7. Structural design, analysis, and code evaluation of an odd-shaped pressure vessel

    NASA Astrophysics Data System (ADS)

    Rezvani, M. A.; Ziada, H. H.

    1992-12-01

    An effort to design, analyze, and evaluate a rectangular pressure vessel is described. Normally pressure vessels are designed in circular or spherical shapes to prevent stress concentrations. In this case, because of operational limitations, the choice of vessels was limited to a rectangular pressure box with a removable cover plate. The American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code is used as a guideline for pressure containments whose width or depth exceeds 15.24 cm (6.0 in.) and where pressures will exceed 103.4 kPa (15.0 lbf/in²). This evaluation used Section VIII of this Code, hereafter referred to as the Code. The dimensions and working pressure of the subject vessel fall within the pressure vessel category of the Code. The Code design guidelines and rules do not directly apply to this vessel. Therefore, finite-element methodology was used to analyze the pressure vessel, and the Code then was used in qualifying the vessel to be stamped to the Code. Section VIII, Division 1 of the Code was used for evaluation. This action was justified by selecting a material for which fatigue damage would not be a concern. The stress analysis results were then checked against the Code, and the thicknesses adjusted to satisfy Code requirements. Although not directly applicable, the Code design formulas for rectangular vessels were also considered and presented.

  8. A source-channel coding approach to digital image protection and self-recovery.

    PubMed

    Sarreshtedari, Saeed; Akhaee, Mohammad Ali

    2015-07-01

    Watermarking algorithms have recently been widely applied to the field of image forensics. One such forensic application is the protection of images against tampering. For this purpose, we need to design a watermarking algorithm fulfilling two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper aims to show that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is dedicated to three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, erasure locations detected by the check bits help the channel erasure decoder to retrieve the original source-encoded image. Experimental results show that our proposed scheme significantly outperforms recent techniques in terms of image quality for both the watermarked and the recovered image. The watermarked image quality gain is achieved by spending less of the bit-budget on the watermark, while image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes.
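
    A toy demonstration of the erasure-channel view (much weaker than the channel codes the paper would use): if the check bits reveal where tampering occurred, even a single XOR parity block can restore one erased block of reference bits exactly.

      # Single-parity erasure recovery: XOR of all data blocks forms the
      # parity; any one erased block is the XOR of the survivors and parity.
      from functools import reduce

      def xor_blocks(blocks):
          return reduce(lambda x, y: bytes(a ^ b for a, b in zip(x, y)), blocks)

      blocks = [b"refA", b"refB", b"refC"]           # reference-bit blocks
      parity = xor_blocks(blocks)                    # channel-code parity block

      erased_index = 1                               # location found by check bits
      survivors = [blk for i, blk in enumerate(blocks) if i != erased_index]
      recovered = xor_blocks(survivors + [parity])   # erasure decoding

      assert recovered == blocks[erased_index]
      print("recovered block:", recovered)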

  9. Channel modeling, signal processing and coding for perpendicular magnetic recording

    NASA Astrophysics Data System (ADS)

    Wu, Zheng

    With the increasing areal density in magnetic recording systems, perpendicular recording has replaced longitudinal recording to overcome the superparamagnetic limit. Studies on perpendicular recording channels including aspects of channel modeling, signal processing and coding techniques are presented in this dissertation. To optimize a high density perpendicular magnetic recording system, one needs to know the tradeoffs between various components of the system including the read/write transducers, the magnetic medium, and the read channel. We extend the work by Chaichanavong on the parameter optimization for systems via design curves. Different signal processing and coding techniques are studied. Information-theoretic tools are utilized to determine the acceptable region for the channel parameters when optimal detection and linear coding techniques are used. Our results show that a considerable gain can be achieved by the optimal detection and coding techniques. The read-write process in perpendicular magnetic recording channels includes a number of nonlinear effects. Nonlinear transition shift (NLTS) is one of them. The signal distortion induced by NLTS can be reduced by write precompensation during data recording. We numerically evaluate the effect of NLTS on the read-back signal and examine the effectiveness of several write precompensation schemes in combating NLTS in a channel characterized by both transition jitter noise and additive white Gaussian electronics noise. We also present an analytical method to estimate the bit-error-rate and use it to help determine the optimal write precompensation values in multi-level precompensation schemes. We propose a mean-adjusted pattern-dependent noise predictive (PDNP) detection algorithm for use on the channel with NLTS. We show that this detector can offer significant improvements in bit-error-rate (BER) compared to conventional Viterbi and PDNP detectors. Moreover, the system performance can be further improved by combining the new detector with a simple write precompensation scheme. Soft-decision decoding for algebraic codes can improve performance for magnetic recording systems. In this dissertation, we propose two soft-decision decoding methods for tensor-product parity codes. We also present a list decoding algorithm for generalized error locating codes.
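
    Write precompensation can be suggested with a toy model (illustrative only, not taken from the dissertation): a transition written shortly after a previous one is assumed to shift early by an NLTS amount, so the writer delays it by a compensating offset chosen for that preceding pattern.

      # Toy pattern-dependent write precompensation; shift and window values
      # are invented, and times are in bit periods.
      def apply_precomp(transition_times, nlts_shift=0.15, window=1.0):
          """Delay any transition that closely follows the previous one."""
          out = []
          for i, t in enumerate(transition_times):
              if i > 0 and t - transition_times[i - 1] <= window:
                  out.append(t + nlts_shift)   # pre-delay to cancel the early shift
              else:
                  out.append(t)
          return out

      print(apply_precomp([0.0, 1.0, 3.0, 4.0]))   # -> [0.0, 1.15, 3.0, 4.15]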

  10. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database

    PubMed Central

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-01

    Objective To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. Design A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Setting Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Main outcome measure Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding ‘poor’ quality practices with at least 10% incident patients inaccurately coded between 2004 and 2014. Results Incidence rates and accuracy of coding varied widely between practices and the trends differed according to selected category of code. If diagnosis codes were used, the incidence of type 2 increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), and then flattened off, until 2009, after which they decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled ‘poor’ quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 after 2009 became less marked and incidence rates were higher. Conclusions In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. PMID:28122831

  11. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Plummer, Mitch; Podgorney, Robert

    2016-02-01

    Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  12. Designing an Intervention for Women with Systemic Lupus Erythematosus from Medically Underserved Areas to Improve Care: A Qualitative Study

    PubMed Central

    Feldman, Candace H; Bermas, Bonnie L; Zibit, Melanie; Fraser, Patricia; Todd, Derrick J; Fortin, Paul R; Massarotti, Elena; Costenbader, Karen H

    2013-01-01

    Objective Systemic lupus erythematosus (lupus) disproportionately affects women, racial/ethnic minorities and low-income populations. We held focus groups for women from medically underserved communities to discuss interventions to improve care. Methods From our Lupus Registry, we invited 282 women, >18 years, residing in urban, medically underserved areas. Hospital-based clinics and support groups also recruited participants. Women were randomly assigned to 3 focus groups. 75-minute sessions were recorded, transcribed and coded thematically using interpretative phenomenologic analysis and single counting methods. We categorized interventions by benefits, limitations, target populations and implementation questions. Results 29 women with lupus participated in 3 focus groups (n=9, 9, 11). 80% were African American and 83% were from medically underserved zip codes. Themes included the desire for lupus education, isolation at the time of diagnosis, emotional and physical barriers to care, and the need for assistance navigating the healthcare system. 20 of 29 participants (69%) favored a peer support intervention; 17 (59%) also supported a lupus health passport. Newly diagnosed women were optimal intervention targets. Improvements in quality of life and mental health were proposed outcome measures. Conclusion Women with lupus from medically underserved areas have unique needs best addressed with an intervention designed through collaboration between community members and researchers. PMID:23087258

  13. ARCADIA® - A New Generation of Coupled Neutronics/Core Thermal-Hydraulics Code System at AREVA NP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curca-Tivig, Florin; Merk, Stephan; Pautz, Andreas

    2007-07-01

    Anticipating future needs of our customers and seeking to concentrate the synergies and competences existing in the company for the benefit of our customers, AREVA NP decided in 2002 to develop the next generation of coupled neutronics/core thermal-hydraulics (TH) code systems for fuel assembly and core design calculations for both PWR and BWR applications. The global CONVERGENCE project was born: after a feasibility study of one year (2002) and a conceptual phase of another year (2003), development was started at the beginning of 2004. The present paper introduces the CONVERGENCE project, presents the main features of the new code system ARCADIA® and concludes on customer benefits. ARCADIA® is designed to meet AREVA NP's market and customers' requirements worldwide. Besides state-of-the-art physical modeling, numerical performance and industrial functionality, the ARCADIA® system features state-of-the-art software engineering. The new code system will bring a series of benefits for our customers: e.g. improved accuracy for heterogeneous cores (MOX/UOX, Gd...), better description of nuclide chains, and access to local neutronics/thermal-hydraulics and possibly thermal-mechanical information (3D pin-by-pin full-core modeling). ARCADIA is a registered trademark of AREVA NP. (authors)

  14. Use of Existing CAD Models for Radiation Shielding Analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. T.; Barzilla, J. E.; Wilson, P.; Davis, A.; Zachman, J.

    2015-01-01

    The utility of a radiation exposure analysis depends not only on the accuracy of the underlying particle transport code, but also on the accuracy of the geometric representations of both the vehicle used as radiation shielding mass and the phantom representation of the human form. The current NASA Space Radiation Analysis Group (SRAG) process to determine crew radiation exposure in a vehicle design incorporates both output from the analytic high-charge-and-energy transport code HZETRN and the properties (i.e., material thicknesses) of a previously processed drawing. This geometry pre-process can be time-consuming, and the results are less accurate than those determined using a Monte Carlo-based particle transport code. The current work aims to improve this process. Although several Monte Carlo programs (FLUKA, Geant4) are readily available, most use an internal geometry engine. The lack of an interface with the standard CAD formats used by the vehicle designers limits the ability of the user to communicate complex geometries. Translation of native CAD drawings into a format readable by these transport programs is time-consuming and prone to error. The Direct Accelerated Geometry-United (DAGU) project is intended to provide an interface between the native vehicle or phantom CAD geometry and multiple particle transport codes, to minimize problem setup, computing time and analysis error.

  15. Two-dimensional over-all neutronics analysis of the ITER device

    NASA Astrophysics Data System (ADS)

    Zimin, S.; Takatsu, Hideyuki; Mori, Seiji; Seki, Yasushi; Satoh, Satoshi; Tada, Eisuke; Maki, Koichi

    1993-07-01

    The present work attempts to carry out a comprehensive neutronics analysis of the International Thermonuclear Experimental Reactor (ITER) developed during the Conceptual Design Activities (CDA). Two-dimensional cylindrical over-all calculational models of the ITER CDA device, including the first wall, blanket, shield, vacuum vessel, magnets, cryostat and support structures, were developed for this purpose with the help of the DOGII code. The two-dimensional DOT 3.5 code with the FUSION-40 nuclear data library was employed for transport calculations of neutron and gamma ray fluxes, tritium breeding ratio (TBR), and nuclear heating in reactor components. The induced-activity calculation code CINAC was employed to calculate the exposure dose rate after reactor shutdown around the ITER CDA device. The two-dimensional over-all calculational model includes design specifics such as the pebble-bed Li2O/Be layered blanket, the thin double-wall vacuum vessel, the concrete cryostat integrated with the over-all ITER design, the top maintenance shield plug, and the additional ring biological shield placed under the top cryostat lid around the above-mentioned top maintenance shield plug. Some alternative design options, such as the water-rich shielding blanket instead of the lithium-bearing one and the additional biological shield plug at the top zone between the poloidal field (PF) coil No. 5 and the maintenance shield plug, were calculated as well. Much effort was focused on analysis of the obtained results, aimed at recommendations for improving the ITER CDA design.

  16. An integrated optimum design approach for high speed prop rotors

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Mccarthy, Thomas R.

    1995-01-01

    The objective is to develop an optimization procedure for high-speed and civil tilt-rotors by coupling all of the necessary disciplines within a closed-loop optimization procedure. Both simplified and comprehensive analysis codes are used for the aerodynamic analyses. The structural properties are calculated using in-house developed algorithms for both isotropic and composite box beam sections. There are four major objectives of this study. (1) Aerodynamic optimization: The effects of blade aerodynamic characteristics on cruise and hover performance of prop-rotor aircraft are investigated using the classical blade element momentum approach with corrections for the high lift capability of rotors/propellers. (2) Coupled aerodynamic/structures optimization: A multilevel hybrid optimization technique is developed for the design of prop-rotor aircraft. The design problem is decomposed into a level for improved aerodynamics with continuous design variables and a level with discrete variables to investigate composite tailoring. The aerodynamic analysis is based on that developed in objective 1 and the structural analysis is performed using an in-house code which models a composite box beam. The results are compared to both a reference rotor and the optimum rotor found in the purely aerodynamic formulation. (3) Multipoint optimization: The multilevel optimization procedure of objective 2 is extended to a multipoint design problem in which performance at three flight conditions, hover, cruise, and take-off, is maximized simultaneously. (4) Coupled rotor/wing optimization: Using the comprehensive rotary wing code CAMRAD, an optimization procedure is developed for the coupled rotor/wing performance in high-speed tilt-rotor aircraft. The developed procedure contains design variables which define the rotor and wing planforms.

  17. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and also has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
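    The record mentions least-squares solutions of general linear models among the implemented utilities. For orientation, a minimal Python sketch of the underlying computation follows; it is illustrative only and is not the SOCR Java API.

```python
# Illustrative simple-linear-regression fit via least squares; not the SOCR API.
import numpy as np

def simple_linear_regression(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    X = np.column_stack([np.ones_like(x), x])     # design matrix [1, x]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares solution
    return beta[0], beta[1]

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.1])
print(simple_linear_regression(x, y))  # intercept ~ 0.0, slope ~ 2.03
```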

  18. Directed educational training improves coding and billing skills for residents.

    PubMed

    Benke, James R; Lin, Sandra Y; Ishman, Stacey L

    2013-03-01

    To determine if coding and billing acumen improves after a single directed educational training session. Case-control series. Fourteen otolaryngology practitioners, including trainees, each completed two clinical scenarios before and after a directed educational session covering basic skills and common mistakes in otolaryngology billing and coding. Ten practitioners had never coded before, while four regularly billed and coded in a clinical setting. Individuals with no previous billing experience had a mean score of 54% (median 55%) before the educational session, which was significantly lower than that of the experienced billers, who averaged 82% (median 83%, p=0.002). After the educational billing and coding session, the inexperienced billers' mean score improved to 62% (median 67%), which was still statistically lower than that of the experienced billers, who averaged 76% (median 75%, p=0.039). The inexperienced billers demonstrated a significant improvement in their total score after the intervention (p=0.019); however, the change observed in experienced billers before and after the educational intervention was not significant (p=0.469). Billing and coding skill improved after a single directed education session. Residents, who are not responsible for regular billing and coding, were found to have the greatest improvement in skill. However, providers who regularly bill and code had no significant improvement after this session. These data suggest that a single 90-minute billing and coding education session is effective in preparing those with limited experience to competently bill and code. Copyright © 2012. Published by Elsevier Ireland Ltd.

  19. A Silent Revolution: From Sketching to Coding--A Case Study on Code-Based Design Tool Learning

    ERIC Educational Resources Information Center

    Xu, Song; Fan, Kuo-Kuang

    2017-01-01

    With the rise of information technology, Computer Aided Design activities are becoming more modern and more complex, but learning how to operate these new design tools has become the main problem facing each designer. This study set out to identify the problems encountered during the code-based design tool learning period of…

  20. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blyth, Taylor S.; Avramova, Maria

    The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects introduced by the presence of spacer grids in light water reactor (LWR) cores are dissected into four corresponding basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.

  1. The international implications of national and local coordination on building energy codes: Case studies in six cities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Meredydd; Yu, Sha; Staniszewski, Aaron

    Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline with existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation; however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. Understanding gaps in implementation can also help in designing more targeted interventions to scale up energy savings.

  2. The international implications of national and local coordination on building energy codes: Case studies in six cities

    DOE PAGES

    Evans, Meredydd; Yu, Sha; Staniszewski, Aaron; ...

    2018-04-17

    Building energy efficiency is an important strategy for reducing greenhouse gas emissions globally. In fact, 55 countries have included building energy efficiency in their Nationally Determined Contributions (NDCs) under the Paris Agreement. This research uses building energy code implementation in six cities across different continents as case studies to assess what it may take for countries to implement the ambitions of their energy efficiency goals. Specifically, we look at the cases of Bogota, Colombia; Da Nang, Vietnam; Eskisehir, Turkey; Mexico City, Mexico; Rajkot, India; and Tshwane, South Africa, all of which are “deep dive” cities under the Sustainable Energy for All's Building Efficiency Accelerator. The research focuses on understanding the baseline with existing gaps in implementation and coordination. The methodology used a combination of surveys on code status and interviews with stakeholders at the local and national level, as well as review of published documents. We looked at code development, implementation, and evaluation. The cities are all working to improve implementation; however, the challenges they currently face include gaps in resources, capacity, tools, and institutions to check for compliance. Better coordination between national and local governments could help improve implementation, but that coordination is not yet well established. For example, all six of the cities reported that there was little to no involvement of local stakeholders in development of the national code; only one city reported that it had access to national funding to support code implementation. More robust coordination could better link cities with capacity building and funding for compliance, and ensure that the code reflects local priorities. Understanding gaps in implementation can also help in designing more targeted interventions to scale up energy savings.

  3. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    NASA Astrophysics Data System (ADS)

    Blyth, Taylor S.

    The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects introduced by the presence of spacer grids in light water reactor (LWR) cores are dissected into four corresponding basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.

  4. Multi-point optimization of recirculation flow type casing treatment in centrifugal compressors

    NASA Astrophysics Data System (ADS)

    Tun, Min Thaw; Sakaguchi, Daisaku

    2016-06-01

    A high pressure ratio and a wide operating range are required of turbochargers in diesel engines. A recirculation-flow-type casing treatment is effective for flow range enhancement of centrifugal compressors. Two ring grooves on a suction pipe and a shroud casing wall are connected by means of an annular passage, and a stable recirculation flow is formed at small flow rates from the downstream groove toward the upstream groove through the annular bypass. The shape of the baseline recirculation-flow-type casing is modified and optimized by using a multi-point optimization code with a metamodel-assisted evolutionary algorithm embedding the commercial CFD code CFX from ANSYS. The optimization yields a casing design with improved adiabatic efficiency over a wide operating flow range. A sensitivity analysis of the design parameters with respect to efficiency has been performed. It is found that the optimized casing design provides an optimized recirculation flow rate, for which the increment of entropy rise is minimized at the grooves and in the passages of the rotating impeller.

  5. Convection and chemistry effects in CVD: A 3-D analysis for silicon deposition

    NASA Technical Reports Server (NTRS)

    Gokoglu, S. A.; Kuczmarski, M. A.; Tsui, P.; Chait, A.

    1989-01-01

    The computational fluid dynamics code FLUENT has been adopted to simulate the entire rectangular-channel-like (3-D) geometry of an experimental CVD reactor designed for Si deposition. The code incorporated the effects of both homogeneous (gas phase) and heterogeneous (surface) chemistry with finite reaction rates of important species existing in silane dissociation. The experiments were designed to elucidate the effects of gravitationally-induced buoyancy-driven convection flows on the quality of the grown Si films. This goal is accomplished by contrasting the results obtained from a carrier gas mixture of H2/Ar with the ones obtained from the same molar mixture ratio of H2/He, without any accompanying change in the chemistry. Computationally, these cases are simulated in the terrestrial gravitational field and in the absence of gravity. The numerical results compare favorably with experiments. Powerful computational tools provide invaluable insights into the complex physicochemical phenomena taking place in CVD reactors. Such information is essential for the improved design and optimization of future CVD reactors.

  6. The openEHR Java reference implementation project.

    PubMed

    Chen, Rong; Klein, Gunnar

    2007-01-01

    The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual-model approach, with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world, and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.

  7. SAFETY IN THE DESIGN OF SCIENCE LABORATORIES AND BUILDING CODES.

    ERIC Educational Resources Information Center

    Horowitz, Harold

    The design of college and university buildings used for scientific research and education is discussed in terms of laboratory safety and building codes and regulations. Major topic areas are: (1) safety-related design features of science laboratories, (2) laboratory safety and building codes, and (3) evidence of unsafe design. Examples emphasize…

  8. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes. The result is a streamlined exchange of data between programs, reducing errors and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  9. Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators

    NASA Astrophysics Data System (ADS)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-03-01

    We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N²) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations, and their universal encoded-basis gate sets.
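    For orientation, the familiar quantum Hamming bound for a nondegenerate [[n,k]] qubit stabilizer code correcting t errors is reproduced below; the bounds derived in this paper are the analogous counting arguments for photon-loss errors in the χ(2) code subspaces.

```latex
% Quantum Hamming bound: each of the 3^j \binom{n}{j} weight-j Pauli errors
% must map the 2^k-dimensional code space to an orthogonal subspace of the
% 2^n-dimensional Hilbert space.
2^{k} \sum_{j=0}^{t} 3^{j} \binom{n}{j} \;\le\; 2^{n}
```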

  10. Xyce parallel electronic simulator users guide, version 6.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
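    The DAE formulation mentioned in this and the following records can be written generically: modified nodal analysis keeps a differential (charge/flux) term separate from an algebraic (resistive) term, which is what decouples device models from solver algorithms. This is a generic sketch of the form, not Xyce's internal notation.

```latex
% Generic circuit DAE: q(x) collects charges and fluxes, f(x) the resistive
% currents, b(t) the independent sources, x(t) the node voltages and
% branch currents.
\frac{d}{dt}\, q\bigl(x(t)\bigr) + f\bigl(x(t)\bigr) = b(t)
```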

  11. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  12. Xyce parallel electronic simulator users guide, version 6.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state of the art in the following areas: capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms and allows one to develop new types of analysis without requiring the implementation of analysis-specific device models; device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase, a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  13. Toward enhancing the distributed video coder under a multiview video codec framework

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua

    2016-11-01

    The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance side information (SI) and improve the MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images. (2) The block transform coefficient properties, i.e., DCs and ACs, were exploited to design the priority rate control for the turbo code, such that the DVC decoding can be carried out with the fewest parity bits. In comparison, the proposed COMPETE method demonstrated lower time complexity while presenting better reconstructed video quality. Simulations show that the proposed COMPETE can reduce the time complexity of MVME by a factor of 1.29 to 2.56 compared with previous hybrid MVME methods, while the peak signal-to-noise ratios (PSNRs) of a decoded video can be improved by 0.2 to 3.5 dB compared with H.264/AVC intracoding.

  14. Novel Coatings for Enhancement of Light-Emitting Diodes (LEDs)

    DTIC Science & Technology

    2006-10-28

    quantum efficiency of LEDs. SAIC's strength in this area is a proprietary nonimaging optics code. In consultation with Lumileds, SAIC developed designs for...five different optical elements that collect and project light from a LED. The simulations showed that the designs achieve a significant improvement... optical microscope at 100x power the coating is not visible. Fire sample 6Dec04P1 in tube furnace. As a first test, sample 6Dec04P1 was progressively

  15. Numerically-Based Ducted Propeller Design Using Vortex Lattice Lifting Line Theory

    DTIC Science & Technology

    2008-01-01

    greatly improved data visualization which includes graphic output and three-dimensional renderings. OpenProp was designed to perform two primary ...MATLAB® Code B.1 Q2half.m %Q2half: Legendre function of the second kind and positive half order %Ref: Handbook of Math Functions, Abramowitz and...134035, %Q2half(6)=.0382887, Q2half(8.4)=.0229646, Q2half(10)=.0176449 B.2 Q2Mhalf.m %Q2Mhalf: Legendre function of the second kind and minus half

  16. Nuclear Data Uncertainty Propagation to Reactivity Coefficients of a Sodium Fast Reactor

    NASA Astrophysics Data System (ADS)

    Herrero, J. J.; Ochoa, R.; Martínez, J. S.; Díez, C. J.; García-Herranz, N.; Cabellos, O.

    2014-04-01

    The assessment of the uncertainty levels on the design and safety parameters of the innovative European Sodium Fast Reactor (ESFR) is mandatory. Some of these relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified here. In addition, the nuclear reaction data for which an improvement will certainly benefit design accuracy are identified. This work was performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.
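    Uncertainty propagation of this kind typically follows the first-order "sandwich" rule, stated here generically rather than as the SCALE-specific implementation: with S the vector of sensitivities of a reactivity coefficient R to the cross sections σ_i, and M their covariance matrix,

```latex
\operatorname{var}(R) = S^{\mathsf{T}} M\, S,
\qquad
S_i = \frac{\partial R}{\partial \sigma_i}
```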

  17. Development of Improved Design and 3D Printing Manufacture of Cross-Flow Fan Rotor

    DTIC Science & Technology

    2016-06-01

    the design study, each solver run was monitored. Plotting the value of the mass flows, as well as the torque on the rotor blades, allowed a simple... This study determined the optimum blade stagger angle for a cross-flow fan rotor and evaluated the...parametric study determined optimum blade stagger angle using thrust, power, and thrust-to-power ratio as desired output variables. A MarkForged Mark One 3D

  18. Engine dynamic analysis with general nonlinear finite element codes

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1991-01-01

    A general engine dynamic analysis is described as a standard design-study computational tool for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial-and-error process done with engine hardware development.

  19. Transition Models for Engineering Calculations

    NASA Technical Reports Server (NTRS)

    Fraser, C. J.

    2007-01-01

    While future theoretical and conceptual developments may promote a better understanding of the physical processes involved in the latter stages of boundary layer transition, the designers of rotodynamic machinery and other fluid dynamic devices need effective transition models now. This presentation therefore centers on the development of some transition models which have been developed as design aids to improve the prediction codes used in the performance evaluation of gas turbine blading. All models are based on Narasimha's concentrated breakdown and spot growth.
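    Narasimha's concentrated-breakdown hypothesis leads to the widely used intermittency distribution below, quoted in its standard Dhawan-Narasimha form for background; the record does not give the presentation's specific model constants.

```latex
% Intermittency downstream of transition onset x_t: n is the turbulent-spot
% formation rate, \sigma the spot propagation parameter, U the free-stream
% velocity; \gamma = 0 upstream of x_t.
\gamma(x) = 1 - \exp\!\left[-\frac{n\sigma\,(x - x_t)^{2}}{U}\right],
\qquad x \ge x_t
```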

  20. Android Protection Mechanism: A Signed Code Security Mechanism for Smartphone Applications

    DTIC Science & Technology

    2011-03-01

    status registers, exceptions, endian support, unaligned access support, synchronization primitives, the Jazelle Extension, and saturated integer...supports comprehensive non-blocking shared-memory synchronization primitives that scale for multiple-processor system designs. This is an improvement... synchronization. Memory semaphores can be loaded and altered without interruption because the load and store operations are atomic. Processor

  1. The Role of Scaffolding in CSCL in General and in Specific Environments

    ERIC Educational Resources Information Center

    Verdú, N.; Sanuy, J.

    2014-01-01

    This paper aims to analyse if virtual forums set up in an environment specifically designed to improve collaborative learning can effectively influence students' discourse quality and learning when compared with those forums set up in a general environment. Following a coding schema based upon the set of scaffolds offered in the Knowledge…

  2. BIT BY BIT: A Game Simulating Natural Language Processing in Computers

    ERIC Educational Resources Information Center

    Kato, Taichi; Arakawa, Chuichi

    2008-01-01

    BIT BY BIT is an encryption game that is designed to improve students' understanding of natural language processing in computers. Participants encode clear words into binary code using an encryption key and exchange them in the game. BIT BY BIT enables participants who do not understand the concept of binary numbers to perform the process of…

  3. Design and Evaluation of Energy Efficient Modular Classroom Structures.

    ERIC Educational Resources Information Center

    Brown, G. Z.; And Others

    This paper describes a study that developed innovations that would enable modular builders to improve the energy performance of their classrooms without increasing their first cost. The Modern Building Systems' classroom building conforms to the stringent Oregon and Washington energy codes, and, at $18 per square foot, it is at the low end of the…

  4. Managing World Wide Web Information in a Frames Environment: A Guide to Constructing Web Pages Using Frames.

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    1998-01-01

    Explains how to build World Wide Web home pages using frames-based HTML so that librarians can manage Web-based information and improve their home pages. Provides descriptions and 15 examples for writing frames-HTML code, including advanced concepts and additional techniques for home-page design. (Author/LRW)

  5. Flow field description of the Space Shuttle Vernier reaction control system exhaust plumes

    NASA Technical Reports Server (NTRS)

    Cerimele, Mary P.; Alred, John W.

    1987-01-01

    The flow field for the Vernier Reaction Control System (VRCS) jets of the Space Shuttle Orbiter has been calculated from the nozzle throat to the far-field region. The calculations involved the use of recently improved rocket engine nozzle/plume codes. The flow field is discussed, and a brief overview of the calculation techniques is presented. In addition, a proposed on-orbit plume measurement experiment, designed to improve future estimations of the Vernier flow field, is addressed.

  6. Performance improvement of optical CDMA networks with stochastic artificial bee colony optimization technique

    NASA Astrophysics Data System (ADS)

    Panda, Satyasen

    2018-05-01

    This paper proposes a modified artificial bee colony optimization (ABC) algorithm based on Lévy-flight swarm intelligence, referred to as artificial bee colony Lévy-flight stochastic walk (ABC-LFSW) optimization, for optical code division multiple access (OCDMA) networks. The ABC-LFSW algorithm is used to solve the asset assignment problem based on signal-to-noise ratio (SNR) optimization in OCDMA networks with quality of service constraints. The proposed optimization using the ABC-LFSW algorithm provides methods for minimizing various noises and interferences, regulating the transmitted power and optimizing the network design to improve the power efficiency of the optical code path (OCP) from source node to destination node. In this regard, an optical system model is proposed for improving the network performance with optimized input parameters. A detailed discussion and simulation results based on transmitted power allocation and power efficiency of OCPs are included. The experimental results prove the superiority of the proposed network in terms of power efficiency and spectral efficiency in comparison to networks without any power allocation approach.
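    A Lévy-flight perturbation of the kind grafted onto ABC here can be sketched with Mantegna's algorithm. This is an illustrative sketch under common conventions; the paper's exact operator, step scaling, and parameters are not given in this record.

```python
# Sketch of a Levy-flight step (Mantegna's algorithm) such as ABC-LFSW
# uses to perturb a food-source position; parameter choices are illustrative.
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5):
    """Draw a heavy-tailed Levy step of dimension `dim` with index beta."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

# Perturb a candidate solution (e.g., transmit-power assignments) around the
# current best while allowing occasional long exploratory jumps:
x = np.array([0.4, 0.7, 0.1])
best = np.array([0.5, 0.6, 0.2])
x_new = x + 0.01 * levy_step(x.size) * (x - best)
print(x_new)
```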

  7. Maximization Network Throughput Based on Improved Genetic Algorithm and Network Coding for Optical Multicast Networks

    NASA Astrophysics Data System (ADS)

    Wei, Chengying; Xiong, Cuilian; Liu, Huanlin

    2017-12-01

    A maximal multicast stream algorithm based on network coding (NC) can improve throughput for wavelength-division multiplexing (WDM) networks, but the result remains far below the network's theoretical maximum throughput. Moreover, existing multicast stream algorithms do not provide the information distribution pattern and the routing at the same time. In this paper, an improved genetic algorithm is proposed to maximize optical multicast throughput by NC and to determine the multicast stream distribution by hybrid chromosome construction for multicast with a single source and multiple destinations. The proposed hybrid chromosomes are constructed from binary chromosomes and integer chromosomes: the binary chromosomes represent the optical multicast routing and the integer chromosomes indicate the multicast stream distribution. A fitness function is designed to guarantee that each destination can receive the maximum number of decodable multicast streams. The simulation results show that the proposed method is far superior to typical maximal multicast stream algorithms based on NC in terms of network throughput in WDM networks.
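    The hybrid chromosome described above can be pictured as a paired encoding. The following sketch shows one way to construct and score such a chromosome; the names, sizes, and toy fitness are hypothetical, not the paper's implementation.

```python
# Illustrative hybrid chromosome for NC-based multicast: a binary part encodes
# which candidate routes are used, an integer part encodes how many coded
# streams are pushed on each used route.
import random

N_ROUTES, MAX_STREAMS = 8, 4

def random_chromosome():
    binary = [random.randint(0, 1) for _ in range(N_ROUTES)]             # routing genes
    integer = [random.randint(1, MAX_STREAMS) for _ in range(N_ROUTES)]  # stream genes
    return binary, integer

def fitness(chrom, decodable_at_destination):
    """Toy fitness: total decodable streams summed over active routes."""
    binary, integer = chrom
    return sum(b * min(s, decodable_at_destination)
               for b, s in zip(binary, integer))

pop = [random_chromosome() for _ in range(20)]
best = max(pop, key=lambda c: fitness(c, decodable_at_destination=3))
print(best, fitness(best, 3))
```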

  8. Data Sciences Summer Institute Topology Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watts, Seth

    DSSI_TOPOPT is a 2D topology optimization code that designs stiff structures made of a single linear elastic material and void space. The code generates a finite element mesh of a rectangular design domain on which the user specifies displacement and load boundary conditions. The code iteratively designs a structure that minimizes the compliance (maximizes the stiffness) of the structure under the given loading, subject to an upper bound on the amount of material used. Depending on user options, the code can evaluate the performance of a user-designed structure, or create a design from scratch. Output includes the finite element mesh, design, and visualizations of the design.
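    The iterative design loop described above typically alternates a finite-element solve with a density update. An optimality-criteria update of the classic SIMP flavor looks like the following; this is a generic sketch with dummy sensitivities, not the DSSI_TOPOPT source.

```python
# Generic optimality-criteria density update used in SIMP-style compliance
# minimization; `dc` would come from an FE solve (dummy values used here).
import numpy as np

def oc_update(x, dc, volfrac, move=0.2):
    """Scale densities by sqrt(-dc/lam), bisecting lam to meet the volume bound."""
    lo, hi = 0.0, 1e9
    while hi - lo > 1e-6:
        lam = 0.5 * (lo + hi)
        x_new = np.clip(x * np.sqrt(-dc / lam),
                        np.maximum(x - move, 1e-3),   # lower move limit
                        np.minimum(x + move, 1.0))    # upper move limit
        if x_new.mean() > volfrac:
            lo = lam   # too much material: increase lam to shrink densities
        else:
            hi = lam
    return x_new

x = np.full(100, 0.5)                  # uniform initial densities
dc = -np.linspace(2.0, 0.1, 100)       # dummy compliance sensitivities (<= 0)
x = oc_update(x, dc, volfrac=0.5)
print(x.mean())                        # ~0.5: volume constraint is held
```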

  9. STRING: A new drifter for HF radar validation.

    NASA Astrophysics Data System (ADS)

    Rammou, Anna-Maria; Zervakis, Vassilis; Bellomo, Lucio; Kokkini, Zoi; Quentin, Celine; Mantovani, Carlo; Kalampokis, Alkiviadis

    2015-04-01

    High-Frequency radars (HFR) are an effective means of remotely monitoring sea-surface currents, based on recording the Doppler shift of radio waves backscattered from the sea surface. Validation of HFR measurements takes place via comparisons either with in-situ Eulerian velocity data (usually obtained by surface current meters attached to moorings) or with Lagrangian velocity fields (recorded by surface drifters). The most common surface drifter used for this purpose is the CODE-type drifter (Davis, 1985), an industry-standard design that records the vertically averaged velocity of the upper 1 m layer of the water column. In this work we claim that the observed differences between HFR-derived velocities and Lagrangian measurements can be attributed not just to the different spatial scales recorded by these instruments, but also to the fact that the HFR-derived velocity corresponds to an exponentially weighted vertical average of the velocity field from the surface to 1 m depth (Stewart and Joy, 1974), whereas the velocity estimated by CODE drifters corresponds to a boxcar-type weighted vertical average, owing to the rectangular shape of the CODE drifters' sails. After analyzing the theoretical behavior of a drifter under the influence of wind and current, we propose a new design of exponentially shaped sails for the drogues of CODE-based drifters, so that HFR-derived and drifter-based velocities become directly comparable in the way they vertically average the velocity field. The new drifter, codenamed STRING, exhibits behavior identical to the classical CODE design under relatively homogeneous conditions in the upper 1 m layer; however, it is expected to follow a significantly different track in conditions of high vertical shear and stratification. We therefore suggest that the new design is the instrument of choice for validation of HFR installations, as it can be used in all conditions and behaves identically to CODEs when vertical shear is insignificant. Finally, we present results from three experiments using both drifter types in HFR-covered regions of the Eastern Mediterranean. More experiments are planned, incorporating design improvements dictated by the results of the preliminary field tests. This work was carried out in the framework of the project "Specifically Targeted for Radars INnovative Gauge (STRING)", funded by the Greek-French collaboration programme "Plato".
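    The contrast drawn in this record can be made explicit. As commonly written (signs and normalization may vary; z is negative downward and k is the radio wavenumber), the HFR-derived velocity is an exponentially weighted vertical average, while a rectangular-sail CODE drifter samples an approximately uniform average over its drogue depth h of about 1 m:

```latex
u_{\mathrm{HFR}} = 2k \int_{-\infty}^{0} u(z)\, e^{2kz}\, dz,
\qquad
u_{\mathrm{CODE}} \approx \frac{1}{h} \int_{-h}^{0} u(z)\, dz
```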

  10. Characterizing the Properties of a Woven SiC/SiC Composite Using W-CEMCAN Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    1999-01-01

    A micromechanics-based computer code to predict the thermal and mechanical properties of woven ceramic matrix composites (CMC) is developed. This computer code, W-CEMCAN (Woven CEramic Matrix Composites ANalyzer), predicts the properties of two-dimensional woven CMC at any temperature and takes into account various constituent geometries and volume fractions. This computer code is used to predict the thermal and mechanical properties of an advanced CMC composed of 0/90 five-harness (5 HS) Sylramic fiber which had been chemically vapor infiltrated (CVI) with boron nitride (BN) and SiC interphase coatings and melt-infiltrated (MI) with SiC. The predictions, based on the bulk constituent properties from the literature, are compared with measured experimental data. Based on the comparison, improved or calibrated properties for the constituent materials are then developed for use by material developers/designers. The computer code is then used to predict the properties of a composite with the same constituents but with different fiber volume fractions. The predictions are compared with measured data and a good agreement is achieved.
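    Micromechanics codes of this kind build ply properties up from constituent properties and volume fractions. At the simplest level the longitudinal and transverse moduli follow the rule of mixtures, given here as a textbook orientation rather than W-CEMCAN's full woven-composite model (V_f and V_m are the fiber and matrix volume fractions, E_f and E_m their moduli):

```latex
E_{11} = V_f E_f + V_m E_m,
\qquad
\frac{1}{E_{22}} = \frac{V_f}{E_f} + \frac{V_m}{E_m}
```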

  11. Mandating better buildings: a global review of building codes and prospects for improvement in the United States

    DOE PAGES

    Sun, Xiaojing; Brown, Marilyn A.; Cox, Matt; ...

    2015-03-11

    This paper provides a global overview of the design, implementation, and evolution of building energy codes. Reflecting alternative policy goals, building energy codes differ significantly across the United States, the European Union, and China. This review uncovers numerous innovative practices including greenhouse gas emissions caps per square meter of building space, energy performance certificates with retrofit recommendations, and inclusion of renewable energy to achieve “nearly zero-energy buildings”. These innovations motivated an assessment of an aggressive commercial building code applied to all US states, requiring both new construction and buildings with major modifications to comply with the latest version of the ASHRAE 90.1 Standards. Using the National Energy Modeling System (NEMS), we estimate that by 2035, such building codes in the United States could reduce energy for space heating, cooling, water heating and lighting in commercial buildings by 16%, 15%, 20% and 5%, respectively. Impacts on different fuels and building types, energy rates and bills as well as pollution emission reductions are also examined.

  12. Physical and numerical sources of computational inefficiency in integration of chemical kinetic rate equations: Etiology, treatment and prognosis

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.; Radhakrishnan, K.

    1986-01-01

    The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODEs) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too-frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated step-size control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
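    The case for exponential-fitted integrators can be seen on the linearized model equation that each species approximately obeys during equilibration: an exponential-fitted step reproduces the exact decay for any step size h, where a polynomial-interpolant method only approximates it. This is the standard illustration, not the report's specific algorithm.

```latex
\frac{dy}{dt} = -\lambda \bigl(y - y_{\mathrm{eq}}\bigr)
\quad\Longrightarrow\quad
y_{n+1} = y_{\mathrm{eq}} + \bigl(y_{n} - y_{\mathrm{eq}}\bigr)\, e^{-\lambda h}
```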

  13. Extension of the XGC code for global gyrokinetic simulations in stellarator geometry

    NASA Astrophysics Data System (ADS)

    Cole, Michael; Moritaka, Toseo; White, Roscoe; Hager, Robert; Ku, Seung-Hoe; Chang, Choong-Seock

    2017-10-01

    In this work, the total-f, gyrokinetic particle-in-cell code XGC is extended to treat stellarator geometries. Improvements to meshing tools and the code itself have enabled the first physics studies, including single particle tracing and flux surface mapping in the magnetic geometry of the heliotron LHD and quasi-isodynamic stellarator Wendelstein 7-X. These have provided the first successful test cases for our approach. XGC is uniquely placed to model the complex edge physics of stellarators. A roadmap to such a global confinement modeling capability will be presented. Single particle studies will include the physics of energetic particles' global stochastic motions and their effect on confinement. Good confinement of energetic particles is vital for a successful stellarator reactor design. These results can be compared in the core region with those of other codes, such as ORBIT3d. In subsequent work, neoclassical transport and turbulence can then be considered and compared to results from codes such as EUTERPE and GENE. After sufficient verification in the core region, XGC will move into the stellarator edge region including the material wall and neutral particle recycling.

  14. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (T_p), scatter (T_s), and total (T_t) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. T_p, T_s, T_t, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general-purpose grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
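    The quantities validated here are linked by a bookkeeping identity implied by their definitions (not a result specific to this code system): with SPR the scatter-to-primary ratio incident on the grid, the total transmission is the SPR-weighted mix of the primary and scatter transmissions.

```latex
T_t = \frac{T_p + \mathrm{SPR}\cdot T_s}{1 + \mathrm{SPR}}
```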

  15. An Empirical Evaluation of the US Beer Institute’s Self-Regulation Code Governing the Content of Beer Advertising

    PubMed Central

    Babor, Thomas F; Xuan, Ziming; Damon, Donna; Noel, Jonathan

    2013-01-01

    Objectives. We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. Methods. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Results. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. Conclusions. The alcohol industry’s current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations. PMID:23947318

  16. An empirical evaluation of the US Beer Institute's self-regulation code governing the content of beer advertising.

    PubMed

    Babor, Thomas F; Xuan, Ziming; Damon, Donna; Noel, Jonathan

    2013-10-01

    We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. The alcohol industry's current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations.

  17. Reliability enhancement of Navier-Stokes codes through convergence acceleration

    NASA Technical Reports Server (NTRS)

    Merkle, Charles L.; Dulikravich, George S.

    1995-01-01

    Methods for enhancing the reliability of Navier-Stokes computer codes through improving convergence characteristics are presented. Improving these characteristics decreases the likelihood of code unreliability and of user interventions in a design environment. The problem referred to as 'stiffness' in the governing equations for propulsion-related flowfields is investigated, particularly in regard to common sources of equation stiffness that lead to convergence degradation of CFD algorithms. Von Neumann stability theory is employed as a tool to study the convergence difficulties involved. Based on the stability results, improved algorithms are devised to ensure efficient convergence in different situations. A number of test cases are considered to confirm a correlation between stability theory and numerical convergence. Examples of turbulent and reacting flow are presented, and a generalized form of the preconditioning matrix is derived to handle these problems, i.e., problems involving additional differential equations describing the transport of turbulent kinetic energy, dissipation rate and chemical species. Algorithms for unsteady computations are considered. The extension of the preconditioning techniques and algorithms derived for Navier-Stokes computations to three-dimensional flow problems is discussed. New methods to accelerate the convergence of iterative schemes for the numerical integration of systems of partial differential equations are developed, with special emphasis on the acceleration of convergence on highly clustered grids.

  18. Research on pre-processing of QR Code

    NASA Astrophysics Data System (ADS)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR code (Quick Response code) encodes many kinds of information thanks to its advantages: large storage capacity, high reliability, omnidirectional high-speed reading, small printing size, and highly efficient representation of Chinese characters. In order to obtain a clear binarized image from a complex background and to improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR codes and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive binarization method for text recognition. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
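
    The paper's modified binarization algorithm is not given in the abstract; as a reference point, the standard Sauvola threshold it builds on can be sketched with NumPy/SciPy (the window size, k, and R below are common defaults, assumed here for illustration):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def sauvola_binarize(gray, window=25, k=0.2, R=128.0):
            """Sauvola's locally adaptive threshold:
            T = m * (1 + k * (s / R - 1)), with m and s the local mean and
            standard deviation over a window x window neighborhood and R
            the assumed dynamic range of s (128 for 8-bit images)."""
            img = gray.astype(np.float64)
            mean = uniform_filter(img, window)            # local mean m
            mean_sq = uniform_filter(img * img, window)   # local E[x^2]
            std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
            threshold = mean * (1.0 + k * (std / R - 1.0))
            return (img > threshold).astype(np.uint8) * 255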

  19. When Content Matters: The Role of Processing Code in Tactile Display Design.

    PubMed

    Ferris, Thomas K; Sarter, Nadine

    2010-01-01

    The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. Recently, the tactile channel and, more specifically, communication via the use of tactile/haptic icons have received considerable interest. Past research has examined primarily the impact of concurrent task modality on the effectiveness of tactile information presentation. However, it is not well known to what extent the interpretation of iconic tactile patterns is affected by another attribute of information: the information processing codes of concurrent tasks. In two driving simulation studies (n = 25 for each), participants decoded icons composed of either spatial or nonspatial patterns of vibrations (engaging spatial and nonspatial processing code resources, respectively) while concurrently interpreting spatial or nonspatial visual task stimuli. As predicted by Multiple Resource Theory, performance was significantly worse (approximately 5-10 percent worse) when the tactile icons and visual tasks engaged the same processing code, with the overall worst performance in the spatial-spatial task pairing. The findings from these studies contribute to an improved understanding of information processing and can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that competition for processing code resources warrants consideration, alongside other factors such as the naturalness of signal-message mapping, when designing iconic tactile displays. Nonspatially encoded tactile icons may be preferable in environments which already rely heavily on spatial processing, such as car cockpits.

  20. [Critical analysis of French DRG based information system (PMSI) databases for the epidemiology of cancer: a longitudinal approach becomes possible].

    PubMed

    Olive, F; Gomez, F; Schott, A-M; Remontet, L; Bossard, N; Mitton, N; Polazzi, S; Colonna, M; Trombert-Paviot, B

    2011-02-01

    Use of French Diagnosis Related Groups (DRG) program databases for purposes other than financing has recently become easier, since a unique anonymous patient identification number has been created for each inpatient in the administrative case-mix database. Based on the work of the group for cancer epidemiological observation in the Rhône-Alpes area (ONC-EPI group), we review the remaining difficulties in the use of DRG data for epidemiological purposes and consider a longitudinal approach based on analysis of the database over several years. We also discuss the limitations of this approach. The main problems are related to the quality of administrative data, especially the coding of diagnoses. These errors come from codes that are missing, inappropriate, or not in accordance with prioritization rules (causing over- or under-reporting or inconsistencies in coding over time). One difficulty, partly due to the hierarchy of coding and the type of cancer, is the choice of an extraction algorithm. In two studies designed to estimate the incidence of cancers treated in hospitals (breast, colon-rectum, kidney, ovaries), a first algorithm, which selected a cancer code as principal diagnosis together with a selection of surgical procedures, performed less well than a second one, which selected a cancer code as principal diagnosis only and for which the ratio of hospitalizations per patient was stable across time and space. Chaining records over several years makes it possible, by tracing the patient's trajectory, to detect and correct inaccuracies, errors, and missing values and, for incidence studies, to correct incident cases by removing prevalent cases. However, linkage, complete only since 2007, does not correct data in all cases. Future improvement will certainly come from better algorithms for case identification and especially from linking DRG data with other databases. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  1. Dynamic Forms. Part 1: Functions

    NASA Technical Reports Server (NTRS)

    Meyer, George; Smith, G. Allan

    1993-01-01

    The formalism of dynamic forms is developed as a means for organizing and systematizing the design of control systems. The formalism allows the designer to easily compute derivatives to various orders of large composite functions that occur in flight-control design. Such functions involve many function-of-a-function calls that may be nested to many levels. The component functions may be multiaxis and nonlinear, and they may include rotation transformations. A dynamic form is defined as a variable together with its time derivatives up to some fixed but arbitrary order. The variable may be a scalar, a vector, a matrix, a direction cosine matrix, Euler angles, or Euler parameters. Algorithms for standard elementary functions and operations of scalar dynamic forms are developed first. Then vector and matrix operations and transformations between parameterizations of rotations are developed in the next level of the hierarchy. Commonly occurring algorithms in control-system design, including inversion of pure feedback systems, are developed in the third level. A large-angle, three-axis attitude servo and other examples are included to illustrate the effectiveness of the developed formalism. All algorithms were implemented in FORTRAN code. Practical experience shows that the proposed formalism may significantly improve the productivity of the design and coding process.
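
    The report's implementation is in FORTRAN; purely to illustrate the core idea - a scalar dynamic form carries a value together with its time derivatives, and products propagate through the general Leibniz rule - a minimal Python sketch follows (the class and its methods are invented for this example):

        import math

        class DynamicForm:
            """A scalar dynamic form: d[k] holds the k-th time derivative."""

            def __init__(self, derivs):
                self.d = list(derivs)            # [f, f', f'', ...]

            def __add__(self, other):
                return DynamicForm(a + b for a, b in zip(self.d, other.d))

            def __mul__(self, other):
                # Leibniz rule: (fg)^(k) = sum_j C(k, j) f^(j) g^(k-j)
                n = min(len(self.d), len(other.d))
                return DynamicForm(
                    sum(math.comb(k, j) * self.d[j] * other.d[k - j]
                        for j in range(k + 1))
                    for k in range(n))

        # x(t) = t^2 and y(t) = t, both evaluated at t = 1:
        x = DynamicForm([1.0, 2.0, 2.0])
        y = DynamicForm([1.0, 1.0, 0.0])
        print((x * y).d)   # [1.0, 3.0, 6.0]: t^3 and its derivatives at t = 1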

  2. Performance optimization for rotors in hover and axial flight

    NASA Technical Reports Server (NTRS)

    Quackenbush, T. R.; Wachspress, D. A.; Kaufman, A. E.; Bliss, D. B.

    1989-01-01

    Performance optimization for rotors in hover and axial flight is a topic of continuing importance to rotorcraft designers. The aim of this Phase 1 effort has been to demonstrate that a linear optimization algorithm could be coupled to an existing influence coefficient hover performance code. This code, dubbed EHPIC (Evaluation of Hover Performance using Influence Coefficients), uses a quasi-linear wake relaxation to solve for the rotor performance. The coupling was accomplished by expanding the matrix of linearized influence coefficients in EHPIC to accommodate design variables and deriving new coefficients for linearized equations governing perturbations in power and thrust. These coefficients formed the input to a linear optimization analysis, which used the flow tangency conditions on the blade and in the wake to impose equality constraints on the expanded system of equations; user-specified inequality constraints were also employed to bound the changes in the design. It was found that this locally linearized analysis could be invoked to predict a design change that would produce a reduction in the power required by the rotor at constant thrust. Thus, an efficient search for improved versions of the baseline design can be carried out while retaining the accuracy inherent in a free wake/lifting surface performance analysis.
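
    A toy sketch of such a locally linearized design step, posed as a linear program with scipy.optimize.linprog (the gradient row, thrust-constraint row, and move limits are illustrative stand-ins, not EHPIC's coefficients):

        import numpy as np
        from scipy.optimize import linprog

        # Minimize the first-order power change dP = g . dx subject to a
        # linearized constant-thrust constraint a_T . dx = 0 and
        # user-specified bounds on each design-variable change.
        g = np.array([0.8, -0.3, 0.5])       # dP/dx_i (illustrative)
        a_T = np.array([[1.0, 2.0, -1.0]])   # dT/dx_i (illustrative)
        res = linprog(c=g, A_eq=a_T, b_eq=[0.0],
                      bounds=[(-0.05, 0.05)] * 3)
        print(res.x, res.fun)                # predicted design change, dP < 0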

  3. Study of structural reliability of existing concrete structures

    NASA Astrophysics Data System (ADS)

    Druķis, P.; Gaile, L.; Valtere, K.; Pakrastiņš, L.; Goremikins, V.

    2017-10-01

    Structural reliability of buildings has become an important issue since the collapse of a shopping center in Riga on 21.11.2013 caused the death of 54 people. Building reliability is the practice of designing, constructing, operating, maintaining, and removing buildings in ways that safeguard occupants' health and ward off injuries or deaths arising from use of the building. Evaluation and improvement of existing buildings is becoming more and more important. For a large part of existing buildings, the design life has been reached or will be reached in the near future. The structures of these buildings need to be reassessed in order to find out whether the safety requirements are met. The safety requirements provided by the Eurocodes are a starting point for the assessment of safety. However, it would be uneconomical to require all existing buildings and structures to comply fully with these new codes and corresponding safety levels; therefore, the assessment of existing buildings differs with each design situation. This case study describes a simple and practical procedure for determining the minimal reliability index β of existing concrete structures designed to codes other than the Eurocodes, and allows the actual reliability level of different structural elements of existing buildings under design load to be reassessed.
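
    The case-study procedure is more involved, but the first-order (Cornell) definition of the reliability index β that underlies such assessments is standard and easy to state for independent normal resistance R and load effect S (the numbers below are illustrative only):

        from math import sqrt
        from scipy.stats import norm

        def cornell_beta(mu_R, sigma_R, mu_S, sigma_S):
            """beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2);
            the failure probability is P(R - S < 0) = Phi(-beta)."""
            beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
            return beta, norm.cdf(-beta)

        beta, p_f = cornell_beta(mu_R=350.0, sigma_R=35.0,
                                 mu_S=200.0, sigma_S=40.0)
        print(f"beta = {beta:.2f}, P_f = {p_f:.2e}")   # beta ~ 2.82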

  4. Xyce Parallel Electronic Simulator : users' guide, version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont

    2004-06-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; (4) a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and (5) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models, look-up tables, and mesh-level PDE device models. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important features of Xyce is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Ultimately, these capabilities are migrated to end users.

  5. Evaluation of the DRAGON code for VHTR design analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code's performance when used in the analysis of Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) A preliminary assessment of the nuclear data library currently used with the code and of libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) A DRAGON workshop held to discuss the code's capabilities for modeling the VHTR.

  6. Building a Better Campus: An Update on Building Codes.

    ERIC Educational Resources Information Center

    Madden, Michael J.

    2002-01-01

    Discusses the implications for higher education institutions in terms of facility planning, design, construction, and renovation of the move from regionally-developed model-building codes to two international sets of codes. Also addresses the new performance-based design option within the codes. (EV)

  7. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    PubMed

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool for coding radiation oncology-specific diagnoses, RadOnc ICD Search, version 1.0, was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, whereby primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem of ICD-9 coding accuracy by physicians and offers an approach to effectively address this shortcoming. Copyright © 2015. Published by Elsevier Inc.

  8. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  9. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  10. A Lossless Multichannel Bio-Signal Compression Based on Low-Complexity Joint Coding Scheme for Portable Medical Devices

    PubMed Central

    Kim, Dong-Sun; Kwon, Jin-San

    2014-01-01

    Research on real-time health systems has received great attention during recent years, and the need for high-quality personal multichannel medical signal compression for personal medical product applications is increasing. The international MPEG-4 audio lossless coding (ALS) standard supports a joint channel-coding scheme for improving the compression performance of multichannel signals, and it is a very efficient compression method for multichannel biosignals. However, the computational complexity of such a multichannel coding scheme is significantly greater than that of other lossless audio encoders. In this paper, we present a multichannel hardware encoder based on a low-complexity joint-coding technique and a shared multiplier scheme for portable devices. A joint-coding decision method and a reference channel selection scheme are modified for a low-complexity joint coder. The proposed joint-coding decision method determines the optimized joint-coding operation based on the relationship between the cross correlation of residual signals and the compression ratio. The reference channel selection is designed to select a channel for the entropy coding of the joint coding. The hardware encoder operates at a 40 MHz clock frequency and supports two-channel parallel encoding for the multichannel monitoring system. Experimental results show that the compression ratio increases by 0.06%, whereas the computational complexity decreases by 20.72% compared to the MPEG-4 ALS reference software encoder. In addition, the compression ratio increases by about 11.92% compared to the single-channel-based bio-signal lossless data compressor. PMID:25237900
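
    The decision rule relates the cross-correlation of channel residuals to the achievable compression gain; a toy Python sketch of that idea (the threshold value and the difference-signal joint mode are illustrative assumptions, not the hardware algorithm):

        import numpy as np

        def choose_joint_coding(res_a, res_b, corr_threshold=0.5):
            """Pick joint or independent coding of two channels' prediction
            residuals, using their cross-correlation as a cheap proxy for
            the compression gain of joint coding."""
            rho = np.corrcoef(res_a, res_b)[0, 1]
            if abs(rho) >= corr_threshold:
                # Joint mode: code the reference channel plus the (smaller)
                # inter-channel difference signal.
                return "joint", res_a, res_a - res_b
            return "independent", res_a, res_b

        a = np.array([3.0, -1.0, 2.0, 0.5])
        print(choose_joint_coding(a, a + 0.1)[0])   # "joint"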

  11. Three-dimensional integral imaging displays using a quick-response encoded elemental image array: an overview

    NASA Astrophysics Data System (ADS)

    Markman, A.; Javidi, B.

    2016-06-01

    Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning due to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of a scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The information scanned is decompressed and decrypted and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.
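
    The double-random-phase encryption step is classical and can be sketched directly with NumPy FFTs; this toy version operates on a random stand-in for an elemental image and omits the compression and QR-encoding stages described above:

        import numpy as np

        def drpe_encrypt(img, phase1, phase2):
            """DRPE: apply a random phase mask in the spatial domain and a
            second random phase mask in the Fourier domain."""
            field = img * np.exp(2j * np.pi * phase1)
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * phase2))

        def drpe_decrypt(cipher, phase1, phase2):
            """Invert both phase masks to recover the real-valued image."""
            field = np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * phase2))
            return np.abs(field * np.exp(-2j * np.pi * phase1))

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))              # stand-in elemental image
        p1, p2 = rng.random((64, 64)), rng.random((64, 64))
        print(np.allclose(drpe_decrypt(drpe_encrypt(img, p1, p2), p1, p2), img))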

  12. Determinant Computation on the GPU using the Condensation Method

    NASA Astrophysics Data System (ADS)

    Anisul Haque, Sardar; Moreno Maza, Marc

    2012-02-01

    We report on a GPU implementation of the condensation method designed by Abdelmalek Salem and Kouachi Said for computing the determinant of a matrix. We consider two types of coefficients: modular integers and floating point numbers. We evaluate the performance of our code by measuring its effective bandwidth and argue that it is numerically stable in the floating point case. In addition, we compare our code with serial implementations of determinant computation from well-known mathematical packages. Our results suggest that a GPU implementation of the condensation method has a large potential for improving those packages in terms of running time and numerical stability.
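
    For readers unfamiliar with condensation, a CPU reference sketch of a Chio-style condensation step (closely related to the Salem-Kouachi formulation; the GPU kernel organization is a separate concern) shows the basic recurrence:

        import numpy as np

        def det_by_condensation(A):
            """Shrink an n x n matrix to (n-1) x (n-1) repeatedly using
            B[i, j] = a11 * A[i+1, j+1] - A[i+1, 0] * A[0, j+1], with
            det(A) = det(B) / a11**(n - 2); row pivoting keeps a11 != 0."""
            A = np.array(A, dtype=float)
            sign, scale = 1.0, 1.0
            while A.shape[0] > 1:
                n = A.shape[0]
                p = int(np.argmax(np.abs(A[:, 0])))   # pivot for stability
                if A[p, 0] == 0.0:
                    return 0.0                        # zero column: singular
                if p != 0:
                    A[[0, p]] = A[[p, 0]]
                    sign = -sign                      # row swap flips sign
                a11 = A[0, 0]
                B = a11 * A[1:, 1:] - np.outer(A[1:, 0], A[0, 1:])
                scale *= a11 ** (n - 2)
                A = B
            return sign * A[0, 0] / scale

        M = [[2.0, 1.0, 3.0], [0.0, 4.0, 1.0], [5.0, 2.0, 0.0]]
        print(det_by_condensation(M), np.linalg.det(M))   # both -59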

  13. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  14. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  15. Image gathering and coding for digital restoration: Information efficiency and visual quality

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; John, Sarah; Mccormick, Judith A.; Narayanswamy, Ramkumar

    1989-01-01

    Image gathering and coding are commonly treated as tasks separate from each other and from the digital processing used to restore and enhance the images. The goal is to develop a method that allows us to assess quantitatively the combined performance of image gathering and coding for the digital restoration of images with high visual quality. Digital restoration is often interactive because visual quality depends on perceptual rather than mathematical considerations, and these considerations vary with the target, the application, and the observer. The approach is based on the theoretical treatment of image gathering as a communication channel (J. Opt. Soc. Am. A 2, 1644 (1985); 5, 285 (1988)). Initial results suggest that the practical upper limit of the information contained in the acquired image data ranges typically from approximately 2 to 4 binary information units (bifs) per sample, depending on the design of the image-gathering system. The associated information efficiency of the transmitted data (i.e., the ratio of information over data) ranges typically from approximately 0.3 to 0.5 bif per bit without coding to approximately 0.5 to 0.9 bif per bit with lossless predictive compression and Huffman coding. The visual quality that can be attained with interactive image restoration improves perceptibly as the available information increases to approximately 3 bifs per sample. However, the perceptual improvements that can be attained with further increases in information are very subtle and depend on the target and the desired enhancement.
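
    The paper's information measure is channel-theoretic, but the "ratio of information over data" figure of merit can be illustrated with a crude first-order proxy: the empirical entropy of losslessly predictive-coded samples divided by the bits spent per sample (a toy estimate, not the paper's computation):

        import numpy as np

        def info_efficiency(symbols, bits_per_sample):
            """First-order entropy (bifs/sample) over data rate (bits/sample)."""
            _, counts = np.unique(symbols, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p)) / bits_per_sample

        # Previous-sample prediction concentrates the residual distribution,
        # raising the efficiency of the transmitted data.
        rng = np.random.default_rng(1)
        signal = np.cumsum(rng.integers(-2, 3, 10_000))
        print(info_efficiency(np.diff(signal), bits_per_sample=8))  # ~0.29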

  16. Trellis coding techniques for mobile communications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Simon, M. K.; Jedrey, T.

    1988-01-01

    A criterion for designing optimum trellis codes to be used over fading channels is given. A technique is shown for reducing certain multiple trellis codes, optimally designed for the fading channel, to conventional (i.e., multiplicity one) trellis codes. The computational cutoff rate R0 is evaluated for MPSK transmitted over fading channels. Examples of trellis codes optimally designed for the Rayleigh fading channel are given and compared with respect to R0. Two types of modulation/demodulation techniques are considered, namely coherent (using pilot tone-aided carrier recovery) and differentially coherent with Doppler frequency correction. Simulation results are given for end-to-end performance of two trellis-coded systems.

  17. Xyce parallel electronic simulator : users' guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.

    2011-05-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers; (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a unique electrical simulation capability, designed to meet the unique needs of the laboratory.

  18. Information theoretical assessment of digital imaging systems

    NASA Technical Reports Server (NTRS)

    John, Sarah; Rahman, Zia-Ur; Huck, Friedrich O.; Reichenbach, Stephen E.

    1990-01-01

    The end-to-end performance of image gathering, coding, and restoration as a whole is considered. This approach is based on the pivotal relationship that exists between the spectral information density of the transmitted signal and the restorability of images from this signal. The information-theoretical assessment accounts for (1) the information density and efficiency of the acquired signal as a function of the image-gathering system design and the radiance-field statistics, and (2) the improvement in information efficiency and data compression that can be gained by combining image gathering with coding to reduce the signal redundancy and irrelevancy. It is concluded that images can be restored with better quality and from fewer data as the information efficiency of the data is increased. The restoration correctly explains the image gathering and coding processes and effectively suppresses the image-display degradations.

  19. Information theoretical assessment of digital imaging systems

    NASA Astrophysics Data System (ADS)

    John, Sarah; Rahman, Zia-Ur; Huck, Friedrich O.; Reichenbach, Stephen E.

    1990-10-01

    The end-to-end performance of image gathering, coding, and restoration as a whole is considered. This approach is based on the pivotal relationship that exists between the spectral information density of the transmitted signal and the restorability of images from this signal. The information-theoretical assessment accounts for (1) the information density and efficiency of the acquired signal as a function of the image-gathering system design and the radiance-field statistics, and (2) the improvement in information efficiency and data compression that can be gained by combining image gathering with coding to reduce the signal redundancy and irrelevancy. It is concluded that images can be restored with better quality and from fewer data as the information efficiency of the data is increased. The restoration correctly explains the image gathering and coding processes and effectively suppresses the image-display degradations.

  20. GLOBECOM '88 - IEEE Global Telecommunications Conference and Exhibition, Hollywood, FL, Nov. 28-Dec. 1, 1988, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Various papers on communications for the information age are presented. Among the general topics considered are: telematic services and terminals, satellite communications, telecommunications management network, control of integrated broadband networks, advances in digital radio systems, the intelligent network, broadband networks and services deployment, future switch architectures, performance analysis of computer networks, advances in spread spectrum, optical high-speed LANs, and broadband switching and networks. Also addressed are: multiple access protocols, video coding techniques, modulation and coding, photonic switching, SONET terminals and applications, standards for video coding, digital switching, progress in MANs, mobile and portable radio, software design for improved maintainability, multipath propagation and advanced countermeasures, data communication, network control and management, fiber in the loop, network algorithms and protocols, and advances in computer communications.

  1. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
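
    For reference, the rank-two BFGS inverse-Hessian update that such a parallel-vector implementation must evaluate each iteration is the standard textbook formula (this sketch is not the PV-SAP code):

        import numpy as np

        def bfgs_update(H, s, y):
            """Update the inverse Hessian approximation H from the step
            s = x_new - x_old and gradient change y = g_new - g_old:
            H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T."""
            rho = 1.0 / (y @ s)
            I = np.eye(len(s))
            return ((I - rho * np.outer(s, y)) @ H
                    @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s))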

  2. Radiative Transfer Modeling in Proto-planetary Disks

    NASA Astrophysics Data System (ADS)

    Kasper, David; Jang-Condell, Hannah; Kloster, Dylan

    2016-01-01

    Young Stellar Objects (YSOs) are rich astronomical research environments. Planets form in circumstellar disks of gas and dust around YSOs. With ever increasing capabilities of the observational instruments designed to look at these proto-planetary disks, most notably GPI, SPHERE, and ALMA, more accurate interfaces must be made to connect modeling of the disks with observation. PaRTY (Parallel Radiative Transfer in YSOs) is a code developed previously to model the observable density and temperature structure of such a disk by self-consistently calculating the structure of the disk based on radiative transfer physics. We present upgrades we are implementing to the PaRTY code to improve its accuracy and flexibility. These upgrades include: creating a two-sided disk model, implementing a spherical coordinate system, and implementing wavelength-dependent opacities. These upgrades will address problems in the PaRTY code of infinite optical thickness, calculation under/over-resolution, and wavelength-independent photon penetration depths, respectively. The upgraded code will be used to better model disk perturbations resulting from planet formation.

  3. Analytical and Experimental Evaluation of the Heat Transfer Distribution over the Surfaces of Turbine Vanes

    NASA Technical Reports Server (NTRS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-01-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.

  4. Analytical and experimental evaluation of the heat transfer distribution over the surfaces of turbine vanes

    NASA Astrophysics Data System (ADS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-05-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.

  5. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit

    1995-01-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  6. RELAP-7 Development Updates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Zhao, Haihua; Gleicher, Frederick Nathan

    RELAP-7 is a nuclear systems safety analysis code being developed at the Idaho National Laboratory, and is the next generation tool in the RELAP reactor safety/systems analysis application series. RELAP-7 development began in 2011 to support the Risk Informed Safety Margins Characterization (RISMC) Pathway of the Light Water Reactor Sustainability (LWRS) program. The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical methods, and physical models in order to provide capabilities needed for the RISMC methodology and to support nuclear power safety analysis. The code is being developed based on Idaho National Laboratory's modern scientific software development framework - MOOSE (the Multi-Physics Object-Oriented Simulation Environment). The initial development goal of the RELAP-7 approach focused primarily on the development of an implicit algorithm capable of strong (nonlinear) coupling of the dependent hydrodynamic variables contained in the 1-D/2-D flow models with the various 0-D system reactor components that compose various boiling water reactor (BWR) and pressurized water reactor (PWR) nuclear power plants (NPPs). During Fiscal Year (FY) 2015, the RELAP-7 code was further improved with expanded capability to support BWR and PWR NPP analysis. An accumulator model has been developed. The code has also been coupled with other MOOSE-based applications, such as the neutronics code RattleSnake and the fuel performance code BISON, to perform multiphysics analysis. A major design requirement for the implicit algorithm in RELAP-7 is that it be capable of second-order discretization accuracy in both space and time, which eliminates the traditional first-order approximation errors. Second-order temporal accuracy is achieved by a second-order backward temporal difference, and the one-dimensional second-order accurate spatial discretization is achieved with the Galerkin approximation of Lagrange finite elements. During FY-2015, numerical verification work confirmed that the RELAP-7 code indeed achieves second-order accuracy in both time and space for single-phase models at the system level.

  7. Centrifugal and Axial Pump Design and Off-Design Performance Prediction

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    1995-01-01

    A meanline pump-flow modeling method has been developed to provide a fast capability for modeling pumps of cryogenic rocket engines. Based on this method, a meanline pump-flow code PUMPA was written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The design-point rotor efficiency and slip factors are obtained from empirical correlations to rotor-specific speed and geometry. The pump code can model axial, inducer, mixed-flow, and centrifugal pumps and can model multistage pumps in series. The rapid input setup and computer run time for this meanline pump flow code make it an effective analysis and conceptual design tool. The map-generation capabilities of the code provide the information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of PUMPA permit the user to do parametric design space exploration of candidate pump configurations and to provide head-flow maps for engine system evaluation.
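
    A meanline estimate of the kind PUMPA automates can be sketched from the Euler pump equation with an empirical slip factor; the correlations PUMPA actually uses are not given in the abstract, so every coefficient and number below is an illustrative assumption:

        import numpy as np

        def meanline_head(omega, r2, b2, beta2b_deg, Q, slip=0.85, eta_hyd=0.88):
            """Euler head for a centrifugal stage with no inlet swirl:
            H = eta * U2 * Cu2 / g, with the tangential exit velocity
            reduced by the slip factor."""
            g = 9.81
            U2 = omega * r2                      # blade speed at exit [m/s]
            A2 = 2.0 * np.pi * r2 * b2           # exit flow area [m^2]
            Cm2 = Q / A2                         # meridional velocity [m/s]
            Cu2 = slip * U2 - Cm2 / np.tan(np.radians(beta2b_deg))
            return eta_hyd * U2 * Cu2 / g        # delivered head [m]

        # 3000 rpm, 0.1 m exit radius, 10 mm passage width, 20 L/s:
        print(meanline_head(omega=314.16, r2=0.10, b2=0.01,
                            beta2b_deg=25.0, Q=0.02))   # ~56 m of head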

  8. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    PubMed

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploiting the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
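
    The mechanics of a template-specific scan order are simple to illustrate (the template, block, and column-major order below are invented for the example; the paper's MD-templates are derived from the intra prediction modes):

        def scan(block, order):
            """Serialize a 4x4 residual block along a given scan order; a
            well-chosen order pushes nonzero coefficients to the front so
            the trailing zero run codes cheaply."""
            return [block[r][c] for r, c in order]

        # A vertical-mode residual tends to keep energy in the first column,
        # so a column-major scan groups the trailing zeros together.
        COLUMN_SCAN = [(r, c) for c in range(4) for r in range(4)]
        block = [[7, 0, 0, 0], [3, 0, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0]]
        print(scan(block, COLUMN_SCAN))   # [7, 3, 1, 0, 0, ...]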

  9. Short Duration Base Heating Test Improvements

    NASA Technical Reports Server (NTRS)

    Bender, Robert L.; Dagostino, Mark G.; Engel, Bradley A.; Engel, Carl D.

    1999-01-01

    Significant improvements have been made to a short duration space launch vehicle base heating test technique. This technique was first developed during the 1960's to investigate launch vehicle plume induced convective environments. Recent improvements include the use of coiled nitrogen buffer gas lines upstream of the hydrogen / oxygen propellant charge tubes, fast acting solenoid valves, stand alone gas delivery and data acquisition systems, and an integrated model design code. Technique improvements were successfully demonstrated during a 2.25% scale X-33 base heating test conducted in the NASA/MSFC Nozzle Test Facility in early 1999. Cost savings of approximately an order of magnitude over previous tests were realized due in large part to these improvements.

  10. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  11. Cross-Layer Design for Space-Time coded MIMO Systems over Rice Fading Channel

    NASA Astrophysics Data System (ADS)

    Yu, Xiangbin; Zhou, Tingting; Liu, Xiaoshuai; Yin, Xin

    A cross-layer design (CLD) scheme for space-time coded MIMO systems over the Rice fading channel is presented by combining adaptive modulation and automatic repeat request, and the corresponding system performance is investigated in detail. The fading gain switching thresholds subject to a target packet error rate (PER) and a fixed power constraint are derived. Using these results and the generalized Marcum Q-function, calculation formulae for the average spectrum efficiency (SE) and PER of the system with CLD are derived. As a result, closed-form expressions for average SE and PER are obtained. These expressions include some existing expressions for the Rayleigh channel as special cases. With these expressions, the system performance in the Rice fading channel is evaluated effectively. Numerical results verify the validity of the theoretical analysis. The results show that the system performance in the Rice channel improves as the Rice factor increases, and outperforms that in the Rayleigh channel.
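
    The generalized Marcum Q-function used in such derivations is available indirectly in SciPy through the noncentral chi-squared survival function, since Q_M(a, b) = P(X > b^2) for X distributed as chi-squared with 2M degrees of freedom and noncentrality a^2 (a standard identity; the switching thresholds themselves are paper-specific):

        from scipy.stats import ncx2

        def marcum_q(M, a, b):
            """Generalized Marcum Q-function Q_M(a, b) via the noncentral
            chi-squared survival function."""
            return ncx2.sf(b**2, df=2 * M, nc=a**2)

        print(marcum_q(1, 1.5, 2.0))   # probability mass above threshold b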

  12. A multi-detector neutron spectrometer with nearly isotropic response for environmental and workplace monitoring

    NASA Astrophysics Data System (ADS)

    Gómez-Ros, J. M.; Bedogni, R.; Moraleda, M.; Delgado, A.; Romero, A.; Esposito, A.

    2010-01-01

    This communication describes an improved design for a neutron spectrometer consisting of 6Li thermoluminescent dosemeters located at selected positions within a single moderating polyethylene sphere. The spatial arrangement of the dosemeters has been designed using the MCNPX Monte Carlo code to calculate the response matrix for 56 log-equidistant energies from 10^-9 to 100 MeV, seeking a configuration that yields a nearly isotropic response for neutrons in the energy range from thermal to 20 MeV. The feasibility of the proposed spectrometer and the isotropy of its response have been evaluated by simulating exposures to different reference and workplace neutron fields. The FRUIT code has been used for unfolding purposes. The results of the simulations as well as the experimental tests confirm the suitability of the prototype for environmental and workplace monitoring applications.

  13. Analysis of soft-decision FEC on non-AWGN channels.

    PubMed

    Cho, Junho; Xie, Chongjin; Winzer, Peter J

    2012-03-26

    Soft-decision forward error correction (SD-FEC) schemes are typically designed for additive white Gaussian noise (AWGN) channels. In a fiber-optic communication system, noise may be neither circularly symmetric nor Gaussian, thus violating an important assumption underlying SD-FEC design. This paper quantifies the impact of non-AWGN noise on SD-FEC performance for such optical channels. We use a conditionally bivariate Gaussian noise model (CBGN) to analyze the impact of correlations among the signal's two quadrature components, and assess the effect of CBGN on SD-FEC performance using the density evolution of low-density parity-check (LDPC) codes. On a CBGN channel generating severely elliptic noise clouds, it is shown that more than 3 dB of coding gain are attainable by utilizing correlation information. Our analyses also give insights into potential improvements of the detection performance for fiber-optic transmission systems assisted by SD-FEC.

  14. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
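
    The flavor of a user-defined data distribution can be conveyed by a minimal block mapping from a global index space onto locales (a Python toy, not Chapel's actual distribution interface):

        from dataclasses import dataclass

        @dataclass
        class BlockDist:
            """Block distribution: n global indices split into contiguous
            chunks across p locales."""
            n: int
            p: int

            def owner(self, i):
                """Locale owning global index i (consistent with the
                floor-based chunk boundaries below)."""
                return (i * self.p + self.p - 1) // self.n

            def local_range(self, loc):
                """Half-open range of global indices on locale loc."""
                return range(loc * self.n // self.p,
                             (loc + 1) * self.n // self.p)

        d = BlockDist(n=10, p=3)
        print([d.owner(i) for i in range(10)])  # [0,0,0,1,1,1,2,2,2,2]
        print(list(d.local_range(1)))           # [3, 4, 5]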

  15. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  16. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    NASA Astrophysics Data System (ADS)

    Marinkovic, Slavica; Guillemot, Christine

    2006-12-01

    Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.

  17. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  18. 76 FR 11432 - Coding of Design Marks in Registrations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... on the old paper search designations. The USPTO will continue to code all pending applications that... system, the Trademark Search Facility Classification Code Index (``TC Index''), stems from its... infrequent use of the TC Index codes in searches by the public; and its costliness to maintain, especially in...

  19. Layer-based buffer aware rate adaptation design for SHVC video streaming

    NASA Astrophysics Data System (ADS)

    Gudumasu, Srinivas; Hamza, Ahmed; Asbun, Eduardo; He, Yong; Ye, Yan

    2016-09-01

    This paper proposes a layer-based buffer-aware rate adaptation design which is able to avoid abrupt video quality fluctuation, reduce re-buffering latency and improve bandwidth utilization when compared to a conventional simulcast-based adaptive streaming system. The proposed adaptation design schedules DASH segment requests based on the estimated bandwidth, dependencies among video layers and layer buffer fullness. Scalable HEVC (SHVC) video coding is the latest state-of-the-art video coding technique and can alleviate various issues caused by simulcast-based adaptive video streaming. With scalable coded video streams, the video is encoded once into a number of layers representing different qualities and/or resolutions: a base layer (BL) and one or more enhancement layers (EL), each incrementally enhancing the quality of the lower layers. Such a layer-based coding structure allows fine-granularity rate adaptation for video streaming applications. Two video streaming use cases are presented in this paper. The first use case is to stream HD SHVC video over a wireless network where the available bandwidth varies, and a performance comparison between the proposed layer-based streaming approach and the conventional simulcast streaming approach is provided. The second use case is to stream 4K/UHD SHVC video over a hybrid access network that consists of a 5G millimeter-wave high-speed wireless link and a conventional wired or WiFi network. The simulation results verify that the proposed layer-based rate adaptation approach is able to utilize the bandwidth more efficiently. As a result, a more consistent viewing experience with higher-quality video content and minimal video quality fluctuations can be presented to the user.
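
    As a rough illustration of the scheduling idea (not the authors' algorithm), the sketch below requests the next DASH segment for the highest layer that fits the bandwidth estimate and whose lower-layer buffers are running ahead of it; all bitrates and buffer thresholds are invented for the example.

        # Minimal layer-based, buffer-aware segment scheduler (illustrative only).
        LAYER_KBPS = [1500, 2500, 4000]      # cumulative bitrate: BL, BL+EL1, BL+EL1+EL2
        BUFFER_TARGET_S = [10.0, 6.0, 4.0]   # per-layer buffer targets in seconds

        def next_request(est_kbps, buf_s):
            """Pick the layer whose next segment should be fetched.

            est_kbps: estimated available bandwidth; buf_s: buffered seconds per layer."""
            # Keep the base layer healthy first: it is what prevents re-buffering.
            if buf_s[0] < BUFFER_TARGET_S[0]:
                return 0
            # Otherwise fetch the highest enhancement layer that (a) fits the
            # bandwidth estimate and (b) is not buffered ahead of its lower layers.
            for layer in range(len(LAYER_KBPS) - 1, 0, -1):
                fits = LAYER_KBPS[layer] <= est_kbps
                deps_ok = all(buf_s[lower] >= buf_s[layer] for lower in range(layer))
                if fits and deps_ok and buf_s[layer] < BUFFER_TARGET_S[layer]:
                    return layer
            return 0   # default: keep extending the base-layer buffer

        print(next_request(est_kbps=3000, buf_s=[12.0, 5.0, 1.0]))   # -> 1 (fetch EL1)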

  20. Flow range enhancement by secondary flow effect in low solidity circular cascade diffusers

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Daisaku; Tun, Min Thaw; Mizokoshi, Kanata; Kishikawa, Daiki

    2014-08-01

    A high pressure ratio and a wide operating range are strongly demanded of compressors and blowers. The key design issue is to suppress flow separation at small flow rates without deteriorating efficiency at the design flow rate. Numerical simulation is very effective in the design procedure; however, its cost is generally high in practical design work, and it is difficult to identify the optimal design when many parameters are combined. Multi-objective optimization has been proposed for solving this problem in the practical design process. In this study, a Low Solidity circular cascade Diffuser (LSD) in a centrifugal blower is successfully designed by means of a multi-objective optimization technique. An optimization code with a meta-model-assisted evolutionary algorithm is used with the commercial CFD code ANSYS-CFX. The optimization aims at improving the static pressure coefficient at the design point and at the low flow rate condition while constraining the slope of the lift coefficient curve. Moreover, a small tip clearance of the LSD blade was applied in order to activate and stabilize the secondary flow effect at the small flow rate condition. The optimized LSD blade has an extended operating range of 114% toward smaller flow rates compared to the baseline design, without deteriorating the diffuser pressure recovery at the design point. The diffuser pressure rise and operating flow range of the optimized LSD blade are verified experimentally by an overall performance test. The detailed flow in the diffuser is also examined by means of a particle image velocimeter. Secondary flow is clearly captured by PIV and spreads across the whole LSD blade pitch. The optimized LSD blade shows a good improvement of the blade loading over the whole operating range, while at small flow rates the flow separation on the LSD blade is successfully suppressed by the secondary flow effect.

  1. Design geometry and design/off-design performance computer codes for compressors and turbines

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1995-01-01

    This report summarizes some NASA Lewis (i.e., government owned) computer codes capable of being used for airbreathing propulsion system studies to determine the design geometry and to predict the design/off-design performance of compressors and turbines. These are not CFD codes; velocity-diagram energy and continuity computations are performed fore and aft of the blade rows using meanline, spanline, or streamline analyses. Losses are provided by empirical methods. Both axial-flow and radial-flow configurations are included.
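
    At their core, meanline codes of this kind apply the Euler turbomachinery equation across each blade row; the short sketch below shows that central computation with invented station values, not numbers from the NASA codes themselves.

        # Euler turbomachinery equation: specific work = U2*Cu2 - U1*Cu1,
        # evaluated from meanline velocity triangles fore and aft of a rotor.
        def euler_specific_work(U1, Cu1, U2, Cu2):
            """Specific work (J/kg) from blade speeds U and tangential velocities Cu."""
            return U2 * Cu2 - U1 * Cu1

        # Illustrative axial-compressor rotor at constant mean radius (U1 == U2).
        U = 350.0                 # blade speed at mean radius, m/s (assumed)
        Cu1, Cu2 = 50.0, 180.0    # tangential flow velocities in and out, m/s (assumed)

        dh0 = euler_specific_work(U, Cu1, U, Cu2)
        print(f"stage specific work: {dh0 / 1000:.1f} kJ/kg")   # 45.5 kJ/kg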

  2. GCS component development cycle

    NASA Astrophysics Data System (ADS)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007, and since then it has been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in only one step the component is generated, compiled and deployed, ready to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns and software reuse, speeds up the deliverables of the software product, markedly improves the timescale, design consistency and design quality, and eliminates the future refactoring that would otherwise be required for the code.
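
    The specification-to-skeleton step can be pictured with a tiny template-based generator; the spec format, class name and base class below are invented for illustration and do not reflect the actual Device Component Framework interfaces.

        # Minimal template-based skeleton generator (hypothetical spec format;
        # "DeviceComponent" is a stand-in, not the real framework base class).
        SPEC = {
            "name": "M1ControlComponent",
            "methods": [("park", ["timeout: float"]),
                        ("moveTo", ["alt: float", "az: float"])],
        }

        def render(spec):
            """Emit the skeleton of a component class from its specification."""
            lines = [f"class {spec['name']}(DeviceComponent):"]
            lines.append('    """Auto-generated skeleton; fill in the method bodies."""')
            for name, args in spec["methods"]:
                lines.append(f"    def {name}(self, {', '.join(args)}):")
                lines.append("        raise NotImplementedError")
            return "\n".join(lines)

        print(render(SPEC))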

  3. Strategies to Improve Efficiency and Specificity of Degenerate Primers in PCR.

    PubMed

    Campos, Maria Jorge; Quesada, Alberto

    2017-01-01

    PCR with degenerate primers can be used to identify the coding sequence of an unknown protein or to detect a genetic variant within a gene family. These primers, which are complex mixtures of slightly different oligonucleotide sequences, can be optimized to increase the efficiency and/or specificity of PCR in the amplification of a sequence of interest by the introduction of mismatches with the target sequence and by balancing their position toward the primers' 5'- or 3'-ends. In this work, we explain in detail examples of rational primer design in two different applications, including the use of specific determinants at the 3'-end, to: (1) improve PCR efficiency with coding sequences for members of a protein family by full degeneration at a core box of conserved genetic information, with reduced degeneration at the 5'-end, and (2) optimize the specificity of allelic discrimination of closely related orthologues by 5'-end degenerate primers.
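
    For illustration, the degeneracy of such a primer follows directly from the standard IUPAC ambiguity codes; the sketch below counts and expands the oligonucleotide mixture encoded by an invented degenerate primer.

        from itertools import product

        # Standard IUPAC nucleotide ambiguity codes.
        IUPAC = {
            "A": "A", "C": "C", "G": "G", "T": "T",
            "R": "AG", "Y": "CT", "S": "GC", "W": "AT", "K": "GT", "M": "AC",
            "B": "CGT", "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT",
        }

        def degeneracy(primer):
            """Number of distinct oligonucleotides in the degenerate primer mixture."""
            d = 1
            for base in primer.upper():
                d *= len(IUPAC[base])
            return d

        def expand(primer):
            """All concrete sequences encoded by the degenerate primer."""
            return ["".join(p) for p in product(*(IUPAC[b] for b in primer.upper()))]

        primer = "GARTTYGGNCA"               # invented degenerate primer
        print(degeneracy(primer))            # 16
        print(expand(primer)[:3])            # first few concrete oligonucleotides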

  4. Design and Implementation of User-Created Information Systems with Mobile RFID

    NASA Astrophysics Data System (ADS)

    Lee, Jae Kwoen; Chin, Sungho; Kim, Hee Cheon; Chung, Kwang Sik

    RFID (Radio Frequency Identification) has usually been applied in the physical distribution field. Mobile RFID may be the only such technology with which we can lead the market. In Korea, ETRI has standardized MOBION (MOBile Identification ON), and the mobile-telecommunication companies have provided a trial mobile RFID service since 2006. In the trial mobile RFID services, a Broker model is used to decode the mobile RFID code. However, the Broker model has some problems, such as communication overhead caused by frequent ODS queries, limited service performance, and a limited variety of services for users. In this paper, we develop a device application capable of filtering codes unrelated to the RFID service, to improve decoding performance. We also improve performance by simplifying the connection process between the device application and the broker. Finally, we propose and develop a user-created information system to widely distribute the Mobile RFID service.

  5. Star 48 solid rocket motor nozzle analyses and instrumented firings

    NASA Technical Reports Server (NTRS)

    Porter, R. L.

    1986-01-01

    The analyses and testing performed by NASA in support of an expanded and improved nozzle design data base for use by the U.S. solid rocket motor industry are presented. A production nozzle with a history of one ground failure and two flight failures was selected for analysis and testing. The stress analysis was performed with the Champion computer code developed by the U.S. Navy. Several improvements were made to the code. Strain predictions were made and compared to test data. Two short-duration motor firings were conducted with highly instrumented nozzles. The first nozzle had 58 thermocouples, 66 strain gages, and 8 bondline pressure measurements. The second nozzle had 59 thermocouples, 68 strain measurements, and 8 bondline pressure measurements. Most of this instrumentation was on the nonmetallic parts and provided significantly more thermal and strain data on the nonmetallic components of a nozzle than had been accumulated in any solid rocket motor test to date.

  6. CEMCAN Software Enhanced for Predicting the Properties of Woven Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; DiCarlo, James A.

    2000-01-01

    Major advancements are needed in current high-temperature materials to meet the requirements of future space and aeropropulsion structural components. Ceramic matrix composites (CMC's) are one class of materials that are being evaluated as candidate materials for many high-temperature applications. Past efforts to improve the performance of CMC's focused primarily on improving the properties of the fiber, interfacial coatings, and matrix constituents as individual phases. Design and analysis tools must take into consideration the complex geometries, microstructures, and fabrication processes involved in these composites and must allow the composite properties to be tailored for optimum performance. Major accomplishments during the past year include the development and inclusion of woven CMC micromechanics methodology into the CEMCAN (Ceramic Matrix Composites Analyzer) computer code. The code enables one to calibrate a consistent set of constituent properties as a function of temperature with the aid of experimentally measured data.

  7. QRAC-the-Code: a comprehension monitoring strategy for middle school social studies textbooks.

    PubMed

    Berkeley, Sheri; Riccomini, Paul J

    2013-01-01

    Requirements for reading and ascertaining information from text increase as students advance through the educational system, especially in content-rich classes; hence, monitoring comprehension is especially important. However, this is a particularly challenging skill for many students who struggle with reading comprehension, including students with learning disabilities. A randomized pre-post experimental design was employed to investigate the effectiveness of a comprehension monitoring strategy (QRAC-the-Code) for improving the reading comprehension of 323 students in grades 6 and 7 in inclusive social studies classes. Findings indicated that both general education students and students with learning disabilities who were taught a simple comprehension monitoring strategy improved their comprehension of textbook content compared to students who read independently and noted important points. In addition, students in the comprehension monitoring condition reported using more reading strategies after the intervention. Implications for research and practice are discussed.

  8. NASA Lewis Stirling SPRE testing and analysis with reduced number of cooler tubes

    NASA Technical Reports Server (NTRS)

    Wong, Wayne A.; Cairelli, James E.; Swec, Diane M.; Doeberling, Thomas J.; Lakatos, Thomas F.; Madi, Frank J.

    1992-01-01

    Free-piston Stirling power converters are candidates for high capacity space power applications. The Space Power Research Engine (SPRE), a free-piston Stirling engine coupled with a linear alternator, is being tested at the NASA Lewis Research Center in support of the Civil Space Technology Initiative. The SPRE is used as a test bed for evaluating converter modifications which have the potential to improve the converter performance and for validating computer code predictions. Reducing the number of cooler tubes on the SPRE has been identified as a modification with the potential to significantly improve power and efficiency. Experimental tests designed to investigate the effects of reducing the number of cooler tubes on converter power, efficiency and dynamics are described. Presented are test results from the converter operating with a reduced number of cooler tubes and comparisons between this data and both baseline test data and computer code predictions.

  9. On the design of turbo codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    In this article, we design new turbo codes that can achieve near-Shannon-limit performance. The design criterion for random interleavers is based on maximizing the effective free distance of the turbo code, i.e., the minimum output weight of codewords due to weight-2 input sequences. An upper bound on the effective free distance of a turbo code is derived. This upper bound can be achieved if the feedback connection of the convolutional codes uses primitive polynomials. We review multiple turbo codes (parallel concatenation of q convolutional codes), which increase the so-called 'interleaving gain' as q and the interleaver size increase, and a suitable decoder structure derived from an approximation to the maximum a posteriori probability decision rule. We develop new rate 1/3, 2/3, 3/4, and 4/5 constituent codes to be used in the turbo encoder structure. These codes, with from 2 to 32 states, are designed using primitive polynomials. The resulting turbo codes have rates b/n (b = 1, 2, 3, 4 and n = 2, 3, 4, 5, 6), and include random interleavers for better asymptotic performance. These codes are suitable for deep-space communications with low throughput and for near-Earth communications where high throughput is desirable. The performance of these codes is within 1 dB of the Shannon limit at a bit-error rate of 10^-6 for throughputs from 1/15 up to 4 bits/s/Hz.
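
    The effective free distance criterion is easy to probe numerically. The sketch below brute-forces the minimum weight-2 codeword weight of a single rate-1/2 recursive systematic constituent encoder; the (7,5) octal generator pair is a textbook example chosen for illustration, not one of the codes designed in this article.

        def rsc_step(state, u, fb, ff, m):
            """One step of a rate-1/2 RSC encoder.

            state: previous feedback bits (a[t-1], ..., a[t-m]); fb/ff: feedback and
            feedforward polynomials with bit i = coefficient of D^i."""
            a = u
            for i in range(1, m + 1):
                if (fb >> i) & 1:
                    a ^= state[i - 1]
            p = a if (ff & 1) else 0
            for i in range(1, m + 1):
                if (ff >> i) & 1:
                    p ^= state[i - 1]
            return p, (a,) + state[:-1]

        def effective_free_distance(fb=0o7, ff=0o5, m=2, max_gap=64, flush=512):
            """Min output weight over terminating weight-2 inputs (1s at 0 and gap)."""
            best = None
            for gap in range(1, max_gap + 1):
                state, weight, terminated = (0,) * m, 2, False   # 2 systematic 1s
                for t in range(gap + flush):
                    u = 1 if t in (0, gap) else 0
                    p, state = rsc_step(state, u, fb, ff, m)
                    weight += p
                    if t >= gap and state == (0,) * m:
                        terminated = True
                        break
                if terminated:   # non-terminating gaps have unbounded output weight
                    best = weight if best is None else min(best, weight)
            return best

        print(effective_free_distance())   # 6 for the (7,5) constituent code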

  10. Rocketdyne/Westinghouse nuclear thermal rocket engine modeling

    NASA Technical Reports Server (NTRS)

    Glass, James F.

    1993-01-01

    The topics are presented in viewgraph form and include the following: systems approach needed for nuclear thermal rocket (NTR) design optimization; generic NTR engine power balance codes; Rocketdyne nuclear thermal system code; software capabilities; steady state model; NTR engine optimizer code logic; reactor power calculation logic; sample multi-component configuration; NTR design code output; generic NTR code at Rocketdyne; Rocketdyne NTR model; and nuclear thermal rocket modeling directions.

  11. Review Of Piping And Pressure Vessel Code Design Criteria. Technical Report 217.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    1969-04-18

    This Technical Report summarizes a review of the design philosophies and criteria of the ASME Boiler and Pressure Vessel Code and the USASI Code for Pressure Piping. It traces the history of the Codes since their inception and critically reviews their present status. Recommendations are made concerning the applicability of the Codes to the special needs of LMFBR liquid sodium piping.

  12. Perspectives On Dilution Jet Mixing

    NASA Technical Reports Server (NTRS)

    Holdeman, J. D.; Srinivasan, R.

    1990-01-01

    NASA recently completed a program of measurements and modeling of the mixing of transverse jets with a ducted crossflow, motivated by the need to design or tailor the temperature pattern at the combustor exit in gas turbine engines. The objectives of the program were to identify the dominant physical mechanisms governing mixing, to extend empirical models to provide near-term predictive capability, and to compare numerical code calculations with data to guide future analysis improvement efforts.

  13. NASIS data base management system - IBM 360/370 OS MVT implementation. 1: Installation standards

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The installation standards for the NASA Aerospace Safety Information System (NASIS) data base management system are presented. The standard approach to preparing systems documentation and the program design and coding rules and conventions are outlined. Included are instructions for preparing all major specifications and suggestions for improving the quality and efficiency of the programming task.

  14. Improvements to the Sandia CTH Hydro-Code to Support Blast Analysis and Protective Design of Military Vehicles

    DTIC Science & Technology

    2014-04-15


  15. Coding and Analysing Behaviour Strategies of Instructors in University Science Laboratories to Improve Science Teachers Training

    ERIC Educational Resources Information Center

    Ajaja, Patrick Osawaru

    2013-01-01

    The intention of this study was to determine how science instructors in the university laboratories spend time on instruction. The study was guided by three research questions and two hypotheses tested at the 0.05 level of significance. The study employed a non-participant observation case study design. 48 instructors teaching lower and higher levels…

  16. New imaging systems in nuclear medicine. Final report, January 1, 1993--December 31, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    The aim of this program has been to improve the performance of positron emission tomography (PET) to achieve high resolution with high sensitivity. Towards this aim, the authors have carried out the following studies: (1) explored new techniques for the detection of annihilation radiation, including new detector materials and system geometries; specific areas studied include factors related to the resolution and sensitivity of PET instrumentation (geometry, detection materials and coding) and techniques to improve image quality by use of depth of interaction and increased sampling; (2) completed much of the final testing of PCR-II, an analog-coded cylindrical positron tomograph, developed and constructed during the current funding period; (3) developed the design of a positron microtomograph with millimeter resolution for quantitative studies in small animals; a single-slice version of this device has been designed and studied by use of computer simulation; (4) continued and expanded the program of biological studies in animal models. Current studies have included imaging of animal models of Parkinson's and Huntington's disease and cancer. These studies have included new radiopharmaceuticals and techniques involving molecular biology.

  17. Video Monitoring a Simulation-Based Quality Improvement Program in Bihar, India.

    PubMed

    Dyer, Jessica; Spindler, Hilary; Christmas, Amelia; Shah, Malay Bharat; Morgan, Melissa; Cohen, Susanna R; Sterne, Jason; Mahapatra, Tanmay; Walker, Dilys

    2018-04-01

    Simulation-based training has become an accepted clinical training andragogy in high-resource settings with its use increasing in low-resource settings. Video recordings of simulated scenarios are commonly used by facilitators. Beyond using the videos during debrief sessions, researchers can also analyze the simulation videos to quantify technical and nontechnical skills during simulated scenarios over time. Little is known about the feasibility and use of large-scale systems to video record and analyze simulation and debriefing data for monitoring and evaluation in low-resource settings. This manuscript describes the process of designing and implementing a large-scale video monitoring system. Mentees and Mentors were consented and all simulations and debriefs conducted at 320 Primary Health Centers (PHCs) were video recorded. The system design, number of video recordings, and inter-rater reliability of the coded videos were assessed. The final dataset included a total of 11,278 videos. Overall, a total of 2,124 simulation videos were coded and 183 (12%) were blindly double-coded. For the double-coded sample, the average inter-rater reliability (IRR) scores were 80% for nontechnical skills, and 94% for clinical technical skills. Among 4,450 long debrief videos received, 216 were selected for coding and all were double-coded. Data quality of simulation videos was found to be very good in terms of recorded instances of "unable to see" and "unable to hear" in Phases 1 and 2. This study demonstrates that video monitoring systems can be effectively implemented at scale in resource limited settings. Further, video monitoring systems can play several vital roles within program implementation, including monitoring and evaluation, provision of actionable feedback to program implementers, and assurance of program fidelity.
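
    The reported inter-rater reliability figures are simple agreement percentages; for orientation, the sketch below computes percent agreement for double-coded items, together with Cohen's kappa as the usual chance-corrected alternative. The rating data are invented, not drawn from the study.

        from collections import Counter

        def percent_agreement(r1, r2):
            """Fraction of double-coded items on which two raters agree."""
            return sum(a == b for a, b in zip(r1, r2)) / len(r1)

        def cohens_kappa(r1, r2):
            """Chance-corrected agreement between two raters."""
            n = len(r1)
            po = percent_agreement(r1, r2)
            c1, c2 = Counter(r1), Counter(r2)
            pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
            return (po - pe) / (1 - pe)

        # Invented example: binary skill codes from two raters on 10 clips.
        rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
        rater2 = [1, 1, 0, 1, 1, 1, 1, 0, 0, 1]
        print(percent_agreement(rater1, rater2))        # 0.8
        print(round(cohens_kappa(rater1, rater2), 2))   # 0.52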

  18. Designing and Testing a Blended Wing Body with Boundary Layer Ingestion Nacelles

    NASA Technical Reports Server (NTRS)

    Carter, Melissa B.; Campbell, Richard L.; Pendergraft, Odis C.; Underwood, Pamela J.; Friedman, Douglas M.; Serrano, Leonel

    2006-01-01

    A knowledge-based aerodynamic design method coupled with an unstructured grid Navier-Stokes flow solver was used to improve the propulsion/airframe integration for a Blended Wing Body with boundary-layer ingestion nacelles. A new zonal design capability was used that significantly reduced the time required to achieve a successful design for each nacelle and the elevon between them. A wind tunnel model was built with interchangeable parts reflecting the baseline and redesigned configurations and tested in the National Transonic Facility (NTF). Most of the testing was done at the cruise design conditions (Mach number = 0.85, Reynolds number = 75 million). In general, the predicted improvements in forces and moments as well as the changes in wing pressures between the baseline and redesign were confirmed by the wind tunnel results. The effectiveness of elevons between the nacelles was also predicted surprisingly well considering the crudeness in the modeling of the control surfaces in the flow code.

  19. Multi-level of Fidelity Multi-Disciplinary Design Optimization of Small, Solid-Propellant Launch Vehicles

    NASA Astrophysics Data System (ADS)

    Roshanian, Jafar; Jodei, Jahangir; Mirshams, Mehran; Ebrahimi, Reza; Mirzaee, Masood

    A new automated multi-level-of-fidelity Multi-Disciplinary Design Optimization (MDO) methodology has been developed at the MDO Laboratory of K.N. Toosi University of Technology. This paper explains the new design approach through the formulation of the disciplinary modules developed. A conceptual design for a small, solid-propellant launch vehicle was considered with a two-level-of-fidelity structure. Low- and medium-level-of-fidelity disciplinary codes were developed and linked. Appropriate design and analysis codes were defined according to their effect on the conceptual design process. Simultaneous optimization of the launch vehicle was performed at the discipline level and the system level, using propulsion, aerodynamics, structure and trajectory disciplinary codes. To reach the minimum launch weight, the low LoF code first searches the whole design space to meet the mission requirements. The medium LoF code then receives the output of the low LoF code and returns a value near the optimum launch weight with more detail and higher fidelity.

  20. Off-design computer code for calculating the aerodynamic performance of axial-flow fans and compressors

    NASA Technical Reports Server (NTRS)

    Schmidt, James F.

    1995-01-01

    An off-design axial-flow compressor code is presented and is available from COSMIC for predicting the aerodynamic performance maps of fans and compressors. Steady axisymmetric flow is assumed, and the aerodynamic solution reduces to solving the two-dimensional flow field in the meridional plane. A streamline curvature method is used for calculating this flow field outside the blade rows. The code allows for bleed flows, and the first five stators can be reset for each rotational speed, capabilities which are necessary for large multistage compressors. The accuracy of the off-design performance predictions depends upon the validity of the flow loss and deviation correlation models. These empirical correlations for flow loss and deviation are used to model real flow effects, and the off-design code will compute through small reverse-flow regions. The input to this off-design code is fully described, and a user's example case for a two-stage fan is included with complete input and output data sets. A comparison of the off-design code predictions with experimental data is also included, which generally shows good agreement.

  1. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements, to ensure it interfaces with DMS messages and data transfers relating to BWAS operations.

  2. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the IEEE VHDL 2004 effort, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.
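
    The bit- and cycle-accurate fixed-point behavior such a library provides can be mimicked in a few lines; the sketch below models a saturating Q1.15 multiply, a common DSP format chosen here for illustration rather than taken from the library itself.

        Q = 15                               # Q1.15: 1 sign bit, 15 fractional bits
        LO, HI = -(1 << Q), (1 << Q) - 1     # representable integer range

        def to_q15(x: float) -> int:
            """Quantize a float in [-1, 1) to a Q1.15 integer with saturation."""
            return max(LO, min(HI, round(x * (1 << Q))))

        def q15_mul(a: int, b: int) -> int:
            """Bit-accurate saturating Q1.15 multiply, as fixed-point hardware does."""
            prod = (a * b) >> Q              # full-precision product, rescaled
            return max(LO, min(HI, prod))    # saturate instead of wrapping

        a, b = to_q15(0.75), to_q15(-0.5)
        print(q15_mul(a, b) / (1 << Q))      # -0.375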

  3. 76 FR 53912 - FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ...] FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code Names... replaced non- informative code names with descriptive identifiers on its public database of products that... on our public database with non-informative code names. After careful consideration of this matter...

  4. Full-custom design of split-set data weighted averaging with output register for jitter suppression

    NASA Astrophysics Data System (ADS)

    Jubay, M. C.; Gerasta, O. J.

    2015-06-01

    A full-custom design of an element-selection algorithm, named Split-set Data Weighted Averaging (SDWA), is implemented in a 90 nm CMOS Synopsys technology library. SDWA is applied to seven unit elements (3-bit) using a thermometer-coded input. Split-set DWA is an improved DWA algorithm that provides randomization along with long-term equal element usage. Randomization and equal element usage improve the spectral response of the unit elements through a higher spurious-free dynamic range (SFDR), without significantly degrading the signal-to-noise ratio (SNR). Being full-custom, the design is brought to the transistor level and a custom chip layout is provided, with a total area of 0.3 mm², a power consumption of 0.566 mW, and simulation at a 50 MHz clock frequency. In this implementation, SDWA is derived and further improved by introducing a register at the output that suppresses the jitter introduced at the final stage by switching loops and successive delays.
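
    For context, the sketch below implements conventional (rotating-pointer) data weighted averaging for seven thermometer-coded unit elements; it illustrates the baseline algorithm that SDWA refines, not the split-set variant itself.

        N = 7   # unit elements, enough for a 3-bit thermometer code

        class DWA:
            """Conventional data weighted averaging: rotate through the elements so
            each is used equally often in the long run (first-order mismatch shaping)."""
            def __init__(self, n=N):
                self.n = n
                self.ptr = 0

            def select(self, code):
                """Return the indices of the `code` elements switched on this cycle."""
                chosen = [(self.ptr + i) % self.n for i in range(code)]
                self.ptr = (self.ptr + code) % self.n
                return chosen

        dwa = DWA()
        for code in [3, 5, 2, 7]:
            print(code, dwa.select(code))
        # Elements are picked cyclically, so usage evens out across the array.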

  5. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  6. Design applications for supercomputers

    NASA Technical Reports Server (NTRS)

    Studerus, C. J.

    1987-01-01

    The complexity of codes for solutions of real aerodynamic problems has progressed from simple two-dimensional models to three-dimensional inviscid and viscous models. As the algorithms used in the codes increased in accuracy, speed and robustness, the codes were steadily incorporated into standard design processes. The highly sophisticated codes, which provide solutions to the truly complex flows, require computers with large memory and high computational speed. The advent of high-speed supercomputers, such that the solutions of these complex flows become more practical, permits the introduction of the codes into the design system at an earlier stage. The results of several codes which either were already introduced into the design process or are rapidly in the process of becoming so, are presented. The codes fall into the area of turbomachinery aerodynamics and hypersonic propulsion. In the former category, results are presented for three-dimensional inviscid and viscous flows through nozzle and unducted fan bladerows. In the latter category, results are presented for two-dimensional inviscid and viscous flows for hypersonic vehicle forebodies and engine inlets.

  7. Study on Design of High Efficiency and Light Weight Composite Propeller Blade for a Regional Turboprop Aircraft

    NASA Astrophysics Data System (ADS)

    Kong, Changduk; Lee, Kyungsun

    2013-03-01

    In this study, the aerodynamic and structural design of a composite propeller blade for a regional turboprop aircraft is performed. The thin, wide-chord propeller blade of a high-speed turboprop aircraft must have adequate strength and stiffness to carry various kinds of loads, such as high aerodynamic bending and twisting moments and centrifugal forces. Therefore a skin-spar-foam sandwich structure using high-strength, high-stiffness carbon/epoxy composite materials is used to improve lightness. A specific design procedure is proposed in this work as follows: firstly, the aerodynamic configuration design, which is acceptable for the design requirements, is carried out using an in-house code developed by the authors; secondly, the structural design loads are determined through an aerodynamic load case analysis; thirdly, the spar flange and the skin are preliminarily sized in consideration of the major bending moments and shear forces, using both the netting rule and the rule of mixtures; and finally, a stress analysis is performed to confirm structural safety and stability using the commercial finite element analysis code MSC NASTRAN/PATRAN. An additional analysis is performed to confirm structural safety under bird strike impact on the blade during flight operation, using the commercial code ANSYS. To realize the proposed propeller design, prototype blades are manufactured by the following procedure: the carbon/epoxy composite fabric prepregs are laid up for skin and spar on a mold using the hand lay-up method and consolidated at a proper temperature and vacuum in the oven. To finalize the structural design, a full-scale static structural test is performed under the simulated aerodynamic loads using a three-point loading method. From the experimental results, it is found that the designed blade has good structural integrity, and the measured results agree well with the analytical results.
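
    As a hedged illustration of the two sizing rules named above (all material values and loads are invented, not taken from the study): the rule of mixtures estimates the ply-level longitudinal modulus, while the netting rule sizes the laminate thickness on the assumption that the fibers alone carry the direct load.

        def rule_of_mixtures(Ef, Em, Vf):
            """Longitudinal modulus of a unidirectional ply (fibers and matrix in parallel)."""
            return Vf * Ef + (1.0 - Vf) * Em

        def netting_thickness(N, sigma_f_allow, Vf):
            """Netting rule: the fibers carry the entire running load N (N/m);
            the matrix contribution is ignored entirely."""
            return N / (sigma_f_allow * Vf)

        Ef, Em, Vf = 230e9, 3.5e9, 0.6       # carbon fiber, epoxy, fiber volume fraction
        E1 = rule_of_mixtures(Ef, Em, Vf)
        t = netting_thickness(N=4.0e5, sigma_f_allow=1.5e9, Vf=Vf)   # 400 kN/m load
        print(f"E1 = {E1 / 1e9:.0f} GPa, required thickness = {t * 1e3:.2f} mm")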

  8. Creep and Creep-Fatigue Crack Growth at Structural Discontinuities and Welds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. F. W. Brust; Dr. G. M. Wilkowski; Dr. P. Krishnaswamy

    2010-01-27

    The ASME Subsection NH high-temperature design procedure does not admit crack-like defects into the structural components. The US NRC identified the lack of treatment of crack growth within NH as a limitation of the code, and thus this effort was undertaken. The effort is broken into two parts. Part 1, summarized here, involved examining all high-temperature creep-fatigue crack growth codes in use today and choosing from these a methodology appropriate for possible implementation within NH. The second part of this task, which has just started, is to develop design rules for possible implementation within NH. This second part is a challenge, since all codes require step-by-step analysis procedures in order to assess the crack growth and life of the component; simple rules for design do not exist in any code at present. The codes examined in this effort included R5, RCC-MR (A16), BS 7910, API 579, and ATK (and some lesser-known codes). There are several reasons that the capability for assessing cracks in high-temperature nuclear components is desirable. These include: (1) Some components that are part of GEN IV reactors may have geometries with sharp corners - which are essentially cracks. Design of these components within the traditional ASME NH procedure is quite challenging, and it is natural to ensure adequate design life by modeling these features as cracks within a creep-fatigue crack growth procedure. (2) Workmanship flaws in welds sometimes occur and are accepted in some ASME code sections. It can be convenient to consider these as flaws when making a design life assessment. (3) Non-destructive evaluation (NDE) and inspection methods after fabrication are limited in the size of the crack or flaw that can be detected. It is often convenient to perform a life assessment using a flaw of the maximum size that can elude detection. (4) Flaws observed using in-service detection methods often need to be addressed as plants age; shutdown inspection intervals can only be designed using creep and creep-fatigue crack growth techniques. (5) Crack growth procedures can aid in examining the seriousness of creep damage in structural components: how cracks grow can be used to assess margins on components and support further safe operation. After examining the pros and cons of all these methods, the R5 code was chosen as the most up-to-date and validated high-temperature creep and creep-fatigue code currently used in the world. R5 is considered the leader because the code: (1) has well-established and validated rules, (2) has a team of experts continually improving and updating it, (3) has software that can be used by designers, (4) has extensive validation in many parts, with available data from BE resources as well as input from Imperial College's database, and (5) was specifically developed for use in nuclear plants. R5 was specifically developed for use in the gas-cooled nuclear reactors which operate in the UK, and much of the experience is based on materials and temperatures found in these reactors. If the next-generation advanced reactors to be built in the US used these same materials within the same temperature ranges, then R5 may be appropriate for consideration of direct implementation within ASME code NH or Section XI. However, until more verification and validation of these creep-fatigue crack growth rules for the specific materials and temperatures to be used in the GEN IV reactors is complete, ASME should consider delaying this implementation. With this in mind, it is this author's opinion that R5 methods are the best available for code use today. The focus of this work was to examine the literature for creep and creep-fatigue crack growth procedures that are well established in codes in other countries and to choose a procedure to consider for implementation into ASME NH. It is very important to recognize that all creep and creep-fatigue crack growth procedures that are part of high-temperature design codes are related and very similar. This effort made no attempt to develop a new creep-fatigue crack growth predictive methodology; examination of current procedures was the only goal. The uncertainties in the R5 crack growth methods and recommendations for further work are also summarized here.

  9. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during a worst-case accident scenario which is the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine; in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations including the mobilization of activation products is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and determine the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is a better design from an accident safety standpoint than the PCA-Li tokamak. It is also found that doses derived from temperature-dependent mobilization data are different than those predicted using set mobilization categories such as those that involve Piet fractions. This demonstrates the need for more experimental data on fusion materials. The possibility for future improvements and modifications to BESAFE II is discussed in Chapter 6, for example, by adding additional environmental indices such as a waste disposal index. The biggest improvement to BESAFE II would be an increase in the database of activation product mobilization for a larger spectrum of fusion reactor materials. The ultimate goal we have is for BESAFE II to become part of a systems design program which would include economic factors and allow both safety and the cost of electricity to influence design.

  10. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate-model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification, a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite-rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of a variance-based non-parametric approach and partial correlation coefficients are conducted using data available from surrogate models of the objectives and the multi-objective optima, to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary, the present dissertation offers insight into an improved coarse-to-fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct a design optimization study and related sensitivity analyses for a given design problem.
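
    LSE is positioned against classical grid-extrapolation practice; for orientation, the sketch below shows that classical baseline, a Richardson-style estimate of the observed order of accuracy from three systematically refined grids. The solution values and refinement ratio are invented, and this is the standard textbook procedure, not the LSE method of the dissertation.

        import math

        def observed_order(f1, f2, f3, r):
            """Observed order of accuracy from fine (f1), medium (f2) and coarse (f3)
            grid solutions with a constant refinement ratio r."""
            return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

        def richardson_extrapolate(f1, f2, p, r):
            """Estimate the grid-converged value from the two finest solutions."""
            return f1 + (f1 - f2) / (r**p - 1)

        # Invented drag-coefficient values on three grids, refinement ratio 2.
        f1, f2, f3, r = 0.02534, 0.02560, 0.02662, 2.0
        p = observed_order(f1, f2, f3, r)
        f_exact = richardson_extrapolate(f1, f2, p, r)
        print(f"observed order ~ {p:.2f}; extrapolated value ~ {f_exact:.5f}")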

  11. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural network technology. The artificial intelligence approach of a neural net does not solve mathematical equations: by using the knowledge stored in the synaptic weights of a properly trained neural net, the code is able to unfold the neutron spectrum and simultaneously calculate 15 dosimetric quantities, needing as input only the count rates measured with a Bonner sphere system. The two codes are similar in that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment; both unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. They differ in that NSDUAZ was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine calculates the responses of 7 IAEA survey meters using fluence-to-dose conversion coefficients. NSDann uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, the neural net approach makes it possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called the Neutron Spectrometry and Dosimetry computer tool was designed, and the results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.

  12. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the responses of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural network technology. The artificial intelligence approach of a neural net does not solve mathematical equations: by using the knowledge stored in the synaptic weights of a properly trained neural net, the code is able to unfold the neutron spectrum and simultaneously calculate 15 dosimetric quantities, needing as input only the count rates measured with a Bonner sphere system. The two codes are similar in that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment; both unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. They differ in that NSDUAZ was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine calculates the responses of 7 IAEA survey meters using fluence-to-dose conversion coefficients. NSDann uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, the neural net approach makes it possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called the Neutron Spectrometry and Dosimetry computer tool was designed, and the results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.
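
    Neither SPUNIT's exact update nor the trained network is reproduced here, but the flavor of iterative few-channel unfolding can be shown with a generic multiplicative (MLEM-style) update; the response matrix and counts below are invented. Each iteration rescales the spectrum bins so the predicted Bonner-sphere counts approach the measured ones.

        import numpy as np

        rng = np.random.default_rng(1)

        n_spheres, n_bins = 7, 60                         # 7 Bonner spheres, 60 energy bins
        R = rng.uniform(0.1, 1.0, (n_spheres, n_bins))    # invented response matrix
        true_phi = rng.uniform(0.0, 1.0, n_bins)          # invented "true" spectrum
        counts = R @ true_phi                             # measured count rates

        phi = np.ones(n_bins)                             # flat initial guess spectrum
        for _ in range(500):
            # Multiplicative update: rescale bins by the ratio of measured to
            # predicted counts, back-projected through the response matrix.
            ratio = counts / (R @ phi)
            phi *= (R.T @ ratio) / R.sum(axis=0)

        print(np.round((R @ phi - counts) / counts, 3))   # relative residuals ~ 0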

  13. New cochlear implant research coding strategy based on the MP3(000™) strategy to reintroduce the virtual channel effect.

    PubMed

    Neben, Nicole; Lenarz, Thomas; Schuessler, Mark; Harpel, Theo; Buechner, Andreas

    2013-05-01

    Results for speech recognition in noise tests when using a new research coding strategy designed to introduce the virtual channel effect provided no advantage over MP3(000™). Although statistically significant smaller just noticeable differences (JNDs) were obtained, the findings for pitch ranking proved to have little clinical impact. The aim of this study was to explore whether modifications to MP3000 by including sequential virtual channel stimulation would lead to further improvements in hearing, particularly for speech recognition in background noise and in competing-talker conditions, and to compare results for pitch perception and melody recognition, as well as informally collect subjective impressions on strategy preference. Nine experienced cochlear implant subjects were recruited for the prospective study. Two variants of the experimental strategy were compared to MP3000. The study design was a single-blinded ABCCBA cross-over trial paradigm with 3 weeks of take-home experience for each user condition. Comparing results of pitch-ranking, a significantly reduced JND was identified. No significant effect of coding strategy on speech understanding in noise or competing-talker materials was found. Melody recognition skills were the same under all user conditions.

  14. Architecture design and performance evaluation of multigranularity optical networks based on optical code division multiplexing

    NASA Astrophysics Data System (ADS)

    Huang, Shaowei; Baba, Ken-Ichi; Murata, Masayuki; Kitayama, Ken-Ichi

    2006-12-01

    In traditional lambda-based multigranularity optical networks, a lambda is always treated as the basic routing unit, resulting in low wavelength utilization. On the basis of optical code division multiplexing (OCDM) technology, a novel OCDM-based multigranularity optical cross-connect (MG-OXC) is proposed. Compared with the traditional lambda-based MG-OXC, its switching capability is extended to support fiber switching, waveband switching, lambda switching, and OCDM switching. In a network composed of OCDM-based MG-OXCs, a single wavelength can be shared by distinct label switched paths (LSPs), called OCDM-LSPs, and OCDM-LSP switching can be implemented in the optical domain. To improve network flexibility for OCDM-LSP provisioning, two kinds of switches enabling hybrid optical code (OC)-wavelength conversion are designed. Simulation results indicate that a blocking-probability reduction of two orders of magnitude can be obtained by deploying only five OCs on a single wavelength. Furthermore, compared with the time-division-multiplexing LSP (TDM-LSP), owing to asynchronous accessibility and OC conversion, OCDM-LSPs permit a simpler switch architecture and achieve better blocking performance.

  15. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  16. Aircraft Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve the software and foster its research and development. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  17. Clinical code set engineering for reusing EHR data for research: A review.

    PubMed

    Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels

    2017-06-01

    The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets, which needs to be addressed to ensure the reliability and credibility of studies that use code sets. Our aim was to review the methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research, supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management were reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets, and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected.
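
    As a minimal sketch of two of the common approaches counted above (synonym search plus hierarchy expansion), the following uses an invented five-code mini-terminology; real work would target Read, SNOMED CT, or ICD dictionaries, and clinician review would still follow (Python).

      # Invented mini-terminology; prefix-coded like Read codes, where
      # descendants share the parent code as a prefix.
      TERMS = {
          "C10":   "Diabetes mellitus",
          "C10E":  "Type 1 diabetes mellitus",
          "C10F":  "Type 2 diabetes mellitus",
          "C10F3": "Type 2 diabetes mellitus with retinopathy",
          "H33":   "Asthma",
      }
      SYNONYMS = ["diabetes", "diabetic"]

      def build_code_set(terms, synonyms):
          # Step 1: lexical search of term descriptions with a synonym list.
          seeds = {code for code, desc in terms.items()
                   if any(s in desc.lower() for s in synonyms)}
          # Step 2: hierarchy expansion via the prefix structure.
          return {code for code in terms
                  if any(code.startswith(seed) for seed in seeds)}

      print(sorted(build_code_set(TERMS, SYNONYMS)))   # all but the asthma code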

  18. 13 CFR 134.304 - Commencement of appeals from size determinations and NAICS code designations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    13 CFR 134.304 (2010): Title 13, Business Credit and Assistance; Small Business Administration; Appeals From Size Determinations and NAICS Code Designations; § 134.304, Commencement of appeals from size determinations and NAICS code designations.

  19. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    Model-based program design is the clearest and most direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern applications can quickly make the workload unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules so that multiple developers can work in parallel on the same project; however, this introduces new potential for errors. The fluidity, reliability and robustness of the code rely on the programmers' ability to communicate their methods to one another; furthermore, multiple programmers bring potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, MathWorks has implemented an auto-coding feature that allows programmers to design their algorithms through models and diagrams in the graphical programming environment Simulink, letting the designer visually determine what the hardware is to do. From there, the auto-coding feature handles converting the project into another programming language. This approach allows the designer to see clearly how the software will direct the hardware without needing to interpret large amounts of code. In addition, it speeds up the programming process, minimizing the man-hours spent on a single project and thus reducing the chance of human error as well as project turnover time. One project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. This causes issues whenever the tool needs to be debugged, as code generation must occur after each edit to any part of the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available on optimizing Simulink designs to produce efficient and reliable C++ code, little research has been made public on how to reduce code generation time. It is therefore of interest to develop insight into what makes code generation times so significant, and to determine whether architecture guidelines or a desirable auto-coding configuration set could help streamline this step of the design process for particular applications. To address the issue, the Simulink Coder was studied at a foundational level. For each component type made available by the software, the features, auto-code generation time, and format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in automating sequential builds to ensure accurate data were obtained. Next, the Ramses model was examined to determine its composition and the types of technologies used in it. This enabled the development of a model that uses similar technologies but takes a fraction of the time to auto-code, reducing the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data on where to search for bottlenecks in the Ramses model.
The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, several automation tools to assist in analyzing the model, and a reference document offering insight into the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current Ramses model that should be investigated further. Several skills had to be built up over the course of the internship project. First and foremost, my Simulink skills have improved drastically, as much of my prior experience had been modeling electronic circuits rather than software. Furthermore, I am now comfortable working with the Simulink auto-coder, a tool I had never used until this summer; it also tested my critical thinking and C++ knowledge, as I had to interpret the generated C++ code and work out how the Simulink model affected it. I came into the internship with a solid understanding of MATLAB code but had done very little in using it to automate tasks, particularly Simulink tasks; along the same lines, I had rarely used shell scripts to automate and interface with programs, which I gained a fair amount of experience with this summer, including how to use regular expressions. Lastly, soft skills are an area everyone can continuously improve; having never worked with NASA engineers, who seem to me a completely different breed from the commercial electronics engineers I am used to, I learned to draw on the wealth of knowledge present at JSC. I wish I had come into the internship knowing how helpful everyone in my branch would be, as I would have picked up on this sooner. I hope that the strong foundation in Simulink I gained this summer will open the opportunity to return to this project, or to other opportunities within the division. The idea of leaving a project I devoted ten weeks to is a hard one to cope with, so the chance to pick up where I left off sounds appealing; alternatively, I am interested to see whether there are any openings in the future that would let me work on a project more in line with my research in estimation algorithms. Regardless, this summer has been a milestone in my professional career, and I hope it has started a long-term relationship between JSC and myself. I enjoy the thought of building on my experience here over future summers while I complete my PhD at Missouri University of Science and Technology.
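
    In the spirit of the sequential-build automation described above, here is a hypothetical Python harness that repeatedly runs a code-generation command and logs wall-clock durations so build-time changes can be compared across model revisions; the MATLAB invocation shown is a placeholder, not the actual Ramses build.

      import csv
      import subprocess
      import time

      BUILD_CMD = ["matlab", "-batch", "slbuild('myModel')"]   # placeholder command

      def timed_builds(cmd, runs=3, log_path="build_times.csv"):
          """Run the build command `runs` times, logging each duration."""
          with open(log_path, "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["run", "seconds", "returncode"])
              for i in range(1, runs + 1):
                  start = time.perf_counter()
                  result = subprocess.run(cmd, capture_output=True, text=True)
                  elapsed = time.perf_counter() - start
                  writer.writerow([i, f"{elapsed:.1f}", result.returncode])
                  print(f"run {i}: {elapsed:.1f}s (rc={result.returncode})")

      if __name__ == "__main__":
          timed_builds(BUILD_CMD)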

  20. Monte Carlo simulations in Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Loudos, George K.

    2007-11-01

    Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are an important tool for the optimal design of detector systems, and they have also demonstrated potential to improve image quality and acquisition protocols. Many general purpose codes (MCNP, Geant4, etc.) and dedicated codes (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis is given to the GATE toolkit; the GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges, including the simulation of clinical studies and dosimetry applications.
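
    The sampling step underlying such photon-transport codes can be shown in a few lines. The sketch below draws free path lengths from the exponential attenuation law and checks the result against Beer-Lambert, using an approximate attenuation coefficient for water at 511 keV; real codes such as GATE additionally track energy, scatter, and full 3-D geometry (Python).

      import math
      import random

      MU = 0.096         # approx. linear attenuation coefficient of water
                         # at 511 keV, 1/cm (textbook-level assumption)
      THICKNESS = 10.0   # slab thickness, cm

      def fraction_transmitted(n_photons=1_000_000):
          """Count photons whose sampled free path exceeds the slab."""
          escaped = 0
          for _ in range(n_photons):
              path = -math.log(1.0 - random.random()) / MU   # exponential sample
              if path > THICKNESS:
                  escaped += 1
          return escaped / n_photons

      mc = fraction_transmitted()
      print(f"Monte Carlo: {mc:.4f}   Beer-Lambert: {math.exp(-MU * THICKNESS):.4f}")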
