Science.gov

Sample records for consequence code system

  1. MELCOR Accident Consequence Code System (MACCS)

    SciTech Connect

    Rollstin, J.A.; Chanin, D.I.; Jow, H.N.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projections, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management.

  2. MELCOR Accident Consequence Code System (MACCS)

    SciTech Connect

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T.; Rollstin, J.A.; Chanin, D.I.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs.

  3. MELCOR Accident Consequence Code System (MACCS)

    SciTech Connect

    Chanin, D.I.; Sprung, J.L.; Ritchie, L.T.; Jow, Hong-Nian

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previous CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. This document, Volume 1, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems.

  4. A review of the Melcor Accident Consequence Code System (MACCS): Capabilities and applications

    SciTech Connect

    Young, M.

    1995-02-01

    MACCS was developed at Sandia National Laboratories (SNL) under U.S. Nuclear Regulatory Commission (NRC) sponsorship to estimate the offsite consequences of potential severe accidents at nuclear power plants (NPPs). MACCS was publicly released in 1990. MACCS was developed to support the NRC's probabilistic safety assessment (PSA) efforts. PSA techniques can provide a measure of the risk of reactor operation. PSAs are generally divided into three levels: level 1 identifies potential plant damage states that lead to core damage and their associated probabilities; level 2 models damage progression and containment strength to establish fission-product release categories; and level 3 evaluates the potential offsite consequences of radiological releases and the probabilities associated with those consequences. MACCS was designed as a tool for level 3 PSA analysis. MACCS performs probabilistic health and economic consequence assessments of hypothetical accidental releases of radioactive material from NPPs. MACCS includes models for atmospheric dispersion and transport, wet and dry deposition, the probabilistic treatment of meteorology, environmental transfer, countermeasure strategies, dosimetry, health effects, and economic impacts. MACCS is designed to run on the 386/486 PC, VAX/VMS, IBM RISC System/6000, Sun SPARC, and Cray UNICOS systems. This paper provides an overview of MACCS and reviews some of its applications, international collaborations that have involved MACCS, current developmental efforts, and future directions.
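    The level-3 rollup described in this abstract can be caricatured in a few lines: risk is the frequency-weighted sum of consequences over release categories. This is an illustrative sketch only; the category names and numbers below are hypothetical, not MACCS inputs or NUREG-1150 values.

```python
# Illustrative level-3 PSA risk rollup: each release category has a
# frequency (per reactor-year) and an estimated offsite consequence.
# All names and numbers are hypothetical.
release_categories = [
    {"name": "early containment failure", "frequency": 1e-6, "consequence": 5000.0},
    {"name": "late containment failure",  "frequency": 5e-6, "consequence": 800.0},
    {"name": "containment bypass",        "frequency": 2e-7, "consequence": 20000.0},
]

def expected_risk(categories):
    """Frequency-weighted sum of consequences (risk per reactor-year)."""
    return sum(c["frequency"] * c["consequence"] for c in categories)

print(expected_risk(release_categories))  # 0.013
```

    In a real level-3 analysis the "consequence" of each category is itself a distribution produced by sampling weather sequences, which is where a code like MACCS does its work.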

  5. Evaluation of severe accident risks: Quantification of major input parameters: MACCS (MELCOR Accident Consequence Code System) input

    SciTech Connect

    Sprung, J.L.; Jow, H-N.; Rollstin, J.A.; Helton, J.C.

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.

  6. Review of the chronic exposure pathways models in MACCS (MELCOR Accident Consequence Code System) and several other well-known probabilistic risk assessment models

    SciTech Connect

    Tveten, U.

    1990-06-01

    The purpose of this report is to document the results of the work performed by the author in connection with the following task, performed for the US Nuclear Regulatory Commission (USNRC), Office of Nuclear Regulatory Research, Division of Systems Research: MACCS Chronic Exposure Pathway Models: Review the chronic exposure pathway models implemented in the MELCOR Accident Consequence Code System (MACCS) and compare those models to the chronic exposure pathway models implemented in similar codes developed in countries that are members of the OECD. The chronic exposures concerned are via the terrestrial food pathways, the water pathways, the long-term groundshine pathway, and the inhalation of resuspended radionuclides pathway. The USNRC has indicated during discussions of the task that the major effort should be spent on the terrestrial food pathways. There is one chapter for each of the categories of chronic exposure pathways listed above.

  7. Myelography CPT Coding Updates: Effects of 4 New Codes and Unintended Consequences.

    PubMed

    Chokshi, F H; Tu, R K; Nicola, G N; Hirsch, J A

    2016-06-01

    The Current Procedural Terminology of the American Medical Association has recently introduced coding changes for myelography with the introduction of new bundled codes. The aim of this review was to help neuroradiologists understand these code changes and their unintended consequences and to discuss various scenarios in which permutations of various codes could occur in clinical practice.

  8. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  9. RISKIND: An enhanced computer code for National Environmental Policy Act transportation consequence analysis

    SciTech Connect

    Biwer, B.M.; LePoire, D.J.; Chen, S.Y.

    1996-03-01

    The RISKIND computer program was developed for the analysis of radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel (SNF) or other radioactive materials. The code is intended to provide scenario-specific analyses when evaluating alternatives for environmental assessment activities, including those for major federal actions involving radioactive material transport as required by the National Environmental Policy Act (NEPA). As such, rigorous procedures have been implemented to enhance the code's credibility and strenuous efforts have been made to enhance ease of use of the code. To increase the code's reliability and credibility, a new version of RISKIND was produced under a quality assurance plan that covered code development and testing, and a peer review process was conducted. During development of the new version, the flexibility and ease of use of RISKIND were enhanced through several major changes: (1) a Windows™ point-and-click interface replaced the old DOS menu system, (2) the remaining model input parameters were added to the interface, (3) databases were updated, (4) the program output was revised, and (5) on-line help has been added. RISKIND has been well received by users and has been established as a key component in radiological transportation risk assessments through its acceptance by the U.S. Department of Energy community in recent environmental impact statements (EISs) and its continued use in the current preparation of several EISs.

  10. Software Systems: Consequence versus Functionality

    SciTech Connect

    Berg, Ray; Winter, Victor L.

    1999-08-05

    The purpose of this panel is to present different perspectives and opinions regarding the issues surrounding why software should or shouldn't be entrusted with critical (high consequence) functionality.

  11. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  12. A coded tracking telemetry system

    USGS Publications Warehouse

    Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.; Amlaner, Charles J.

    1989-01-01

    We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that decodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a microcomputer. This system enables continuous monitoring of infrequent signals (e.g., one per minute or one per hour), thus extending operational life or allowing size reduction of the transmitter compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.
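    The receiver-side filtering described above (accept only pre-programmed transmitter codes and record the time of each reception) can be sketched as follows; the code values and timestamp format are hypothetical:

```python
# Sketch of the receiver logic: accept only pre-programmed codes and log
# the time of each reception. Code values and log format are hypothetical.
ACCEPTED_CODES = {0x3A7, 0x1F2, 0x2B9}   # one unique code per transmitter

reception_log = []

def on_signal(code, timestamp):
    """Record a reception only if the decoded code is pre-programmed."""
    if code in ACCEPTED_CODES:
        reception_log.append((code, timestamp))
        return True
    return False   # unknown transmitter: ignored

on_signal(0x3A7, "1989-06-01T06:00:00")
on_signal(0x999, "1989-06-01T06:00:30")  # not registered, dropped
print(len(reception_log))  # 1
```

    Because many transmitters share one frequency, the set membership test is what lets a single receiver track many individuals without extra frequency allocations.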

  13. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.
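    As background for the convolutional codes discussed in this abstract, a minimal rate-1/2 convolutional encoder can be sketched as follows. The generator polynomials (7 and 5 in octal, constraint length 3) are a standard textbook choice, not the specific codes studied for Goddard:

```python
# Minimal rate-1/2 convolutional encoder: two output bits per input bit,
# each the parity of a tapped shift register. Generators 7 and 5 (octal)
# are a common textbook example, not the codes from the report.
G1, G2 = 0b111, 0b101   # generator tap patterns

def conv_encode(bits):
    """Encode a list of 0/1 bits; returns twice as many output bits."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # length-3 shift register
        out.append(bin(state & G1).count("1") % 2)  # parity over G1 taps
        out.append(bin(state & G2).count("1") % 2)  # parity over G2 taps
    return out

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 1, 0, 0, 0, 0, 1]
```

    A Viterbi decoder for such a code searches the same shift-register state trellis; the "unit-memory" codes mentioned above differ in that the register advances a whole byte at a time.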

  14. High Consequence System Surety process description

    SciTech Connect

    Randall, G.T.

    1995-09-01

    This report documents work in progress accomplished prior to programmatic changes that prevented bringing this effort to its intended conclusion. The High Consequence System Surety (HCS²) project pulls together a multi-disciplinary team to integrate the elements of surety (safety, security, control, reliability, and quality) into a new, encompassing process. The benefit of using this process is enhanced surety in the design of a high consequence system through an up-front, designed-in approach. This report describes the integrated, high consequence surety process and includes a hypothetical example to illustrate the process.

  15. High Consequence System Surety. Issue 1

    SciTech Connect

    Randall, G.T.

    1994-07-11

    High Consequence System Surety is an ongoing project at Sandia National Laboratories. This project pulls together a multi-disciplinary team to integrate the elements of surety into an encompassing process. The surety process will be augmented and validated by applying it to an automated system handling a critical nuclear weapon component at the Mason & Hanger Pantex Plant. This paper presents the development to date of an integrated, high consequence surety process.

  16. The EGS5 Code System

    SciTech Connect

    Hirayama, Hideo; Namito, Yoshihito; Bielajew, Alex F.; Wilderman, Scott J.; Nelson, Walter R.

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial revision than code patching. Moreover, the arcane MORTRAN3 [48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet retain, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report [126] (SLAC-265) and the old EGS3 report [61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.

  17. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    SciTech Connect

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  18. Power System Optimization Codes Modified

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1999-01-01

    A major modification of and addition to existing Closed Brayton Cycle (CBC) space power system optimization codes was completed. These modifications relate to the global minimum-mass search driver programs, which contain three nested iteration loops: an iteration on cycle temperature ratio and three separate pressure-ratio iteration loops, one for maximizing thermodynamic efficiency, one for minimizing radiator area, and a final loop for minimizing overall power system mass. Using the method of steepest ascent, the code sweeps through the pressure-ratio space repeatedly, each time with smaller iteration step sizes, so that the three optimum pressure ratios can be obtained to any desired accuracy for each of the objective functions referred to above (i.e., maximum thermodynamic efficiency, minimum radiator area, and minimum system mass). Two separate options for the power system heat source are available: (1) a nuclear fission reactor, provided with a radiation shield composed of a lithium hydride (LiH) neutron shield and a tungsten (W) gamma shield; suboptions can be used to select the type of reactor (fast-spectrum liquid-metal-cooled or epithermal high-temperature gas reactor (HTGR)); and (2) a solar heat source, which includes a parabolic concentrator and a heat receiver for raising the temperature of the recirculating working fluid. A useful feature of the code modifications is that key cycle parameters are displayed, including the overall system specific mass in kilograms per kilowatt and the system specific power in watts per kilogram, as the results for each temperature ratio are computed. When the minimum-mass temperature ratio is encountered, a message is printed out. Several levels of detailed information on cycle state points, subsystem mass results, and radiator temperature profiles are stored for this temperature-ratio condition and can be displayed or printed by users.
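    The repeated-sweep search described above (scan a parameter range on a grid, keep the best point, then re-sweep a narrower window with a smaller step) can be sketched generically; the objective function below is a stand-in, not the CBC system model:

```python
# Generic shrinking-step sweep: repeatedly scan [lo, hi] on a grid,
# keep the best candidate, and narrow the window around it.
# The toy objective stands in for efficiency / radiator area / mass.
def sweep_maximize(f, lo, hi, steps=20, rounds=5):
    best = lo
    for _ in range(rounds):
        step = (hi - lo) / steps
        candidates = [lo + i * step for i in range(steps + 1)]
        best = max(candidates, key=f)                       # best grid point
        lo, hi = max(lo, best - step), min(hi, best + step)  # narrow window
    return best

# Toy objective with its maximum at x = 3.0 (a stand-in "pressure ratio")
x = sweep_maximize(lambda v: -(v - 3.0) ** 2, 0.0, 10.0)
print(round(x, 3))  # ≈ 3.0
```

    Running this loop once per objective function, inside an outer loop over temperature ratio, mirrors the nested structure the abstract describes.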

  19. Computer access security code system

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computers and computer-controlled entry situations comprises a plurality of subsets of alphanumeric characters disposed in random order in matrices of at least two dimensions, forming theoretical rectangles, cubes, etc. When access is desired, at least one pair of previously unused character subsets, not found in the same row or column of the matrix, is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of the subsets that complete the rectangle (or parallelepiped) whose opposite corners were defined by the first group of codes. Once used, subsets are not used again, in order to defeat unauthorized access by eavesdropping and the like.
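    The challenge-response scheme of this abstract can be sketched for a small two-dimensional matrix: the challenge names two cells in distinct rows and columns, and the valid response is the pair of cells completing the rectangle. The matrix contents below are arbitrary:

```python
import random

# Sketch of the matrix challenge-response scheme: the challenge is two
# cells in different rows and columns; the valid response is the two
# cells at the opposite corners of the implied rectangle.
MATRIX = [
    ["A", "B", "C"],
    ["D", "E", "F"],
    ["G", "H", "I"],
]

def challenge(rng=random):
    """Pick two cells in distinct rows and distinct columns."""
    r1, r2 = rng.sample(range(3), 2)
    c1, c2 = rng.sample(range(3), 2)
    return (r1, c1), (r2, c2)

def valid_response(chal, resp):
    """The response must name the two opposite corners of the rectangle."""
    (r1, c1), (r2, c2) = chal
    expected = {MATRIX[r1][c2], MATRIX[r2][c1]}
    return set(resp) == expected

chal = ((0, 0), (1, 1))                  # cells "A" and "E"
print(valid_response(chal, ["B", "D"]))  # True: "B","D" complete the square
```

    A full implementation would also track used subsets and refuse to reuse them, which is what defeats replay of an eavesdropped exchange.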

  20. Software development methodology for high consequence systems

    SciTech Connect

    Baca, L.S.; Bouchard, J.F.; Collins, E.W.; Eisenhour, M.; Neidigk, D.D.; Shortencarier, M.J.; Trellue, P.A.

    1997-10-01

    This document describes a Software Development Methodology for High Consequence Systems. A High Consequence System is a system whose failure could lead to serious injury, loss of life, destruction of valuable resources, unauthorized use, damaged reputation or loss of credibility or compromise of protected information. This methodology can be scaled for use in projects of any size and complexity and does not prescribe any specific software engineering technology. Tasks are described that ensure software is developed in a controlled environment. The effort needed to complete the tasks will vary according to the size, complexity, and risks of the project. The emphasis of this methodology is on obtaining the desired attributes for each individual High Consequence System.

  1. Human factors in high consequence manufacturing systems

    SciTech Connect

    Forsythe, C.; Grose, E.

    1997-11-01

    A high consequence system is often defined as one in which the potential exists for severe or catastrophic accidents. Familiar examples include nuclear power plants, airline and other mass transportation, dams and reservoirs, and large-scale food processing. Many manufacturing systems also qualify as high consequence systems. Much of the authors' experience with high consequence systems derives from work associated with the surveillance and dismantlement of nuclear weapons for the US Department of Energy. With such operations, there exists a risk of high explosive detonation accompanied by radiological dispersal and, potentially, nuclear detonation. Analyses of major industrial accidents such as Three Mile Island, Chernobyl, and Bhopal have revealed that these incidents were not attributable to a single event or direct cause, but were the result of multiple factors that combined to create a condition ripe for an accident. In each case, human error was a critical factor contributing to the accident. Consequently, many authors have emphasized the need for greater appreciation of systematic factors and, in particular, human activities. This paper discusses approaches used in hazard analysis of US nuclear weapons operations to assess risk associated with human factors.

  2. System Safety and the Unintended Consequence

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2012-01-01

    The analysis and identification of risks often result in design changes or modification of operational steps. This paper identifies the potential for unintended consequences as an overlooked result of these changes. Examples of societal changes such as prohibition, regulatory changes including mandating lifeboats on passenger ships, and engineering proposals or design changes to automobiles and spaceflight hardware are used to demonstrate that the System Safety Engineer must be cognizant of the potential for unintended consequences as a result of an analysis. Conclusions of the report indicate the need for additional foresight and consideration of the potential effects of analysis-driven design, processing changes, and/or operational modifications.

  3. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
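    The decoder interaction described above (the outer decoder is tried after each inner iteration, and decoding stops early on outer success) can be sketched abstractly. The inner and outer decoders below are stubs that merely model improving reliability, not real turbo or Reed-Solomon decoding:

```python
# Abstract sketch of outer/inner decoder interaction with early stopping.
# The decoders are hypothetical stubs, not turbo/RS implementations.
def decode_frame(inner_iterate, outer_decode, max_iters=8):
    """Return (success, iterations_used)."""
    soft_info = None
    for it in range(1, max_iters + 1):
        soft_info = inner_iterate(soft_info)   # one inner (turbo) iteration
        ok, _ = outer_decode(soft_info)        # reliability-based outer attempt
        if ok:
            return True, it                    # early termination saves delay
    return False, max_iters

# Stubs: inner output "improves" each pass; outer succeeds past a threshold.
def inner_iterate(prev):
    return 0.2 if prev is None else prev + 0.2

def outer_decode(reliability):
    return reliability >= 0.55, None

print(decode_frame(inner_iterate, outer_decode))  # (True, 3)
```

    The point of the structure is visible even in the stub: frames that clean up quickly exit after few inner iterations, so average decoding delay drops without changing worst-case behavior.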

  4. Correlation and synchrony transfer in integrate-and-fire neurons: basic properties and consequences for coding.

    PubMed

    Shea-Brown, Eric; Josić, Kresimir; de la Rocha, Jaime; Doiron, Brent

    2008-03-14

    We study how pairs of neurons transfer correlated input currents into correlated spikes. Over rapid time scales, correlation transfer increases with both spike time variability and rate; the dependence on variability disappears at large time scales. This persists for a nonlinear membrane model and for heterogeneous cell pairs, but strong nonmonotonicities follow from refractory effects. We present consequences for population coding and for the encoding of time-varying stimuli.

  5. The CALOR93 code system

    SciTech Connect

    Gabriel, T.A.

    1993-12-31

    The purpose of this paper is to describe a program package, CALOR93, that has been developed to design and analyze different detector systems, in particular, calorimeters which are used in high energy physics experiments to determine the energy of particles. One's ability to design a calorimeter to perform a certain task can have a strong influence upon the validity of experimental results. The validity of the results obtained with CALOR93 has been verified many times by comparison with experimental data. The codes (HETC93, SPECT93, LIGHT, EGS4, MORSE, and MICAP) are quite generalized and detailed enough so that any experimental calorimeter setup can be studied. Due to this generalization, some software development is necessary because of the wide diversity of calorimeter designs.

  6. Tandem Mirror Reactor Systems Code (Version I)

    SciTech Connect

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.; Barrett, R.J.; Gorker, G.E.; Spampinato, P.T.; Bulmer, R.H.; Dorn, D.W.; Perkins, L.J.; Ghose, S.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  7. [Some consequences of the application of the new Swiss penal code on legal psychiatry].

    PubMed

    Gasser, Jacques; Gravier, Bruno

    2007-09-19

    The new text of the Swiss penal code, which entered into effect at the beginning of 2007, has many implications for the practice of psychiatrists who prepare expert assessments in criminal cases or who are engaged in administering court-ordered treatment. The most notable consequences of this text are, on the one hand, a new definition of the concept of criminal irresponsibility, which is no longer necessarily tied to a psychiatric diagnosis, and, on the other hand, a new definition of the legal measures that courts can impose to prevent further punishable acts, which appreciably modifies the role of psychiatrists in questions linking psychiatric care and social control.

  8. Performance of convolutionally coded unbalanced QPSK systems

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Yuen, J. H.

    1980-01-01

    An evaluation is presented of the performance of three representative convolutionally coded unbalanced quadriphase-shift-keying (UQPSK) systems in the presence of noisy carrier reference and crosstalk. The use of a coded UQPSK system for transmitting two telemetry data streams with different rates and different powers has been proposed for the Venus Orbiting Imaging Radar mission. Analytical expressions for bit error rates in the presence of a noisy carrier phase reference are derived for three representative cases: (1) I and Q channels are coded independently; (2) the I channel is coded and the Q channel is uncoded; and (3) I and Q channels are coded by a common rate 1/2 code. For rate 1/2 convolutional codes, QPSK modulation can be used to reduce the bandwidth requirement.

  9. HERCULES: A Pattern Driven Code Transformation System

    SciTech Connect

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing; Ilsche, Thomas; Joubert, Wayne; Graham, Richard L

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts, and compiler plugins, to provide the scientist with an environment for quickly implementing code transformations that suit their needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.

  10. The design of wavefront coded imaging system

    NASA Astrophysics Data System (ADS)

    Lan, Shun; Cen, Zhaofeng; Li, Xiaotong

    2016-10-01

    Wavefront coding is a method of extending the depth of field that combines optical design and signal processing. Using the optical design software ZEMAX, we designed a practical wavefront coded imaging system based on a conventional Cooke triplet. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. Therefore, a series of similarly blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront coded imaging system is nearly invariant with misfocus and has no regions of zeros, so all object information can be recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as the optimization variables. Compared to a conventional optical system, the wavefront coded imaging system obtains better-quality images over a range of object distances. Some deficiencies appear in the restored images due to the influence of the digital filtering algorithm; these are also analyzed in this paper. The depth of field of the designed wavefront coded imaging system is about 28 times larger than that of the initial optical system, while keeping high optical power and resolution at the image plane.
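
    The defocus invariance described above can be illustrated with a one-dimensional pupil model; the cubic phase and defocus coefficients below are arbitrary illustrative values, not the parameters of the designed Cooke-triplet system:

```python
import numpy as np

def mtf_1d(nu, alpha, psi, n=4000):
    """Magnitude of the 1-D optical transfer function at spatial
    frequency nu (0 <= nu <= 2) for a pupil of half-width 1 carrying
    a cubic phase alpha*x**3 (the wavefront-coding mask) plus a
    defocus aberration psi*x**2.  The OTF is computed as the
    normalized autocorrelation of the pupil function."""
    half = 1.0 - nu / 2.0                # half-width of the overlap region
    x = np.linspace(-half, half, n)
    dx = x[1] - x[0]
    def phase(u):
        return alpha * u**3 + psi * u**2
    integrand = np.exp(1j * (phase(x + nu / 2.0) - phase(x - nu / 2.0)))
    return abs(np.sum(integrand) * dx) / 2.0   # pupil width 2, so MTF(0) = 1
```

    With the cubic mask (e.g. alpha = 30) the MTF at mid frequencies changes very little between psi = 0 and psi = 10, whereas the unmasked pupil (alpha = 0) loses most of its contrast with the same defocus; this is the property the design exploits before digital restoration.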

  11. Arabic Natural Language Processing System Code Library

    DTIC Science & Technology

    2014-06-01

    Adelphi, MD 20783-1197. This technical note provides a brief description of a Java library for Arabic natural language processing (NLP) containing code for training and applying the Arabic NLP system described in the paper "A Cross-Task Flexible Transition Model for Arabic Tokenization, Affix...".

  12. Code Usage Analysis System (CUAS)

    NASA Technical Reports Server (NTRS)

    Horsley, P. H.; Oliver, J. D.

    1976-01-01

    A set of computer programs is offered to aid a user in evaluating performance of an application program. The system provides reports of subroutine usage, program errors, and segment loading which occurred during the execution of an application program. It is presented in support of the development and validation of the space vehicle dynamics project.

  13. Implementing a modular system of computer codes

    SciTech Connect

    Vondy, D.R.; Fowler, T.B.

    1983-07-01

    A modular computation system has been developed for nuclear reactor core analysis. The codes can be applied repeatedly in blocks without extensive user input data, as needed for reactor history calculations. The primary control options over the calculational paths and task assignments within the codes are blocked separately from other instructions, admitting ready access by user input instruction or direction from automated procedures and promoting flexible and diverse applications at minimum cost. Data interfacing is done under formal specifications, with data files manipulated by an informed manager. This report emphasizes the system aspects and the development of useful capability, and is intended to be informative and useful to anyone developing a modular code system of comparable sophistication. Overall, this report summarizes, in a general way, the many factors and difficulties that are faced in making reactor core calculations, based on the experience of the authors. It provides the background on which work on HTGR reactor physics is being carried out.

  14. Bar-code automated waste tracking system

    SciTech Connect

    Hull, T.E.

    1994-10-01

    The Bar-Code Automated Waste Tracking System was designed as a site-specific program with a general-purpose application for transportability to other facilities. The system is user-friendly, totally automated, and incorporates the use of a drive-up window close to the areas dealing with container preparation, delivery, pickup, and disposal. The system features "stop-and-go" operation rather than long, tedious, error-prone manual entry. The system is designed for automation but allows operators to concentrate on proper handling of waste while maintaining manual entry of data as a backup. A large wall plaque filled with bar-code labels is used to input specific details about any movement of waste.

  15. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  16. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of the Systems Assessment Capability performs many functions.

  17. Bilingual processing of ASL-English code-blends: The consequences of accessing two lexical representations simultaneously

    PubMed Central

    Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886

  18. Safety assessment of high consequence robotics system

    SciTech Connect

    Robinson, D.G.; Atcitty, C.B.

    1996-08-01

    This paper outlines the use of a failure modes and effects analysis for the safety assessment of a robotic system being developed at Sandia National Laboratories. The robotic system, the weigh and leak check system, is to replace a manual process for weighing and leak checking of nuclear materials at the DOE Pantex facility. Failure modes and effects analyses were completed for the robotics process to ensure that the safety goals for the system have been met. Due to the flexible nature of the robot configuration, a traditional failure modes and effects analysis (FMEA) was not applicable. In addition, the primary focus of safety assessments of robotic systems has been the protection of personnel in the immediate area; in this application, the safety analysis must account for the sensitivities of the payload as well as the traditional issues. A unique variation on the classical FMEA was developed that provides an organized and quite effective tool for assuring that safety is adequately considered during the development of the robotic system. The fundamental aspects of the approach are outlined in the paper.

  19. BCH codes for large IC random-access memory systems

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.

    1983-01-01

    In this report some shortened BCH codes for possible applications to large IC random-access memory systems are presented. These codes are given by their parity-check matrices. Encoding and decoding of these codes are discussed.
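
    As a toy illustration of decoding from a parity-check matrix, the sketch below uses the (7,4) Hamming code, the simplest single-error-correcting binary BCH code; the shortened BCH codes in the report are larger, but a memory system decodes them on the same syndrome principle:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code, the simplest
# single-error-correcting binary BCH code.  Column i holds the
# binary representation of i + 1, so a nonzero syndrome directly
# names the position of a single flipped bit.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome_decode(r):
    """Correct up to one bit error in the received 7-bit word r."""
    s = H @ r % 2
    pos = int(s[0] * 4 + s[1] * 2 + s[2])  # read the syndrome as an index
    if pos:
        r = r.copy()
        r[pos - 1] ^= 1                    # flip the bit the syndrome points at
    return r
```

    The multiple-error-correcting codes in the report require a more involved syndrome-to-error-pattern mapping, but the check H r = 0 and the correction step have the same structure.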

  20. LPGS. Code System for Calculating Radiation Exposure

    SciTech Connect

    White, J.E.; Eckerman, K.F.

    1983-01-01

    LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-d) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  1. [Remodeling of Cardiovascular System: Causes and Consequences].

    PubMed

    Lopatina, E V; Kipenko, A V; Penniyaynen, V A; Pasatetckaia, N A; Tsyrline, V A

    2016-01-01

    The literature and our own data suggest a regulatory action of a number of biologically active substances (catecholamines, cardiac glycosides, β-blockers, angiotensin-converting-enzyme inhibitors) on the growth and proliferation of heart cells. Using organotypic tissue culture, it has been shown that the basis of this regulation is the ability of the test substances, through receptor- or transducer-mediated signaling, to modulate the function of Na⁺, K⁺-ATPase. In rats subjected to blockade of the sympathetic nervous system in the prenatal period, there is a delay in the development of vascular smooth muscle in the late postnatal period. The relationship between vascular remodeling and contractile activity is described. It appears that one of the causes of high blood pressure is remodeling of the cardiovascular system, which precedes the development of hypertension.

  2. Analog system for computing sparse codes

    DOEpatents

    Rozell, Christopher John; Johnson, Don Herrick; Baraniuk, Richard Gordon; Olshausen, Bruno A.; Ortman, Robert Lowell

    2010-08-24

    A parallel dynamical system for computing sparse representations of data, i.e., representations in which the data are fully described in terms of a small number of non-zero code elements, and for reconstructing compressively sensed images. The system is based on the principles of thresholding and local competition, and solves a family of sparse approximation problems corresponding to various sparsity metrics. The system utilizes Locally Competitive Algorithms (LCAs), in which nodes in a population continually compete with neighboring units using (usually one-way) lateral inhibition to calculate coefficients representing an input in an overcomplete dictionary.
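
    The thresholding-and-competition dynamics can be sketched in a few lines of Python; this is a generic digital simulation of an LCA with a soft threshold, not the patented analog circuit, and the dictionary, threshold, and step sizes are arbitrary choices:

```python
import numpy as np

def lca_sparse_code(Phi, x, lam=0.1, tau=10.0, n_steps=500, dt=1.0):
    """Sketch of a Locally Competitive Algorithm: leaky-integrator
    nodes are driven by Phi.T @ x, inhibit one another through the
    lateral weights Phi.T @ Phi - I, and the thresholded states a
    form a sparse code of x in the dictionary Phi."""
    b = Phi.T @ x                              # feed-forward drive
    G = Phi.T @ Phi - np.eye(Phi.shape[1])     # lateral inhibition weights
    u = np.zeros(Phi.shape[1])                 # internal node states
    def soft(u):                               # soft-threshold nonlinearity
        return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
    for _ in range(n_steps):
        u += (dt / tau) * (b - u - G @ soft(u))
    return soft(u)
```

    At a fixed point the thresholded states solve the corresponding sparse approximation problem (with the soft threshold, the l1-penalized, LASSO-style problem); other threshold functions yield the other sparsity metrics mentioned above.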

  3. Comparison of MACCS users calculations for the international comparison exercise on probabilistic accident consequence assessment code, October 1989--June 1993

    SciTech Connect

    Neymotin, L.

    1994-04-01

    Over the past several years, the OECD/NEA and CEC sponsored an international program intercomparing a group of six probabilistic consequence assessment (PCA) codes designed to simulate the health and economic consequences of atmospheric releases of radioactive materials following severe accidents at nuclear power plants (NPPs): ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this effort, two separate groups performed similar calculations using the MACCS and COSYMA codes. Results produced in the MACCS Users Group (Greece, Italy, Spain, and USA) calculations and their comparison are contained in the present report. Version 1.5.11.1 of the MACCS code was used for the calculations. Good agreement between the results produced in the four participating calculations was reached, with the exception of the results related to the ingestion pathway dose predictions. The main reason for the scatter in those particular results is attributed to the lack of a straightforward implementation of the specifications for agricultural production and countermeasures criteria provided for the exercise. A significantly smaller scatter in predictions of other consequences was successfully explained by differences in meteorological files and weather sampling, grids, rain distance intervals, dispersion model options, and population distributions.

  4. Suppressing feedback in a distributed video coding system by employing real field codes

    NASA Astrophysics Data System (ADS)

    Louw, Daniel J.; Kaneko, Haruhiko

    2013-12-01

    Single-view distributed video coding (DVC) is a video compression method that allows for the computational complexity of the system to be shifted from the encoder to the decoder. The reduced encoding complexity makes DVC attractive for use in systems where processing power or energy use at the encoder is constrained, for example, in wireless devices and surveillance systems. One of the biggest challenges in implementing DVC systems is that the required rate must be known at the encoder. The conventional approach is to use a feedback channel from the decoder to control the rate. Feedback channels introduce their own difficulties such as increased latency and buffering requirements, which makes the resultant system unsuitable for some applications. Alternative approaches, which do not employ feedback, suffer from either increased encoder complexity due to performing motion estimation at the encoder, or an inaccurate rate estimate. Inaccurate rate estimates can result in a reduced average rate-distortion performance, as well as unpleasant visual artifacts. In this paper, the authors propose a single-view DVC system that does not require a feedback channel. The consequences of inaccuracies in the rate estimate are addressed by using codes defined over the real field and a decoder employing successive refinement. The result is a codec with performance that is comparable to that of a feedback-based system at low rates without the use of motion estimation at the encoder or a feedback path. The disadvantage of the approach is a reduction in average rate-distortion performance in the high-rate regime for sequences with significant motion.

  5. Control of Chromatin Structure by Spt6: Different Consequences in Coding and Regulatory Regions▿ †

    PubMed Central

    Ivanovska, Iva; Jacques, Pierre-Étienne; Rando, Oliver J.; Robert, François; Winston, Fred

    2011-01-01

    Spt6 is a highly conserved factor required for normal transcription and chromatin structure. To gain new insights into the roles of Spt6, we measured nucleosome occupancy along Saccharomyces cerevisiae chromosome III in an spt6 mutant. We found that the level of nucleosomes is greatly reduced across some, but not all, coding regions in an spt6 mutant, with nucleosome loss preferentially occurring over highly transcribed genes. This result provides strong support for recent studies that have suggested that transcription at low levels does not displace nucleosomes, while transcription at high levels does, and adds the idea that Spt6 is required for restoration of nucleosomes at the highly transcribed genes. Unexpectedly, our studies have also suggested that the spt6 effects on nucleosome levels across coding regions do not cause the spt6 effects on mRNA levels, suggesting that the role of Spt6 across coding regions is separate from its role in transcriptional regulation. In the case of the CHA1 gene, regulation by Spt6 likely occurs by controlling the position of the +1 nucleosome. These results, along with previous studies, suggest that Spt6 regulates transcription by controlling chromatin structure over regulatory regions, and its effects on nucleosome levels over coding regions likely serve an independent function. PMID:21098123

  6. Irregular Repeat-Accumulate Codes for Volume Holographic Memory Systems

    NASA Astrophysics Data System (ADS)

    Pishro-Nik, Hossein; Fekri, Faramarz

    2004-09-01

    We investigate the application of irregular repeat-accumulate (IRA) codes in volume holographic memory (VHM) systems. We introduce methodologies to design efficient IRA codes. We show that a judiciously designed IRA code for a typical VHM can be as good as optimized irregular low-density parity-check codes while having the additional advantage of lower encoding complexity. Moreover, we present a method to reduce the error-floor effect of the IRA codes in VHM systems. This method exploits the structure of the noise pattern in holographic memories. Finally, we explain why IRA codes are good candidates for VHM systems.
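
    The repeat-accumulate structure underlying IRA codes can be sketched as follows; this shows a plain systematic RA encoder with equal repetition, whereas the IRA codes designed for the VHM repeat different information bits different numbers of times (the "irregular" part):

```python
def ra_encode(bits, q=3, perm=None):
    """Systematic repeat-accumulate encoding sketch: repeat each
    information bit q times, interleave, then accumulate (a running
    XOR, i.e. the 1/(1+D) convolution).  The transmitted word is
    the information bits followed by the parity stream."""
    repeated = [b for b in bits for _ in range(q)]
    if perm is None:
        perm = list(range(len(repeated)))   # identity interleaver for the sketch
    interleaved = [repeated[p] for p in perm]
    parity, acc = [], 0
    for b in interleaved:
        acc ^= b
        parity.append(acc)
    return bits + parity
```

    Encoding is linear-time in the block length, which is the low-encoding-complexity advantage cited above; in practice perm would be a fixed pseudo-random permutation shared with the decoder.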

  7. A Neural Code That Is Isometric to Vocal Output and Correlates with Its Sensory Consequences

    PubMed Central

    Vyssotski, Alexei L.; Stepien, Anna E.; Keller, Georg B.; Hahnloser, Richard H. R.

    2016-01-01

    What cortical inputs are provided to motor control areas while they drive complex learned behaviors? We study this question in the nucleus interface of the nidopallium (NIf), which is required for normal birdsong production and provides the main source of auditory input to HVC, the driver of adult song. In juvenile and adult zebra finches, we find that spikes in NIf projection neurons precede vocalizations by several tens of milliseconds and are insensitive to distortions of auditory feedback. We identify a local isometry between NIf output and vocalizations: quasi-identical notes produced in different syllables are preceded by highly similar NIf spike patterns. NIf multiunit firing during song precedes responses in auditory cortical neurons by about 50 ms, revealing delayed congruence between NIf spiking and a neural representation of auditory feedback. Our findings suggest that NIf codes for imminent acoustic events within vocal performance. PMID:27723764

  8. A Neural Code That Is Isometric to Vocal Output and Correlates with Its Sensory Consequences.

    PubMed

    Vyssotski, Alexei L; Stepien, Anna E; Keller, Georg B; Hahnloser, Richard H R

    2016-10-01

    What cortical inputs are provided to motor control areas while they drive complex learned behaviors? We study this question in the nucleus interface of the nidopallium (NIf), which is required for normal birdsong production and provides the main source of auditory input to HVC, the driver of adult song. In juvenile and adult zebra finches, we find that spikes in NIf projection neurons precede vocalizations by several tens of milliseconds and are insensitive to distortions of auditory feedback. We identify a local isometry between NIf output and vocalizations: quasi-identical notes produced in different syllables are preceded by highly similar NIf spike patterns. NIf multiunit firing during song precedes responses in auditory cortical neurons by about 50 ms, revealing delayed congruence between NIf spiking and a neural representation of auditory feedback. Our findings suggest that NIf codes for imminent acoustic events within vocal performance.

  9. [The Medical Information Systems Project clinical coding and surgeons: why should surgeons code and how?].

    PubMed

    Bensadoun, H

    2001-02-01

    The clinical coding system recently instituted in France, the PMSI (Projet de Médicalisation du Système d'Information), has become an unavoidable element in funding allocations for short-stay private and public hospitalization centers. Surgeons must take this controversial medicoeconomic instrument into serious consideration. Coding is a tedious, time-consuming task but, like the hospitalization or surgery report, it is an essential part of the discharge procedure. Coding can in the long run be used to establish pricing by pathology. Surgeons should learn the rules and the logic behind this coding system, which, not being based on a medical rationale, may be somewhat difficult to understand. Choosing the right main diagnosis and the comorbidity items is crucial. High-quality, homogeneous coding is essential if one expects the health authorities to make good use of the system. Our medical societies have a role to play in promoting and harmonizing the coding technique.

  10. Classification and Coding, An Introduction and Review of Classification and Coding Systems. Management Guide No. 1.

    ERIC Educational Resources Information Center

    MacConnell, W.

    Nearly all organizations are faced with problems of classifying and coding financial data, management and technical information, components, stores, etc. and need to apply some logical and meaningful system of identification. This report examines the objectives and applications of classification and coding systems and reviews eight systems…

  11. SCALE Code System 6.2.1

    SciTech Connect

    Rearden, Bradley T.; Jessee, Matthew Anderson

    2016-08-01

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  12. Selected Systems Engineering Process Deficiencies and Their Consequences

    NASA Technical Reports Server (NTRS)

    Thomas, Lawrence Dale

    2006-01-01

    The systems engineering process is well established and well understood. While this statement could be argued in light of the many systems engineering guidelines that have been developed, a comparative review of these respective descriptions reveals that they differ primarily in the number of discrete steps or in other nuances, and are at their core essentially common. Likewise, systems engineering textbooks differ primarily in the context for application of systems engineering or in the utilization of evolved tools and techniques, not in the basic method. Thus, failures in systems engineering cannot credibly be attributed to implementation of the wrong systems engineering process among the alternatives. However, numerous system failures can be attributed to deficient implementation of the systems engineering process. What may clearly be perceived as a systems engineering deficiency in retrospect can appear to be a well-considered systems engineering efficiency in real time - an efficiency taken to reduce cost or meet a schedule, or more often both. Typically these efficiencies are grounded on apparently solid rationale, such as reuse of heritage hardware or software. Over time, unintended consequences of a systems engineering process deficiency may begin to be realized, and unfortunately often the consequence is system failure. This paper describes several actual cases of system failures that resulted from deficiencies in systems engineering process implementation, including the Ariane 5 and the Hubble Space Telescope.

  13. Parents' Cultural Belief Systems: Their Origins, Expressions, and Consequences.

    ERIC Educational Resources Information Center

    Harkness, Sara, Ed.; Super, Charles M., Ed.

    This volume presents observations and thinking of scholars from a variety of disciplines about parental cultural belief systems. The chapters are concerned with the sources and consequences of parental ethnotheories in a number of societies. The following chapters are included: (1) "Introduction" (Sara Harkness and Charles M. Super); (2)…

  14. EquiFACS: The Equine Facial Action Coding System

    PubMed Central

    Wathan, Jen; Burrows, Anne M.; Waller, Bridget M.; McComb, Karen

    2015-01-01

    Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology for identifying and coding facial expressions on the basis of the underlying facial musculature and muscle movements. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation. Consequently, FACS can be applied as a tool for a wide range of research questions. We developed FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. The reliability with which others could learn this system (EquiFACS) and consistently code behavioural sequences was high, and this included people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices. PMID:26244573

  15. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition effort. In this paper, we discuss the issues and techniques involved in applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key to inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  16. Communication Systems Simulator with Error Correcting Codes Using MATLAB

    ERIC Educational Resources Information Center

    Gomez, C.; Gonzalez, J. E.; Pardo, J. M.

    2003-01-01

    In this work, the characteristics of a simulator for channel coding techniques used in communication systems are described. This software has been designed for engineering students in order to facilitate the understanding of how error correcting codes work. To help students understand easily the concepts related to these kinds of codes, a…
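
    The kind of demonstration such a simulator provides can be sketched (here in Python rather than MATLAB) with the simplest error correcting code, a rate-1/3 repetition code with majority-vote decoding over a binary symmetric channel:

```python
import random

def simulate_repetition(n_bits=10000, p=0.1, reps=3, seed=1):
    """Monte Carlo sketch: compare the raw bit error rate against the
    majority-vote decoded error rate of a rate-1/reps repetition code
    over a binary symmetric channel with crossover probability p."""
    rng = random.Random(seed)
    raw_errors = decoded_errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        # transmit reps copies, each independently flipped with probability p
        received = [bit ^ (rng.random() < p) for _ in range(reps)]
        raw_errors += received[0] != bit
        decoded_errors += (sum(received) > reps // 2) != bit
    return raw_errors / n_bits, decoded_errors / n_bits
```

    For p = 0.1 the decoded error rate falls from about p to about 3p²(1 - p) + p³ ≈ 0.028, the kind of raw-versus-coded comparison a teaching simulator is meant to make visible.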

  17. Channel coding in the space station data system network

    NASA Technical Reports Server (NTRS)

    Healy, T.

    1982-01-01

    A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly support the requirement for the utilization of coding for the communications channel. The designers of the space station data system have to consider the use of channel coding.

  18. Recent developments in the Los Alamos radiation transport code system

    SciTech Connect

    Forster, R.A.; Parsons, K.

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  19. The Therapy Process Observational Coding System for Child Psychotherapy Strategies Scale

    ERIC Educational Resources Information Center

    McLeod, Bryce D.; Weisz, John R.

    2010-01-01

    Most everyday child and adolescent psychotherapy does not follow manuals that document the procedures. Consequently, usual clinical care has remained poorly understood and rarely studied. The Therapy Process Observational Coding System for Child Psychotherapy-Strategies scale (TPOCS-S) is an observational measure of youth psychotherapy procedures…

  20. RAMONA-4B code for BWR systems analysis

    SciTech Connect

    Cheng, H.S.; Rohatgi, U.S.

    1996-12-31

    The RAMONA-4B code is a coupled thermal-hydraulic, 3D kinetics code for plant transient analyses of a complete Boiling Water Reactor (BWR) system including Reactor Pressure Vessel (RPV), Balance of Plant (BOP) and containment. The complete system representation enables an integrated and coupled systems analysis of a BWR without recourse to prescribed boundary conditions.

  1. Generalized optical code construction for enhanced and Modified Double Weight like codes without mapping for SAC-OCDMA systems

    NASA Astrophysics Data System (ADS)

    Kumawat, Soma; Ravi Kumar, M.

    2016-07-01

    Double Weight (DW) code family is one of the coding schemes proposed for Spectral Amplitude Coding-Optical Code Division Multiple Access (SAC-OCDMA) systems. Modified Double Weight (MDW) code for even weights and Enhanced Double Weight (EDW) code for odd weights are two algorithms extending the use of DW codes in SAC-OCDMA systems. Both use a mapping technique to provide codes for larger numbers of users. A new generalized algorithm to construct EDW- and MDW-like codes without mapping, for any weight greater than 2, is proposed. A single construction algorithm gives the same length increment, Bit Error Rate (BER) calculation, and other properties for all weights greater than 2. The algorithm first constructs a generalized basic matrix, which is then repeated in a different way (distinct from mapping) to produce the codes for all users. The generalized code is analysed for BER using balanced detection and direct detection techniques.
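
    The abstract does not reproduce the construction itself; as a rough illustration of the property such SAC-OCDMA code families are built for, the sketch below (a hypothetical basic matrix and a diagonal-repetition step, not the authors' EDW/MDW algorithm) checks that every pair of unipolar code words keeps its weight and a low in-phase cross-correlation, which is what bounds multiple access interference:

    ```python
    # Hypothetical unipolar spectral-amplitude codes (weight 3, 4 users).
    # This is NOT the paper's EDW/MDW construction, only an illustration
    # of the low cross-correlation property such code families target.
    basic = [
        [1, 1, 0, 1, 0, 0],   # basic matrix for 2 users, weight 3
        [0, 0, 1, 0, 1, 1],
    ]

    def extend(basic, copies):
        """Diagonally repeat the basic matrix to support more users
        (stands in for the mapping / repetition step)."""
        width = len(basic[0])
        codes = []
        for c in range(copies):
            for row in basic:
                codes.append([0] * width * c + row + [0] * width * (copies - 1 - c))
        return codes

    def cross_corr(a, b):
        return sum(x * y for x, y in zip(a, b))

    codes = extend(basic, 2)          # 4 users, code length 12
    weights = [sum(c) for c in codes]
    max_cc = max(cross_corr(codes[i], codes[j])
                 for i in range(len(codes)) for j in range(i + 1, len(codes)))
    ```

    With this toy basic matrix the maximum cross-correlation is zero; DW-family codes in the literature allow cross-correlation one, which the same check would verify.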

  2. Modern Nuclear Data Evaluation with the TALYS Code System

    NASA Astrophysics Data System (ADS)

    Koning, A. J.; Rochman, D.

    2012-12-01

    This paper presents a general overview of nuclear data evaluation and its applications as developed at NRG, Petten. Based on concepts such as robustness, reproducibility and automation, modern calculation tools are exploited to produce original nuclear data libraries that meet the current demands on quality and completeness. This requires a system which comprises differential measurements, theory development, nuclear model codes, resonance analysis, evaluation, ENDF formatting, data processing and integral validation in one integrated approach. Software, built around the TALYS code, will be presented in which all these essential nuclear data components are seamlessly integrated. Besides the quality of the basic data and its extensive format testing, a second goal lies in the diversity of processing for different types of users. The implications of this scheme are unprecedented. The most important are: 1. Complete ENDF-6 nuclear data files, in the form of the TENDL library, including covariance matrices, for many isotopes, particles, energies, reaction channels and derived quantities. All isotopic data files are mutually consistent and are intended to rival those of the major world libraries. 2. More exact uncertainty propagation from basic nuclear physics to applied (reactor) calculations based on a Monte Carlo approach: "Total" Monte Carlo (TMC), using random nuclear data libraries. 3. Automatic optimization in the form of systematic feedback from integral measurements back to the basic data. This method of work also opens a new way of approaching the analysis of nuclear applications, with consequences in both applied nuclear physics and the safety of nuclear installations, and several examples are given here. This applied experience and feedback is integrated in a final step to improve the quality of the nuclear data, to change the users' vision and, finally, to orchestrate their integration into simulation codes.
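
    The "Total" Monte Carlo idea can be sketched in a few lines: draw many random variants of the basic nuclear data, run the applied calculation once per variant, and read the output spread as the propagated uncertainty. The two-parameter model and all means and uncertainties below are invented for illustration; they are not TALYS or TENDL quantities:

    ```python
    import random

    # Toy "Total Monte Carlo": each loop iteration plays the role of one
    # random nuclear data library. Model and numbers are illustrative.
    random.seed(1)

    def toy_keff(sigma_f, sigma_c):
        # toy multiplication factor: neutron production over absorption
        return 2.4 * sigma_f / (sigma_f + sigma_c)

    samples = []
    for _ in range(1000):
        sigma_f = random.gauss(1.20, 0.02)   # one "random library" draw
        sigma_c = random.gauss(0.60, 0.03)
        samples.append(toy_keff(sigma_f, sigma_c))

    mean = sum(samples) / len(samples)
    std = (sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)) ** 0.5
    ```

    The standard deviation of the outputs is the TMC uncertainty estimate; no covariance processing or perturbation theory is needed.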

  3. The radiological assessment system for consequence analysis - RASCAL

    SciTech Connect

    Sjoreen, A.L.; Ramsdell, J.V.; Athey, G.F.

    1996-04-01

    The Radiological Assessment System for Consequence Analysis, Version 2.1 (RASCAL 2.1) has been developed for use during a response to radiological emergencies. The model estimates doses for comparison with U.S. Environmental Protection Agency (EPA) Protective Action Guides (PAGs) and thresholds for acute health effects. RASCAL was designed to be used by U.S. Nuclear Regulatory Commission (NRC) personnel who report to the site of a nuclear accident to conduct an independent evaluation of dose and consequence projections and personnel who conduct training and drills on emergency responses. It allows consideration of the dominant aspects of the source term, transport, dose, and consequences. RASCAL consists of three computational tools: ST-DOSE, FM-DOSE, and DECAY. ST-DOSE computes source term, atmospheric transport, and dose to man from accidental airborne releases of radionuclides. The source-term calculations are appropriate for accidents at U.S. power reactors. FM-DOSE computes doses from environmental concentrations of radionuclides in the air and on the ground. DECAY computes radiological decay and daughter in-growth. RASCAL 2.1 is a DOS application that can be run under Windows 3.1 and 95. RASCAL has been the starting point for other accident consequence models, notably INTERRAS, an international version of RASCAL, and HASCAL, an expansion of RASCAL that will model radiological, biological, and chemical accidents.
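
    For a two-member chain, the decay and daughter in-growth computed by a tool like DECAY reduces to the closed-form Bateman solution; the sketch below implements that formula for an invented parent/daughter pair (the half-lives are illustrative, not RASCAL data):

    ```python
    import math

    def bateman_pair(n1_0, lam1, lam2, t):
        """Atoms of parent (N1) and in-grown daughter (N2) at time t for
        the chain N1 -> N2 -> ..., starting from pure parent."""
        n1 = n1_0 * math.exp(-lam1 * t)
        n2 = (lam1 * n1_0 / (lam2 - lam1)) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
        return n1, n2

    # illustrative half-lives: parent 8 d, daughter 2 d
    lam1 = math.log(2) / 8.0
    lam2 = math.log(2) / 2.0
    n1, n2 = bateman_pair(1.0e6, lam1, lam2, 8.0)   # after one parent half-life
    ```

    A full decay code generalizes this to arbitrary chain lengths and branching, but the per-pair physics is exactly this expression.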

  4. FORTRAN Automated Code Evaluation System (FACES) user's manual, version 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system which provides analysis services for FORTRAN based software systems not normally available from system software is presented. The system is not a compiler, and compiler syntax diagnostics are not duplicated. For maximum adaptation to FORTRAN dialects, the code presented to the system is assumed to be compiler acceptable. The system concentrates on acceptable FORTRAN code features which are likely to produce undesirable results and identifies potential trouble areas before they become execution time malfunctions.
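
    As a flavor of the kind of "compiler-acceptable but risky" pattern such an analyzer flags, the toy scanner below warns about .EQ./.NE. comparisons against floating-point constants in FORTRAN source. This is not FACES itself; the rule is invented for illustration:

    ```python
    import re

    # Toy static check in the spirit of FACES (not the FACES rule set):
    # flag .EQ./.NE. comparisons against a REAL constant, a construct
    # that compiles cleanly but often misbehaves at run time.
    RISKY = re.compile(r"\.(EQ|NE)\.\s*\d+\.\d*", re.IGNORECASE)

    def scan(source):
        findings = []
        for lineno, line in enumerate(source.splitlines(), start=1):
            if RISKY.search(line):
                findings.append((lineno, line.strip()))
        return findings

    fortran = """\
          X = 0.1
          DO 10 I = 1, 10
          IF (X .EQ. 1.0) GO TO 20
    10    X = X + 0.1
    20    CONTINUE
    """
    findings = scan(fortran)
    ```

    The scanner reports the line number and text of each suspect statement, leaving syntax checking to the compiler, exactly the division of labor the abstract describes.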

  5. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
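
    The acquisition-time and reliability trades above all rest on the correlation properties of the PN sequences themselves. The sketch below generates a short maximal-length sequence from a linear-feedback shift register (the tap choice is illustrative and the sequence far shorter than any actual TDRSS code) and checks the two-valued autocorrelation that makes code acquisition work:

    ```python
    # Maximal-length PN sequence from a 5-stage Fibonacci LFSR with
    # feedback taps at stages 5 and 3 (a primitive polynomial).
    def msequence(nstages, taps):
        state = [1] * nstages          # any nonzero seed works
        seq = []
        for _ in range(2 ** nstages - 1):
            seq.append(state[-1])      # output the last stage
            fb = 0
            for t in taps:
                fb ^= state[t - 1]     # XOR the tapped stages
            state = [fb] + state[:-1]  # shift, feed back at the front
        return seq

    seq = msequence(5, (5, 3))
    chips = [1 if b else -1 for b in seq]        # map bits to +/-1 chips
    period = len(chips)

    def autocorr(shift):
        return sum(chips[i] * chips[(i + shift) % period] for i in range(period))

    peak = autocorr(0)
    sidelobes = {autocorr(s) for s in range(1, period)}
    ```

    The sharp peak-to-sidelobe ratio (here 31 versus -1) is what lets a receiver detect code alignment during acquisition.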

  6. RELAP5/MOD3 code manual: Code structure, system models, and solution methods. Volume 1

    SciTech Connect

    1995-08-01

    The RELAP5 code has been developed for best-estimate transient simulation of light water reactor coolant systems during postulated accidents. The code models the coupled behavior of the reactor coolant system and the core for loss-of-coolant accidents and operational transients such as anticipated transient without scram, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits simulating a variety of thermal-hydraulic systems. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater systems. RELAP5/MOD3 code documentation is divided into seven volumes: Volume I provides modeling theory and associated numerical schemes.

  7. Identification coding schemes for modulated reflectance systems

    DOEpatents

    Coates, Don M.; Briles, Scott D.; Neagley, Daniel L.; Platts, David; Clark, David D.

    2006-08-22

    An identifying coding apparatus employing modulated reflectance technology, involving a base station emitting an RF signal and a tag, located remotely from the base station and containing at least one antenna and predetermined other passive circuit components, that receives the RF signal and reflects back to the base station a modulated signal indicative of characteristics related to the tag.

  8. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code, together with effective correlation properties between intended and interfering subscribers, significantly elevates the performance of SAC-OCDMA systems by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of a SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to bit error rate (BER), signal to noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and downlink transmission.

  9. Interface requirements for coupling a containment code to reactor system thermal-hydraulic codes

    SciTech Connect

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but the containment response must also be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing are determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.
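
    The time-step-level data exchange described above can be sketched abstractly: two toy models stand in for the system and containment codes, and the "linkage" passes the break flow and back-pressure between them every step instead of running either code to completion first. All physics and numbers are invented:

    ```python
    # Toy explicit coupling of a primary-system model and a containment
    # model (stand-ins for, e.g., RELAP and CONTAIN), exchanging boundary
    # conditions every time step. Models and constants are invented.
    def primary_step(p_prim, p_cont, dt):
        flow = 0.05 * max(p_prim - p_cont, 0.0)   # break flow ~ pressure difference
        return p_prim - flow * dt, flow

    def containment_step(p_cont, flow, dt):
        return p_cont + 0.5 * flow * dt           # containment pressurizes

    p_prim, p_cont, dt = 150.0, 1.0, 0.1          # bar, illustrative
    for _ in range(1000):
        p_prim, flow = primary_step(p_prim, p_cont, dt)   # "system code" step
        p_cont = containment_step(p_cont, flow, dt)       # "containment code" step
    ```

    Because the containment pressure feeds back into the break flow each step, the two models approach a common equilibrium, exactly the feedback the fixed-boundary-condition approach misses.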

  10. System Measures Errors Between Time-Code Signals

    NASA Technical Reports Server (NTRS)

    Cree, David; Venkatesh, C. N.

    1993-01-01

    System measures timing errors between signals produced by three asynchronous time-code generators. Errors between 1-second clock pulses resolved to 2 microseconds. Basic principle of computation of timing errors as follows: central processing unit in microcontroller constantly monitors time data received from time-code generators for changes in 1-second time-code intervals. In response to any such change, microprocessor buffers count of 16-bit internal timer.

  11. System Data Model (SDM) Source Code

    DTIC Science & Technology

    2012-08-23

    (Indexed source-code excerpt from the record; ellipses mark text elided by the indexer.)

        ecode   pointer to current position in compiled code
        mstart  pointer to the current match start position (can be ... repeated call or recursion limit)

        static int
        match(REGISTER USPTR eptr, REGISTER const uschar *ecode, const uschar *mstart, ...

        frame->Xeptr = eptr;
        frame->Xecode = ecode;
        frame->Xmstart = mstart;
        frame->Xoffset_top = offset_top;

  12. Climate-induced tree mortality: Earth system consequences

    USGS Publications Warehouse

    Adams, Henry D.; Macalady, Alison K.; Breshears, David D.; Allen, Craig D.; Stephenson, Nathan L.; Saleska, Scott; Huxman, Travis E.; McDowell, Nathan G.

    2010-01-01

    One of the greatest uncertainties in global environmental change is predicting changes in feedbacks between the biosphere and the Earth system. Terrestrial ecosystems and, in particular, forests exert strong controls on the global carbon cycle and influence regional hydrology and climatology directly through water and surface energy budgets [Bonan, 2008; Chapin et al., 2008]. According to new research, tree mortality associated with elevated temperatures and drought has the potential to rapidly alter forest ecosystems, potentially affecting feedbacks to the Earth system [Allen et al., 2010]. Several lines of recent research demonstrate how tree mortality rates in forests may be sensitive to climate change—particularly warming and drying. This emerging consequence of global change has important effects on Earth system processes (Figure 1).

  13. Wind turbine control systems: Dynamic model development using system identification and the fast structural dynamics code

    SciTech Connect

    Stuart, J.G.; Wright, A.D.; Butterfield, C.P.

    1996-10-01

    Mitigating the effects of damaging wind turbine loads and responses extends the lifetime of the turbine and, consequently, reduces the associated Cost of Energy (COE). Active control of aerodynamic devices is one option for achieving wind turbine load mitigation. Generally speaking, control system design and analysis requires a reasonable dynamic model of the "plant" (i.e., the system being controlled). This paper extends the wind turbine aileron control research, previously conducted at the National Wind Technology Center (NWTC), by presenting a more detailed development of the wind turbine dynamic model. In prior research, active aileron control designs were implemented in an existing wind turbine structural dynamics code, FAST (Fatigue, Aerodynamics, Structures, and Turbulence). In this paper, the FAST code is used, in conjunction with system identification, to generate a wind turbine dynamic model for use in active aileron control system design. The FAST code is described and an overview of the system identification technique is presented. An aileron control case study is used to demonstrate this modeling technique. The results of the case study are then used to propose ideas for generalizing this technique for creating dynamic models for other wind turbine control applications.
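
    The "simulate, then identify a low-order model for control design" workflow can be miniaturized: excite a known discrete-time model (an invented first-order system standing in for a FAST simulation), then recover its parameters from the input/output record by least squares:

    ```python
    import random

    # Toy system identification: "simulate" a plant (an invented
    # first-order model, not FAST), then fit y[k+1] = a*y[k] + b*u[k]
    # to the record by solving the 2x2 normal equations.
    random.seed(0)
    a_true, b_true = 0.9, 0.5
    u = [random.uniform(-1.0, 1.0) for _ in range(200)]   # excitation input
    y = [0.0]
    for k in range(199):
        y.append(a_true * y[k] + b_true * u[k])           # simulated response

    syy = sum(y[k] * y[k] for k in range(199))
    syu = sum(y[k] * u[k] for k in range(199))
    suu = sum(u[k] * u[k] for k in range(199))
    s1y = sum(y[k + 1] * y[k] for k in range(199))
    s1u = sum(y[k + 1] * u[k] for k in range(199))
    det = syy * suu - syu * syu
    a_est = (s1y * suu - s1u * syu) / det   # Cramer's rule
    b_est = (s1u * syy - s1y * syu) / det
    ```

    With noise-free data the fit recovers the true parameters exactly; identification from a structural dynamics code like FAST uses the same idea with higher-order models and richer excitation.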

  14. Code system to compute radiation dose in human phantoms

    SciTech Connect

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.
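
    "Monte Carlo integration of a point kernel" can be sketched for the simplest case: an attenuated point kernel integrated over a disk source by sampling points uniformly on the disk. The geometry, attenuation coefficient, and kernel normalization below are illustrative only (no build-up factor, unit source strength):

    ```python
    import math, random

    # Monte Carlo integration of an attenuated point kernel over a disk
    # source. All geometry and coefficients are illustrative.
    random.seed(2)
    mu = 0.1      # linear attenuation coefficient, 1/cm
    h = 100.0     # detector height above disk center, cm
    R = 50.0      # disk radius, cm

    def kernel(r):
        return math.exp(-mu * r) / (4.0 * math.pi * r * r)

    N = 20000
    acc = 0.0
    for _ in range(N):
        rho = R * math.sqrt(random.random())   # uniform sampling over the disk area
        acc += kernel(math.hypot(rho, h))
    flux = math.pi * R * R * acc / N           # disk area times mean kernel value
    ```

    The phantom codes extend the same estimator to three-dimensional organs with tissue-dependent attenuation.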

  15. 14 CFR Sec. 1-4 - System of accounts coding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... General Accounting Provisions Sec. 1-4 System of accounts coding. (a) A four digit control number is... digit code assigned to each profit and loss account denote a detailed area of financial activity or... sequentially within blocks, designating more general classifications of financial activity and...

  16. The Modified Cognitive Constructions Coding System: Reliability and Validity Assessments

    ERIC Educational Resources Information Center

    Moran, Galia S.; Diamond, Gary M.

    2006-01-01

    The cognitive constructions coding system (CCCS) was designed for coding client's expressed problem constructions on four dimensions: intrapersonal-interpersonal, internal-external, responsible-not responsible, and linear-circular. This study introduces, and examines the reliability and validity of, a modified version of the CCCS--a version that…

  17. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L. N.

    1976-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
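
    The division of labor in a concatenated system can be shown with a deliberately simple stand-in: a (3,1) repetition code plays the inner convolutional code (the real decoder is a trellis algorithm, not majority voting), and the measurement shows the inner stage's job of lowering the symbol error rate handed to the outer Reed-Solomon decoder, which is omitted here:

    ```python
    import random

    # Toy concatenation demo: a (3,1) repetition code stands in for the
    # inner convolutional code. It reduces the channel error rate p to
    # roughly 3p^2, easing the outer decoder's task (outer code omitted).
    random.seed(3)
    p = 0.1          # binary symmetric channel error probability
    N = 30000
    errors_raw = 0   # channel errors with no inner code
    errors_inner = 0 # residual errors after inner decoding

    for _ in range(N):
        bit = random.random() < 0.5
        received = [bit ^ (random.random() < p) for _ in range(3)]
        decoded = sum(received) >= 2          # majority-vote decoding
        errors_raw += received[0] != bit
        errors_inner += decoded != bit
    ```

    In the paper's system the outer RS code then corrects the residual byte errors, and decoder feedback further improves the inner stage.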

  18. Binary random systematic erasure code for RAID system

    NASA Astrophysics Data System (ADS)

    Teng, Pengguo; Wang, Xiaojing; Chen, Liang; Yuan, Dezhai

    2017-03-01

    With the increasing expansion of data scale, storage systems grow in size and complexity, and system scalability together with methodologies to recover simultaneous disk and sector failures become inevitable requirements. To ensure high reliability and flexible scalability, erasure codes with high fault tolerance and flexibility are required. In this paper, we present a class of erasure codes satisfying these requirements, referred to as the Binary Random Systematic erasure code, or BRS code for short. The BRS code constructs its generator matrix from a random matrix whose elements are in the Galois field GF(2), and takes advantage of exclusive-or (XOR) operations to make it fast. It is designed as a systematic code to facilitate storage and recovery. Moreover, δ random redundancies make the probability of successful decoding controllable. Our evaluations and experiments show that the BRS code is flexible in its parameters and fault-tolerance settings and has high computing efficiency in encoding and decoding speed; what is more, when the code length is long enough, the BRS code is approximately MDS, giving it nearly optimal storage efficiency.
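
    The encode/decode cycle the abstract describes (systematic layout, XOR-only parities from a binary random generator matrix, decoding by elimination over GF(2)) can be sketched as follows. For reproducibility the parity masks are fixed here rather than drawn at random, and everything else is a simplification of the paper's construction:

    ```python
    # Sketch of a BRS-style systematic XOR erasure code over GF(2). A
    # real BRS code draws the parity masks at random; they are fixed
    # here so the example is deterministic.
    k = 6
    data = [0x3C, 0xA5, 0x0F, 0x77, 0x12, 0xD8]        # data symbols (bytes)
    masks = [0b100010, 0b011010, 0b101101, 0b010111]    # rows of the "random" generator part

    def xor_subset(symbols, mask):
        acc = 0
        for i in range(len(symbols)):
            if mask >> i & 1:
                acc ^= symbols[i]
        return acc

    parity = [xor_subset(data, m) for m in masks]       # systematic encode

    # erase two data symbols, then form the GF(2) equations they satisfy
    lost = sorted({1, 4})
    pos = {sym: j for j, sym in enumerate(lost)}
    rows = []
    for mask, p in zip(masks, parity):
        coeff, rhs = 0, p
        for i in range(k):
            if mask >> i & 1:
                if i in pos:
                    coeff |= 1 << pos[i]                # unknown symbol
                else:
                    rhs ^= data[i]                      # known symbol moves to the RHS
        rows.append([coeff, rhs])

    # Gauss-Jordan elimination over GF(2); XOR is both add and subtract
    r, pivots = 0, []
    for col in range(len(lost)):
        piv = next((i for i in range(r, len(rows)) if rows[i][0] >> col & 1), None)
        if piv is None:
            continue        # random masks fail only with small probability
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][0] >> col & 1:
                rows[i][0] ^= rows[r][0]
                rows[i][1] ^= rows[r][1]
        pivots.append(col)
        r += 1

    recovered = {lost[col]: rows[j][1] for j, col in enumerate(pivots)}
    ```

    Because every operation is an XOR, both encoding and decoding run at memory speed, which is the performance argument the abstract makes.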

  19. A highly specific coding system for structural chromosomal alterations.

    PubMed

    Martínez-Frías, M L; Martínez-Fernández, M L

    2013-04-01

    The Spanish Collaborative Study of Congenital Malformations (ECEMC, from the name in Spanish) has developed a very simple and highly specific coding system for structural chromosomal alterations. Such a coding system would be of value at present due to the dramatic increase in the diagnosis of submicroscopic chromosomal deletions and duplications through molecular techniques. In summary, our new coding system allows the characterization of: (a) the type of structural anomaly; (b) the chromosome affected; (c) if the alteration affects the short or/and the long arm, and (d) if it is a non-pure dicentric, a non-pure isochromosome, or if it affects several chromosomes. We show the distribution of 276 newborn patients with these types of chromosomal alterations using their corresponding codes according to our system. We consider that our approach may be useful not only for other registries, but also for laboratories performing these studies to store their results on case series. Therefore, the aim of this article is to describe this coding system and to offer the opportunity for this coding to be applied by others. Moreover, as this is a SYSTEM, rather than a fixed code, it can be implemented with the necessary modifications to include the specific objectives of each program.

  20. Code-modulated interferometric imaging system using phased arrays

    NASA Astrophysics Data System (ADS)

    Chauhan, Vikas; Greene, Kevin; Floyd, Brian

    2016-05-01

    Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promises to be extremely low cost. In this work, we present techniques which allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power-combine incoming signals prior to digitization, orthogonal code-modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
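
    The code-multiplexing trick above (modulate each front-end with an orthogonal code, square the combined output, demodulate correlations with product codes) can be reproduced with length-4 Walsh codes; real antenna voltages are replaced by scalar constants here:

    ```python
    # Walsh codes from the Sylvester-Hadamard construction. Key property
    # used by the imager: the element-wise product of two Walsh codes is
    # another Walsh code, so the squared (power-detected) output can be
    # code-demultiplexed to recover the cross term (the visibility).
    def hadamard(n):
        H = [[1]]
        while len(H) < n:
            H = [row + row for row in H] + [row + [-x for x in row] for row in H]
        return H

    H = hadamard(4)
    w1, w2 = H[1], H[2]
    prod = [c1 * c2 for c1, c2 in zip(w1, w2)]       # equals H[3]

    a, b = 3.0, 5.0                                   # stand-ins for two antenna signals
    combined = [a * c1 + b * c2 for c1, c2 in zip(w1, w2)]   # code-modulated sum
    power = [s * s for s in combined]                 # square-law detection
    visibility = sum(p * c for p, c in zip(power, prod)) / len(prod)
    ```

    Squaring produces a^2 + b^2 + 2ab*w1*w2 per chip; correlating against the product code isolates the 2ab cross term while the self terms average out, which is exactly the "modulate signals, demodulate correlations" behavior.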

  1. ARAC: A flexible real-time dose consequence assessment system

    SciTech Connect

    Ellis, J.S.; Sullivan, T.J.

    1993-10-07

    Since its beginning, the Atmospheric Release Advisory Capability (ARAC), an emergency radiological dose assessment service of the US Government, has been called on to do consequence assessments for releases into the atmosphere of radionuclides and a variety of other substances. Some of the more noteworthy emergency responses have been for the Three Mile Island and Chernobyl nuclear power reactor accidents, and more recently, for a cloud of gases from a rail-car spill of the herbicide metam sodium into the Sacramento River, smoke from hundreds of burning oil wells in Kuwait, and ash clouds from the eruption of Mt. Pinatubo. The spatial scales of these responses range from local, to regional, to global, and the response periods from hours, to weeks, to months. Because of the variety of requirements of each unique assessment, ARAC has developed and maintains a flexible system of people, computer software and hardware.

  2. Equipment Readiness Code Rule System (ERCRULES)

    DTIC Science & Technology

    1988-06-01

    …Teknowledge by the US Army. It is used as the instructional vehicle for training in expert systems by the Army Artificial Intelligence Training Cell (AITC), Fort Gordon, GA. As part of the rule development process (see Chapter 4), it was determined that the AITC would provide training to CAA and TRADOC… a base of technical support at the AITC for the operation of the system into the future. 3-4. DESCRIPTION OF SELECTED TOOL. The expert system tool…

  3. Convolutionally-Coded Unbalanced QPSK Systems

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Yuen, J. H.

    1985-01-01

    Report discusses error-rate performance for three convolutionally-coded unbalanced quadriphase-shift-keying (UQPSK) communication systems with noisy carriers that introduce crosstalk. The systems analyzed are unbalanced in the sense that each transmits two data streams with different bit rates and (in some cases) different powers.

  4. Code System to Model Aqueous Geochemical Equilibria.

    SciTech Connect

    PETERSON, S. R.

    2001-08-23

    Version: 00 MINTEQ is a geochemical program to model aqueous solutions and the interactions of aqueous solutions with hypothesized assemblages of solid phases. It was developed for the Environmental Protection Agency to perform the calculations necessary to simulate the contact of waste solutions with heterogeneous sediments or the interaction of ground water with solidified wastes. MINTEQ can calculate ion speciation/solubility, adsorption, oxidation-reduction, gas phase equilibria, and precipitation/dissolution of solid phases. MINTEQ can accept a finite mass for any solid considered for dissolution and will dissolve the specified solid phase only until its initial mass is exhausted. This ability enables MINTEQ to model flow-through systems. In these systems the masses of solid phases that precipitate at earlier pore volumes can be dissolved at later pore volumes according to thermodynamic constraints imposed by the solution composition and solid phases present. The ability to model these systems permits evaluation of the geochemistry of dissolved trace metals, such as low-level waste in shallow land burial sites. MINTEQ was designed to solve geochemical equilibria for systems composed of one kilogram of water, various amounts of material dissolved in solution, and any solid materials that are present. Systems modeled using MINTEQ can exchange energy and material (open systems) or just energy (closed systems) with the surrounding environment. Each system is composed of a number of phases. Every phase is a region with distinct composition and physically definable boundaries. All of the material in the aqueous solution forms one phase. The gas phase is composed of any gaseous material present, and each compositionally and structurally distinct solid forms a separate phase.
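
    At its core, an equilibrium code of this kind solves mass-action and balance equations simultaneously. The miniature example below does that for a single weak acid in water by bisecting on the charge balance; the constants are textbook values, and a code like MINTEQ handles far larger component sets, activity corrections, adsorption, and solids:

    ```python
    import math

    # Miniature speciation solve in the spirit of MINTEQ: 0.1 M acetic
    # acid in water. [H+] is found by bisection on the charge balance
    # [H+] = [OH-] + [A-]; activity corrections are ignored.
    KA = 1.8e-5    # acetic acid dissociation constant
    KW = 1.0e-14   # water ion product
    C = 0.10       # total acetate, mol/L

    def charge_imbalance(h):
        oh = KW / h
        a_minus = KA * C / (KA + h)     # mass action + mass balance
        return h - oh - a_minus

    lo, hi = 1e-14, 1.0
    for _ in range(200):
        mid = math.sqrt(lo * hi)        # bisect in log space
        if charge_imbalance(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    h = math.sqrt(lo * hi)
    pH = -math.log10(h)
    ```

    The same structure (mass-action expressions plus balance constraints, solved iteratively) scales to the multicomponent Newton solves inside a full geochemical code.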

  5. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. SAP scans FORTRAN source code and produces reports presenting statistics and measures of the statements and structures that make up a module. The processing performed by SAP, and the routines, COMMON blocks, and files used by SAP, are described. The system generation procedure for SAP is also presented.

  6. Analytical considerations in the code qualification of piping systems

    SciTech Connect

    Antaki, G.A.

    1995-02-01

    The paper addresses several analytical topics in the design and qualification of piping systems which have a direct bearing on the prediction of stresses in the pipe and hence on the application of the equations of NB, NC and ND-3600 of the ASME Boiler and Pressure Vessel Code. For each of the analytical topics, the paper summarizes the current code requirements, if any, and the industry practice.

  7. Bilingual Processing of ASL-English Code-Blends: The Consequences of Accessing Two Lexical Representations Simultaneously

    ERIC Educational Resources Information Center

    Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…

  8. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    SciTech Connect

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER and LWR reactors is presented. After describing the basic features of the 3D neutronics codes BIPR-8 from the Kurchatov Institute, DYN3D from Research Center Rossendorf, and QUABOX/CUBBOX from GRS, first applications of the coupled codes to different transient and accident scenarios are presented. The need for further investigation is discussed.

  9. Design of wavefront coding optical system with annular aperture

    NASA Astrophysics Data System (ADS)

    Chen, Xinhua; Zhou, Jiankang; Shen, Weimin

    2016-10-01

    Wavefront coding can extend the depth of field of a traditional optical system by inserting a phase mask into the pupil plane. In this paper, the point spread function (PSF) of a wavefront coding system with an annular aperture is analyzed. The stationary phase method and the fast Fourier transform (FFT) method are used to compute the diffraction integral, respectively. The OTF invariance is analyzed for the annular aperture with a cubic phase mask under different obscuration ratios. With these analysis results, a wavefront coding system using a Maksutov-Cassegrain configuration is designed. It is an F/8.21 catadioptric system with an annular aperture, and its focal length is 821 mm. The strength of the cubic phase mask is optimized with a user-defined operand in Zemax. The Wiener filtering algorithm is used to restore the images, and numerical simulation proves the validity of the design.
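
    The defocus-insensitivity that a cubic phase mask buys can be checked numerically in one dimension (the paper's annular-pupil, stationary-phase analysis is more involved): sample a pupil with cubic phase alpha*x^3 plus defocus psi*x^2, Fourier-transform it to get the PSF, and compare how much the PSF changes with defocus with and without the mask. The mask strength and defocus values below are arbitrary:

    ```python
    import cmath, math

    # 1-D wavefront-coding sketch: PSF = |DFT of pupil|^2 for a clear
    # 1-D pupil (not the paper's annular aperture) with cubic phase
    # alpha*x^3 and defocus psi*x^2, both in radians, both arbitrary.
    N, PAD = 64, 256

    def psf(alpha, psi):
        pupil = [0j] * PAD
        for n in range(N):
            x = -1.0 + 2.0 * n / (N - 1)
            pupil[n] = cmath.exp(1j * (alpha * x ** 3 + psi * x * x))
        out = []
        for k in range(PAD):
            s = sum(pupil[n] * cmath.exp(-2j * math.pi * k * n / PAD) for n in range(N))
            out.append(abs(s) ** 2)
        tot = sum(out)
        return [v / tot for v in out]

    def defocus_change(alpha, psi=3.0):
        focused, defocused = psf(alpha, 0.0), psf(alpha, psi)
        return sum(abs(p - q) for p, q in zip(focused, defocused))

    coded = defocus_change(30.0)    # with cubic mask
    plain = defocus_change(0.0)     # without
    ```

    The coded PSF is broad but nearly defocus-invariant, which is what makes single-filter Wiener restoration workable across the extended depth of field.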

  10. Theory and Implementation of Nuclear Safety System Codes - Part II: System Code Closure Relations, Validation, and Limitations

    SciTech Connect

    Glenn A Roth; Fatih Aydogan

    2014-09-01

    This is Part II of two articles describing the details of thermal-hydraulic system codes. In this second part of the article series, the system code closure relationships (used to model thermal and mechanical non-equilibrium and the coupling of the phases) for the governing equations are discussed and evaluated. These include several thermal and hydraulic models, such as heat transfer coefficients for various flow regimes, two-phase pressure correlations, two-phase friction correlations, drag coefficients, and interfacial models between the fields. These models are often developed from experimental data, so the experiment conditions should be understood in order to evaluate the efficacy of the closure models. Code verification and validation, including Separate Effects Tests (SETs) and Integral Effects Tests (IETs), is also assessed. The assessments show that the test cases cover a significant section of the system code capabilities, but some of the more advanced reactor designs will push the limits of validation for the codes. Lastly, the limitations of the codes are discussed by considering next-generation power plants, analyzing not only existing nuclear power plants but also next-generation nuclear power plants. The nuclear industry is developing new, innovative reactor designs, such as Small Modular Reactors (SMRs), High-Temperature Gas-cooled Reactors (HTGRs) and others. Sub-types of these reactor designs utilize pebbles, prismatic graphite moderators, helical steam generators, innovative fuel types, and many other design features that may not be fully analyzed by current system codes. This second part completes the series on the comparison and evaluation of the selected reactor system codes by discussing the closure relations, validation and limitations. These two articles indicate areas where the models can be improved to adequately address issues with new reactor design and development.
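
    A concrete example of such a closure relationship is the Dittus-Boelter correlation for single-phase turbulent heat transfer, Nu = 0.023 Re^0.8 Pr^0.4, which was developed from experimental data and is valid only inside its supporting database (roughly Re > 10^4, 0.6 < Pr < 160, fully developed flow in smooth tubes); the guard clauses below reflect that range, and the conditions are illustrative:

    ```python
    # Dittus-Boelter closure for single-phase turbulent heat transfer
    # (heating form, exponent 0.4 on Pr). The range checks reflect the
    # approximate limits of the supporting experimental database.
    def dittus_boelter(re, pr, k_fluid, d_hydraulic):
        """Return the heat transfer coefficient h in W/m^2-K."""
        if re < 1.0e4 or not (0.6 <= pr <= 160.0):
            raise ValueError("outside the correlation's validated range")
        nu = 0.023 * re ** 0.8 * pr ** 0.4
        return nu * k_fluid / d_hydraulic

    # water-like conditions, illustrative numbers
    h = dittus_boelter(re=1.0e5, pr=1.0, k_fluid=0.6, d_hydraulic=0.02)
    ```

    System codes bundle many such correlations, one per flow regime, which is exactly why validation against SET and IET data matters: outside the database, the closure, not the governing equations, is the weak link.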

  11. Inter-Rater Reliability of an Electronic Discussion Coding System.

    ERIC Educational Resources Information Center

    MacKinnon, Gregory R.

    A "cognote" system was developed for coding electronic discussion groups and promoting critical thinking. Previous literature has provided an account of the strategy as applied to several academic settings. This paper addresses the research around establishing the inter-rater reliability of the cognote system. The findings suggest three indicators…

  12. User's guide for the GSMP/OCMHD system code

    SciTech Connect

    Dennis, C. B.; Berry, G. F.

    1980-12-01

    The Systems Analysis group of the ANL Engineering Division conducts overall system studies for various power plant concepts, utilizing a computer simulation code. Analytical investigations explore a range of possible performance variables in order to determine the sensitivity of a specific plant design to variation in key system parameters and, ultimately, to establish probable system performance limits. To accomplish this task, a Generalized System Modeling Program (GSMP) has been developed that will analyze and simulate the particular system of interest for any number of different configurations, automatically holding constraints while conducting either sensitivity studies or optimizations. One system investigated during development of the ANL/GSMP code is an open-cycle magnetohydrodynamic (OCMHD) power plant. By linking mathematical models representing these OCMHD power plant components to the executive-level GSMP driver, the resulting system code, GSMP/OCMHD, can be used to simulate any OCMHD power plant configuration. This report, a user's guide for GSMP/OCMHD, describes the process for setting up an OCMHD configuration, preparing the input defining that configuration, running the computer code, and interpreting the results generated.

  13. Physical-layer network coding in coherent optical OFDM systems.

    PubMed

    Guan, Xun; Chan, Chun-Kit

    2015-04-20

    We present the first experimental demonstration and characterization of the application of optical physical-layer network coding in coherent optical OFDM systems. It combines two optical OFDM frames to share the same link so as to enhance system throughput, while individual OFDM frames can be recovered with digital signal processing at the destined node.
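    The core physical-layer network coding idea — the relay mapping the superposition of two transmissions directly to the XOR of the underlying bits, so both frames share one link — can be sketched for BPSK symbols. This is a simplified illustration of the principle only; the paper operates on coherent optical OFDM frames with DSP at the destined node, and the threshold mapping below is an assumption of this sketch:

    ```python
    def bpsk(bit):
        # map bit 0 -> +1, bit 1 -> -1
        return 1 - 2 * bit

    def pnc_relay_map(superposed):
        # The relay observes the sum of two BPSK symbols: +2, 0, or -2.
        # Magnitude 2 means both nodes sent the same bit, so the XOR is 0;
        # magnitude 0 means the bits differed, so the XOR is 1.
        return 0 if abs(superposed) > 1 else 1

    # Each node recovers its partner's bit by XORing the broadcast
    # network-coded bit with the bit it transmitted itself.
    for a in (0, 1):
        for b in (0, 1):
            relayed = pnc_relay_map(bpsk(a) + bpsk(b))
            assert relayed == a ^ b        # relay broadcasts the XOR
            assert relayed ^ a == b        # node A recovers node B's bit
    ```

    The throughput gain comes from the relay needing only one broadcast slot instead of forwarding each frame separately.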

  14. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.

  15. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding (TEC)" being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which dissociable sub-processes, identified by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding were predominantly stimulus categorization, feature unbinding, and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but did not modulate the other processes. The perceptual categorization stage appears central for voluntary event-file coding.

  16. Use of post-Chernobyl data from Norway to validate the long-term exposure pathway models in the accident consequence code MACCS

    SciTech Connect

    Tveten, U. )

    1994-03-01

    This paper describes a task performed for the US Nuclear Regulatory Commission (NRC), consisting of using post-Chernobyl data from Norway to verify or find areas for possible improvement in the chronic exposure pathway models utilized in the NRC's program for probabilistic risk analysis, level 3, of the MELCOR accident consequence code system (MACCS), developed at Sandia National Laboratories, Albuquerque, New Mexico. Because of unfortunate combinations of weather conditions, the levels of Chernobyl fallout in parts of Norway were quite high, with large areas contaminated to more than 100 kBq/m² of radioactive cesium. Approximately 6% of the total amount of radioactive cesium released from Chernobyl was deposited on Norwegian territory, according to a countrywide survey performed by the Norwegian National Institute for Radiation Hygiene. Accordingly, a very large monitoring effort was carried out in Norway, and some of the results of this effort have provided important new insights into the ways in which radioactive cesium behaves in the environment. In addition to collection and evaluation of post-Chernobyl monitoring results, some experiments were also performed as part of the task. Some experiments performed pre-Chernobyl were also relevant, and some conclusions could be drawn from these. In most respects, the data available show the models and data in MACCS to be appropriate. A few areas where the data indicate that the MACCS approach is inadequate are, however, also pointed out in the paper.

  17. Focal Manual for CAI Coding on the TSS/8 System.

    ERIC Educational Resources Information Center

    Kirbs, H. Dewey; And Others

    Basic information is provided in this manual for coding drill-and-practice CAI (computer-assisted instruction) applications in the language FOCAL (Formulating On-line Calculations in Algebraic Language). This language is available on the Digital Equipment Corporation Time-Sharing 8 system (TSS/8). While FOCAL is oriented toward solution of…

  18. The Facial Expression Coding System (FACES): Development, Validation, and Utility

    ERIC Educational Resources Information Center

    Kring, Ann M.; Sloan, Denise M.

    2007-01-01

    This article presents information on the development and validation of the Facial Expression Coding System (FACES; A. M. Kring & D. Sloan, 1991). Grounded in a dimensional model of emotion, FACES provides information on the valence (positive, negative) of facial expressive behavior. In 5 studies, reliability and validity data from 13 diverse…

  19. Confidence Intervals for Error Rates Observed in Coded Communications Systems

    NASA Astrophysics Data System (ADS)

    Hamkins, J.

    2015-05-01

    We present methods to compute confidence intervals for the codeword error rate (CWER) and bit error rate (BER) of a coded communications link. We review several methods to compute exact and approximate confidence intervals for the CWER, and specifically consider the situation in which the true CWER is so low that only a handful of codeword errors, if any, can be observed in simulation. In doing so, we answer the question of how long an error-free simulation must be run in order to certify that a given CWER requirement is met with a given level of confidence, and discuss the bias introduced by aborting a simulation after observing the first codeword error. Next, we turn to the lesser studied problem of determining confidence intervals for the BER of coded systems. Since bit errors in systems that use coding or higher-order modulation do not occur independently, blind application of a method that assumes independence leads to inappropriately narrow confidence intervals. We present a new method to compute the confidence interval properly, using the first and second sample moments of the number of bit errors per codeword. This is the first method we know of to compute a confidence interval for the BER of a coded or higher-order modulation system.
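    The error-free-run question raised in the abstract has a closed form under the usual i.i.d. codeword-error assumption: after n error-free trials, the exact one-sided upper confidence bound on the CWER is 1 − α^(1/n), where α is one minus the confidence level. A minimal sketch of this rule (my own illustration, not code from the article):

    ```python
    import math

    def errorfree_trials_needed(p_req, confidence):
        """Consecutive error-free codeword trials needed to certify
        CWER <= p_req at the given confidence level (zero observed errors)."""
        alpha = 1.0 - confidence
        return math.ceil(math.log(alpha) / math.log1p(-p_req))

    def cwer_upper_bound(n, confidence):
        """Exact one-sided upper confidence bound on the CWER after
        n error-free trials (Clopper-Pearson with zero observed errors)."""
        alpha = 1.0 - confidence
        return 1.0 - alpha ** (1.0 / n)

    # certifying CWER <= 1e-6 at 95% confidence takes roughly 3 million
    # error-free codeword trials
    n = errorfree_trials_needed(1e-6, 0.95)
    ```

    The bound shrinks only logarithmically with α, which is why certifying very low error rates by simulation is so expensive.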

  20. What's in a code? Towards a formal account of the relation of ontologies and coding systems.

    PubMed

    Rector, Alan L

    2007-01-01

    Terminologies are increasingly based on "ontologies" developed in description logics and related languages such as the new Web Ontology Language, OWL. The use of description logic has been expected to reduce ambiguity and make it easier to determine logical equivalence, deal with negation, and specify EHRs. However, this promise has not been fully realised: in part because early description logics were relatively inexpressive, and in part because the relation between coding systems, EHRs, and ontologies expressed in description logics has not been fully understood. This paper presents a unifying approach using the expressive formalisms available in the latest version of OWL, OWL 1.1.

  1. Two Serial Data to Pulse Code Modulation System Interfaces

    NASA Technical Reports Server (NTRS)

    Hamory, Phil

    2006-01-01

    Two pulse code modulation (PCM) system interfaces for asynchronous serial data are described. One interface is for global positioning system (GPS) data on the NASA Dryden Flight Research Center (DFRC) F-15B (McDonnell Douglas Corporation, St. Louis, Missouri) airplane, tail number 836 (F-15B/836). The other is for flight control computer data on the duPont Aerospace (La Jolla, California) DP-1, a 53-percent scale model of the duPont Aerospace DP-2.

  2. Efficient Signal, Code, and Receiver Designs for MIMO Communication Systems

    DTIC Science & Technology

    2003-06-01

    to obtain good performance at moderate SNR. These new systems are compared to orthogonal space-time coded systems, which we show to achieve near-optimal performance at low SNR. We also examine traditional sequential versions and develop new block versions of the Bell Labs layered architecture (BLAST). While some of these can in principle reach the performance limit at all SNRs, we show they also have various practical problems. Finally, for the

  3. Code for Analyzing and Designing Spacecraft Power System Radiators

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert

    2005-01-01

    GPHRAD is a computer code for analysis and design of disk or circular-sector heat-rejecting radiators for spacecraft power systems. A specific application is for Stirling-cycle/linear-alternator electric-power systems coupled to radioisotope general-purpose heat sources. GPHRAD affords capabilities and options to account for thermophysical properties (thermal conductivity, density) of either metal-alloy or composite radiator materials.

  4. Distributed magnetic field positioning system using code division multiple access

    NASA Technical Reports Server (NTRS)

    Prigge, Eric A. (Inventor)

    2003-01-01

    An apparatus and methods for a magnetic field positioning system use a fundamentally different, and advantageous, signal structure and multiple access method, known as Code Division Multiple Access (CDMA). This signal architecture, when combined with processing methods, leads to advantages over the existing technologies, especially when applied to a system with a large number of magnetic field generators (beacons). Beacons at known positions generate coded magnetic fields, and a magnetic sensor measures a sum field and decomposes it into component fields to determine the sensor position and orientation. The apparatus and methods can have a large `building-sized` coverage area. The system allows for numerous beacons to be distributed throughout an area at a number of different locations. A method to estimate position and attitude, with no prior knowledge, uses dipole fields produced by these beacons in different locations.
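    The decomposition step described above — recovering each beacon's field amplitude from the measured sum by correlating against that beacon's spreading code — can be sketched with orthogonal Walsh codes. This illustrates the CDMA principle only; the code family, lengths, and amplitudes below are arbitrary assumptions, not the patented implementation:

    ```python
    def walsh_codes(order):
        # Sylvester construction: 2**order mutually orthogonal +/-1 codes
        H = [[1]]
        for _ in range(order):
            H = [r + r for r in H] + [r + [-c for c in r] for r in H]
        return H

    def decompose(sum_field, codes):
        # correlate the measured sum against each beacon's code; orthogonality
        # makes every other beacon's contribution average to zero
        n = len(codes[0])
        return [sum(s * c for s, c in zip(sum_field, code)) / n for code in codes]

    codes = walsh_codes(2)                # 4 beacons, 4-chip codes
    amplitudes = [0.5, 2.0, 0.0, 1.25]    # each beacon's field strength at the sensor
    sum_field = [sum(a * code[i] for a, code in zip(amplitudes, codes))
                 for i in range(len(codes[0]))]
    recovered = decompose(sum_field, codes)
    ```

    In the actual system each recovered component field then feeds a position-and-attitude estimator; here the sketch stops at the per-beacon amplitudes.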

  5. Code System for Reactor Physics and Fuel Cycle Simulation.

    SciTech Connect

    TEUCHERT, E.

    1999-04-21

    Version 00 VSOP94 (Very Superior Old Programs) is a system of codes linked together for the simulation of reactor life histories. It comprises neutron cross section libraries and processing routines, repeated neutron spectrum evaluation, 2-D diffusion calculation based on neutron flux synthesis with depletion and shut-down features, in-core and out-of-pile fuel management, fuel cycle cost analysis, and thermal hydraulics (at present restricted to Pebble Bed HTRs). Various techniques have been employed to accelerate the iterative processes and to optimize the internal data transfer. The code system has been used extensively for comparison studies of reactors, their fuel cycles, and related detailed features. In addition to its use in research and development work for the High Temperature Reactor, the system has been applied successfully to Light Water and Heavy Water Reactors.

  6. Upgrades to the NESS (Nuclear Engine System Simulation) Code

    NASA Technical Reports Server (NTRS)

    Fittje, James E.

    2007-01-01

    In support of the President's Vision for Space Exploration, the Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for human expeditions to the moon and Mars. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the 1960's and 1970's. The NASA Glenn Research Center is leveraging this past NTR investment in their vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design.

  7. Coded multiple chirp spread spectrum system and overlay service

    NASA Technical Reports Server (NTRS)

    Kim, Junghwan; Pratt, Timothy; Ha, Tri T.

    1988-01-01

    An asynchronous spread-spectrum system called coded multiple chirp is proposed, and the possible spread-spectrum overlay over an analog FM-TV signal is investigated by computer simulation. Multiple single-sloped up and down chirps are encoded by a pseudonoise code and decoded by dechirpers (pulse-compression filters) followed by a digital code correlator. The performance of the proposed system, expressed in terms of probability of bit error and code miss probability, is similar to that of FSK (frequency shift keying) using codewords if sufficient compression gain is used. When chirp is used to overlay an FM-TV channel, two chirp signals with data rate up to 25 kb/s could be overlaid in a 36-MHz satellite transponder without significant mutual interference. Performance estimates for a VSAT (very small aperture terminal) earth station operating at C-band show that a 2.4-m antenna and 300-mW transmitter could send a 2.4-kb/s signal to a large central earth station over an occupied channel.
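    The dechirper (pulse-compression filter) mentioned above is a matched filter: correlating a received chirp against a reference copy collapses its energy into a narrow peak, while a chirp of the opposite slope stays spread out, which is what lets up and down chirps carry distinguishable symbols. A minimal numerical sketch with illustrative parameters (not those of the paper):

    ```python
    import math

    N = 128  # samples per chirp

    def linear_chirp(up=True):
        # unit-amplitude complex chirp with quadratic phase (linear frequency sweep)
        s = 1 if up else -1
        return [complex(math.cos(s * math.pi * n * n / N),
                        math.sin(s * math.pi * n * n / N)) for n in range(N)]

    def compress(rx, ref):
        # circular correlation of the received samples with the reference chirp
        return [abs(sum(rx[(k + n) % N] * ref[n].conjugate() for n in range(N)))
                for k in range(N)]

    up, down = linear_chirp(True), linear_chirp(False)
    matched = compress(up, up)       # sharp peak of height N at zero lag
    mismatched = compress(up, down)  # energy stays spread: no dominant peak
    ```

    The ratio between the matched peak and the mismatched residue is the compression gain the abstract refers to.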

  8. An Expert System for the Development of Efficient Parallel Code

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Hao-Qiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    We have built the prototype of an expert system to assist the user in the development of efficient parallel code. The system was integrated into the parallel programming environment that is currently being developed at NASA Ames. The expert system interfaces to tools for automatic parallelization and performance analysis. It uses static program structure information and performance data in order to automatically determine causes of poor performance and to make suggestions for improvements. In this paper we give an overview of our programming environment, describe the prototype implementation of our expert system, and demonstrate its usefulness with several case studies.

  9. Fish stranding in freshwater systems: sources, consequences, and mitigation.

    PubMed

    Nagrodski, Alexander; Raby, Graham D; Hasler, Caleb T; Taylor, Mark K; Cooke, Steven J

    2012-07-30

    Fish can become stranded when water levels decrease, often rapidly, as a result of anthropogenic (e.g., canal drawdown, hydropeaking, vessel wakes) and natural (e.g., floods, drought, winter ice dynamics) events. We summarize existing research on stranding of fish in freshwater, discuss the sources, consequences, and mitigation options for stranding, and report current knowledge gaps. Our literature review revealed that ∼65.5% of relevant peer-reviewed articles were found to focus on stranding associated with hydropower operations and irrigation projects. In fact, anthropogenic sources of fish stranding represented 81.8% of the available literature compared to only 18.2% attributed to natural fish stranding events. While fish mortality as a result of stranding is well documented, our analysis revealed that little is known about the sublethal and long-term consequences of stranding on growth and population dynamics. Furthermore, the contribution of stranding to annual mortality rates is poorly understood, as are the potential ecosystem-scale impacts. Mitigation strategies available to deal with stranding include fish salvage, ramping rate limitations, and physical habitat works (e.g., to contour substrate to minimize stranding). However, a greater knowledge of the factors that cause fish stranding would promote the development and refinement of mitigation strategies that are economically and ecologically sustainable.

  10. Clinical laboratory sciences data transmission: the NPU coding system.

    PubMed

    Pontet, Françoise; Magdal Petersen, Ulla; Fuentes-Arderiu, Xavier; Nordin, Gunnar; Bruunshuus, Ivan; Ihalainen, Jarkko; Karlsson, Daniel; Forsum, Urban; Dybkaer, René; Schadow, Gunther; Kuelpmann, Wolf; Férard, Georges; Kang, Dongchon; McDonald, Clement; Hill, Gilbert

    2009-01-01

    In health care services, technology requires that correct information be duly available to professionals, citizens and authorities, worldwide. Thus, clinical laboratory sciences require standardized electronic exchanges for results of laboratory examinations. The NPU (Nomenclature, Properties and Units) coding system provides a terminology for identification of result values (property values). It is structured according to BIPM, ISO, IUPAC and IFCC recommendations. It uses standard terms for established concepts and structured definitions describing: which part of the universe is examined, which component of relevance in that part, which kind-of-property is relevant. Unit and specifications can be added where relevant [System(spec)-Component(spec); kind-of-property(spec) = ? unit]. The English version of this terminology is freely accessible at http://dior.imt.liu.se/cnpu/ and http://www.labterm.dk, directly or through the IFCC and IUPAC websites. It has been nationally used for more than 10 years in Denmark and Sweden and has been translated into 6 other languages. The NPU coding system provides a terminology for dedicated kinds-of-property following the international recommendations. It fits well in the health network and is freely accessible. Clinical laboratory professionals worldwide will find many advantages in using the NPU coding system, notably with regards to an accreditation process.

  11. A seismic data compression system using subband coding

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
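    The three-stage pipeline described above (decorrelation, quantization, entropy coding) can be illustrated with the simplest possible subband split: a one-level Haar filter bank followed by uniform quantization. The entropy-coding stage is omitted, and none of this is the article's actual filter design; it is a sketch of the structure only:

    ```python
    def analyze(x):
        # one-level subband split: low band (pairwise averages) and
        # high band (pairwise differences); x must have even length
        lo = [(x[2*i] + x[2*i+1]) / 2 for i in range(len(x) // 2)]
        hi = [(x[2*i] - x[2*i+1]) / 2 for i in range(len(x) // 2)]
        return lo, hi

    def synthesize(lo, hi):
        # perfect-reconstruction inverse of analyze()
        out = []
        for a, d in zip(lo, hi):
            out += [a + d, a - d]
        return out

    def quantize(band, step):
        # uniform quantizer: the lossy, rate-controlling stage
        return [round(v / step) for v in band]

    def dequantize(indices, step):
        return [q * step for q in indices]

    trace = [0.0, 0.1, 0.3, 2.5, -1.8, -0.4, 0.2, 0.0]  # toy seismic samples
    step = 0.25
    lo, hi = analyze(trace)
    rec = synthesize(dequantize(quantize(lo, step), step),
                     dequantize(quantize(hi, step), step))
    # per-sample reconstruction error is bounded by the quantizer step
    ```

    Shrinking `step` trades bit rate for fidelity, which is the rate-distortion knob the article evaluates; progressive transmission would send the low band first and refinements later.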

  12. Photovoltaic power systems and the National Electrical Code: Suggested practices

    SciTech Connect

    Wiles, J.

    1996-12-01

    This guide provides information on how the National Electrical Code (NEC) applies to photovoltaic systems. The guide is not intended to supplant or replace the NEC; it paraphrases the NEC where it pertains to photovoltaic systems and should be used with the full text of the NEC. Users of this guide should be thoroughly familiar with the NEC and know the engineering principles and hazards associated with electrical and photovoltaic power systems. The information in this guide is the best available at the time of publication and is believed to be technically accurate; it will be updated frequently. Application of this information and results obtained are the responsibility of the user.

  13. Possible consequences of absence of "Jupiters" in planetary systems.

    PubMed

    Wetherill, G W

    1994-01-01

    The formation of the gas giant planets Jupiter and Saturn probably required the growth of massive, approximately 15 Earth-mass cores on a time scale shorter than the approximately 10^7-year time scale for removal of nebular gas. Relatively minor variations in nebular parameters could preclude the growth of full-size gas giants even in systems in which the terrestrial planet region is similar to our own. Systems containing "failed Jupiters," resembling Uranus and Neptune in their failure to capture much nebular gas, would be expected to contain more densely populated cometary source regions. They will also eject a smaller number of comets into interstellar space. If systems of this kind were the norm, observation of hyperbolic comets would be unexpected. Monte Carlo calculations of the orbital evolution of the outer region of such systems (the Kuiper belt) indicate that throughout Earth history the cometary impact flux in their terrestrial planet regions would be approximately 1000 times greater than in our Solar System. It may be speculated that this could frustrate the evolution of organisms that observe and seek to understand their planetary system. For this reason our observation of these planets in our Solar System may tell us nothing about the probability of similar gas giants occurring in other planetary systems. This situation can be corrected by observation of an unbiased sample of planetary systems.

  14. Earth system consequences of a Pine Island Glacier collapse

    NASA Astrophysics Data System (ADS)

    Green, Mattias; Schmittner, Andreas

    2016-04-01

    An intermediate complexity climate model is used to simulate the impact of an accelerated Pine Island Glacier mass loss on the large-scale ocean circulation and climate. Simulations are performed for pre-industrial conditions using hosing levels consistent with present-day observations of 3,000 m³ s⁻¹, at an accelerated rate of 6,000 m³ s⁻¹, and at a total collapse rate of 100,000 m³ s⁻¹; in all experiments the hosing lasted 100 years. It is shown that even a modest input of meltwater from the glacier can introduce an initial cooling over the upper part of the Southern Ocean due to increased stratification and ice cover leading to a reduced upward heat flux from Circumpolar Deep Water. This causes global ocean heat content to increase and global surface air temperatures to decrease. The Atlantic Meridional Overturning Circulation (AMOC) increases, presumably due to changes in the density difference between Antarctic Intermediate Water and North Atlantic Deep Water. Simulations with a simultaneous hosing and increases of atmospheric CO2 concentrations show smaller effects of the hosing on global surface air temperature and ocean heat content, which we attribute to the melting of Southern Ocean sea ice. The sensitivity of the AMOC to the hosing is also reduced as the warming by the atmosphere completely dominates the perturbations. Further consequences for oceanic biogeochemical cycles in realistic future warming scenarios are discussed.

  15. Code-Time Diversity for Direct Sequence Spread Spectrum Systems

    PubMed Central

    Hassan, A. Y.

    2014-01-01

    Time diversity is achieved in direct sequence spread spectrum by receiving different faded delayed copies of the transmitted symbols from different uncorrelated channel paths when the transmission signal bandwidth is greater than the coherence bandwidth of the channel. In this paper, a new time diversity scheme is proposed for spread spectrum systems. It is called code-time diversity. In this new scheme, N spreading codes are used to transmit one data symbol over N successive symbol intervals. The diversity order in the proposed scheme equals the number of spreading codes used, N, multiplied by the number of uncorrelated paths of the channel, L. The paper presents the transmitted signal model. Two demodulator structures are proposed based on the received signal models from Rayleigh flat and frequency selective fading channels. Probability of error in the proposed diversity scheme is also calculated for the same two fading channels. Finally, simulation results are presented and compared with those of maximal ratio combiner (MRC) and multiple-input and multiple-output (MIMO) systems. PMID:24982925
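    The diversity order claim above (N codes × L paths) means the receiver ends up with N·L independently faded observations of each symbol, which a combiner such as the MRC baseline merges by weighting each branch with the conjugate of its channel gain. A generic, noise-free sketch of that combining step (illustrative gains; this is the MRC comparison baseline, not the paper's proposed demodulator structures):

    ```python
    def mrc_combine(received, gains):
        # weight each diversity branch by the conjugate of its complex channel
        # gain, sum, and normalize by the total branch power
        num = sum(r * h.conjugate() for r, h in zip(received, gains))
        den = sum(abs(h) ** 2 for h in gains)
        return num / den

    # N spreading codes x L channel paths = N*L diversity branches
    N_codes, L_paths = 3, 2
    gains = [0.9 - 0.2j, 0.3 + 0.7j, -0.5 + 0.1j,
             0.2 + 0.2j, 1.1 - 0.4j, -0.8 - 0.6j]  # hypothetical faded branch gains
    symbol = 1 + 0j                                # transmitted data symbol
    received = [h * symbol for h in gains]         # noise-free branch observations
    estimate = mrc_combine(received, gains)
    ```

    With noise present, the same weighting maximizes the combined SNR; the number of branches, N·L here, sets the slope of the error-rate curve.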

  16. A System for Coding the Presenting Requests of Ambulatory Patients

    PubMed Central

    Weinstein, Philip; Gordon, Michael J.; Gilson, John S.

    1977-01-01

    Effective methods developed to review and study the care of patients in hospital have not been applicable to ambulatory care, in which definitive diagnosis is the exception rather than the rule. A reasonable alternative to using diagnosis as the basis for assessing ambulatory care is to use the problems or requests presented by the patients themselves. A system has been developed for classifying and coding this information for flexible computer retrieval. Testing indicates that the system is simple in design, easily mastered by nonphysicians and provides reliable, useful data at a low cost. PMID:855324

  17. [Behavior ethogram and PAE coding system of Cervus nippon sichuanicus].

    PubMed

    Qi, Wen-Hua; Yue, Bi-Song; Ning, Ji-Zu; Jiang, Xue-Mei; Quan, Qiu-Mei; Guo, Yan-Shu; Mi, Jun; Zuo, Lin; Xiong, Yuan-Qing

    2010-02-01

    A monthly 5-day periodic observation at 06:00-18:00 from March to November 2007 was conducted to record the behavioral processes, contents, and results, and the surrounding habitats of Sichuan sika deer (Cervus nippon sichuanicus) in Donglie, Chonger, and Reer villages of Tiebu Natural Reserve of Sichuan Province. The behavioral ethogram, vigilance behavior ethogram, and PAE (posture, act, and environment) coding system of the Sichuan sika deer were established, filling a gap in the PAE coding of ungulate vigilance behaviors. A total of 11 kinds of postures, 83 acts, and 136 behaviors were recorded and distinguished, with the relative frequency of each behavior in relation to gender, age, and season described. Compared with other ungulates, the behavioral repertoire of Sichuan sika deer was mostly similar to that of other cervid animals.

  18. Medium-rate speech coding simulator for mobile satellite systems

    NASA Astrophysics Data System (ADS)

    Copperi, Maurizio; Perosino, F.; Rusina, F.; Albertengo, G.; Biglieri, E.

    1986-01-01

    Channel modeling and error protection schemes for speech coding are described. A residual excited linear predictive (RELP) coder for bit rates 4.8, 7.2, and 9.6 kbit/sec is outlined. The coder at 9.6 kbit/sec incorporates a number of channel error protection techniques, such as bit interleaving, error correction codes, and parameter repetition. Results of formal subjective experiments (DRT and DAM tests) under various channel conditions, reveal that the proposed coder outperforms conventional LPC-10 vocoders by 2 subjective categories, thus confirming the suitability of the RELP coder at 9.6 kbit/sec for good quality speech transmission in mobile satellite systems.

  19. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  20. Nexus: a modular workflow management system for quantum simulation codes

    SciTech Connect

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  1. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
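
    Nexus resolves dependencies among simulation jobs (e.g. a DFT run feeding a QMC run) before launching them. As an illustration only, with hypothetical job names and no relation to Nexus's actual interface, a dependency-ordered job runner can be sketched in a few lines:

    ```python
    from collections import deque

    def run_workflow(jobs, deps):
        """Run jobs in dependency order (Kahn's topological sort).
        jobs: dict name -> callable; deps: dict name -> list of prerequisites."""
        indeg = {j: len(deps.get(j, [])) for j in jobs}
        children = {j: [] for j in jobs}
        for j, prereqs in deps.items():
            for p in prereqs:
                children[p].append(j)
        ready = deque(j for j, d in indeg.items() if d == 0)
        order = []
        while ready:
            j = ready.popleft()
            jobs[j]()                      # launch the simulation job
            order.append(j)
            for c in children[j]:
                indeg[c] -= 1
                if indeg[c] == 0:
                    ready.append(c)
        if len(order) != len(jobs):
            raise RuntimeError("cycle in workflow dependencies")
        return order

    # Hypothetical chain: SCF -> non-SCF orbital generation -> QMC
    log = []
    order = run_workflow(
        jobs={"scf": lambda: log.append("scf"),
              "nscf": lambda: log.append("nscf"),
              "qmc": lambda: log.append("qmc")},
        deps={"nscf": ["scf"], "qmc": ["nscf"]},
    )
    ```

    Each callable would in practice submit a job and block on completion; the ordering logic is the part a workflow manager automates.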

  2. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    NASA Astrophysics Data System (ADS)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

    A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate at a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.
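
    The appeal of Hadamard signature sequences in SAC-OCDMA is their fixed in-phase cross-correlation: every binary signature of length N (dropping the all-ones row) has weight N/2, and any two signatures overlap in exactly N/4 chips, which is what makes interference analytically tractable. A short sketch verifying this for N = 8:

    ```python
    import numpy as np

    def hadamard(n):
        """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
        H = np.array([[1]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    N = 8
    H = hadamard(N)
    codes = (1 + H[1:]) // 2        # binary signature sequences; skip all-ones row
    weights = codes.sum(axis=1)     # every signature has weight N/2
    overlaps = [int(codes[i] @ codes[j])            # pairwise in-phase
                for i in range(len(codes))          # cross-correlations,
                for j in range(i + 1, len(codes))]  # all equal to N/4
    ```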

  3. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    DTIC Science & Technology

    2010-12-01

    applications for conformance to one of the CERT® secure coding standards. CERT secure coding standards provide a detailed enumeration of coding errors...automated analysis tools to help them code securely. Secure coding standards provide a detailed enumeration of coding errors that have caused...including possible additional job aids. SCALe analysts will also be interviewed for context information surrounding incorrect judgments as part of

  4. Behavioral and systemic consequences of long-term inflammatory challenge.

    PubMed

    Fischer, Christina W; Elfving, Betina; Lund, Sten; Wegener, Gregers

    2015-11-15

    Inflammatory reactions are involved in a diversity of diseases, including major depressive disorder. Cytokines act as intercellular signaling molecules and mediators of inflammation between the periphery and the brain. Within the brain, evidence from animal studies of acute inflammation has shown that elevated cytokine levels are linked to behavioral responses of sickness and depression-like behavior. Although chronic inflammation is more translational to human depression than acute studies, little is known about central cytokine expression and the associated behavioral responses following chronic immune challenges. The present study assessed behavioral changes and a selection of cytokines in the brain and in the blood in rats randomized to receive a single or an 8-week administration of either lipopolysaccharide (LPS, 600 μg/kg, i.p.) or saline. Acute and long-term LPS treatments caused similar sickness and depression-like behavior. Chronic LPS administration did not have an effect on blood cytokine levels, indicating endotoxin tolerance, whereas increased fasting blood glucose was observed, indicating insulin resistance, a metabolic consequence of chronic inflammation. While a single LPS injection produced a generalized cytokine response in the brain, long-term LPS administration produced a specific central cytokine response with increased interleukin (IL)-1β and interferon (IFN)-γ. These cytokines can explain the behavioral changes observed and could indicate microglial activation, although future studies are needed to confirm this assumption. Taken together, although the behavioral outcome was similar between acute and chronic LPS administration, the central cytokine response was distinct. As the long-term LPS paradigm also posed a metabolic demand, this setting may offer a more translational insight into inflammatory reactions in human depression, and could prove useful for assessing cytokine downstream effects and experimental antidepressant drugs.

  5. Validation of hydrogeochemical codes using the New Zealand geothermal system

    SciTech Connect

    Glassley, W.

    1992-12-01

    Evaluation of the performance of a nuclear waste repository requires that numerous parameters be evaluated over a broad range of conditions using codes. The capabilities of these codes must be demonstrated using complex natural systems in which the processes of interest have already occurred or are occurring. We have initiated such a test of geochemical and hydrological simulation codes, using the geothermal areas of the Taupo Volcanic Zone, New Zealand. Areas that have been evolving for a few tens to a few tens of thousands of years are of particular interest. This effort will help determine the extent to which simplified modeling approaches can be used in performance assessment calculations. To guide the selection of natural systems, we are attempting to map potential repository regions dominated by equilibrium processes and those dominated by kinetically controlled processes. To do so, fluid velocities and temperatures were computed using the V-TOUGH code assuming an equivalent continuum, dual porosity model. These results were then used to compare advective fluid flow rate with silica dissolution/precipitation rates, using Damkoehler numbers. Only the first 5000 years of repository operation were considered. The results identify a migrating envelope of kinetically dominated activity several meters wide in the vicinity of waste packages that contrasts with other parts of the repository. The Lake Rotokawa region, New Zealand, has been used in our first test effort, since it contains environments that are examples of kinetic and equilibrium processes. The results of tests involving equilibrium processes show excellent correspondence between simulated and observed mineral alteration sequences, although discrepancies in some mineral parageneses demonstrate that operator decisions in conducting simulations must be considered an integral part of validation efforts.
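
    The comparison of advective fluid flow with silica dissolution/precipitation rates described above is summarized by a Damkoehler number, the ratio of a reaction rate to an advective transport rate: Da >> 1 marks equilibrium-dominated regions and Da << 1 kinetically limited ones. A minimal sketch, with hypothetical parameter values not taken from the study:

    ```python
    def damkoehler(rate_const, length, velocity):
        """First Damkoehler number: reaction rate / advective transport rate.
        rate_const in 1/s, length in m, velocity in m/s (units must cancel)."""
        return rate_const * length / velocity

    # Illustrative values only: slow silica kinetics, 10 m flow path,
    # modest pore velocity -> kinetically limited (Da < 1)
    Da = damkoehler(rate_const=1e-6, length=10.0, velocity=1e-4)
    ```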

  6. Polar Codes

    DTIC Science & Technology

    2014-12-01

    density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes...the most common. Many civilian systems use low density parity check (LDPC) FEC codes, and the Navy is planning to use LDPC for some future systems...other forward error correction methods: a turbo code, a low density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes
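
    The forward error correction codes compared in this report (polar, turbo, LDPC, Reed–Solomon, convolutional) are too involved to reproduce here, but the mechanism they all share, adding structured redundancy so the receiver can locate and flip corrupted bits, can be illustrated with a minimal Hamming(7,4) code, used here only as a simple stand-in rather than one of the codes evaluated:

    ```python
    import numpy as np

    # Hamming(7,4): 4 data bits + 3 parity bits, corrects any single-bit error.
    G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix [I | P]
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix [P^T | I]
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(u):
        return (u @ G) % 2

    def decode(r):
        s = (H @ r) % 2                    # syndrome
        if s.any():
            # the syndrome equals the column of H at the error position
            err = int(np.where((H.T == s).all(axis=1))[0][0])
            r = r.copy()
            r[err] ^= 1                    # flip the corrupted bit
        return r[:4]                       # drop the parity bits

    u = np.array([1, 0, 1, 1])
    r = encode(u)
    r[2] ^= 1                              # single-bit channel error
    recovered = decode(r)
    ```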

  7. Transmission over UWB channels with OFDM system using LDPC coding

    NASA Astrophysics Data System (ADS)

    Dziwoki, Grzegorz; Kucharczyk, Marcin; Sulek, Wojciech

    2009-06-01

    Hostile wireless environments require the use of sophisticated signal processing methods. The paper concerns Ultra Wideband (UWB) transmission over Personal Area Networks (PAN), including the MB-OFDM specification of the physical layer. In the presented work, the transmission system with OFDM modulation was combined with an LDPC encoder/decoder. Additionally, the frame and bit error rates (FER and BER) of the system were decreased by using results from the LDPC decoder in a kind of turbo equalization algorithm for better channel estimation. A computational block using an evolutionary strategy, from the genetic algorithms family, was also used in the presented system. It is placed after the SPA (Sum-Product Algorithm) decoder and is conditionally turned on in the decoding process. The result is increased effectiveness of the whole system, especially a lower FER. The system was tested with two types of LDPC codes, depending on the type of parity check matrix: randomly generated and deterministically constructed, the latter optimized for a practical decoder architecture implemented in an FPGA device.

  8. National Combustion Code: A Multidisciplinary Combustor Design System

    NASA Technical Reports Server (NTRS)

    Stubbs, Robert M.; Liu, Nan-Suey

    1997-01-01

    The Internal Fluid Mechanics Division conducts both basic research and technology, and system technology research for aerospace propulsion systems components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.

  9. Performance of a space-time block coded code division multiple access system over Nakagami-m fading channels

    NASA Astrophysics Data System (ADS)

    Yu, Xiangbin; Dong, Tao; Xu, Dazhuan; Bi, Guangguo

    2010-09-01

    By introducing an orthogonal space-time coding scheme, multiuser code division multiple access (CDMA) systems with different space time codes are given, and corresponding system performance is investigated over a Nakagami-m fading channel. A low-complexity multiuser receiver scheme is developed for space-time block coded CDMA (STBC-CDMA) systems. The scheme can make full use of the complex orthogonality of space-time block coding to simplify the high decoding complexity of the existing scheme. Compared to the existing scheme with exponential decoding complexity, it has linear decoding complexity. Based on the performance analysis and mathematical calculation, the average bit error rate (BER) of the system is derived in detail for integer m and non-integer m, respectively. As a result, a tight closed-form BER expression is obtained for STBC-CDMA with an orthogonal spreading code, and an approximate closed-form BER expression is attained for STBC-CDMA with a quasi-orthogonal spreading code. Simulation results show that the proposed scheme can achieve almost the same performance as the existing scheme with low complexity. Moreover, the simulation results for average BER are consistent with the theoretical analysis.
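
    The complex orthogonality that the low-complexity receiver exploits is easiest to see in the classic two-antenna Alamouti code, the standard example of an orthogonal space-time block code: linear combining at the receiver decouples the two symbols without any joint search. A noise-free sketch with arbitrary channel values (the paper's multiuser CDMA spreading and Nakagami-m fading are omitted):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    h1, h2 = rng.normal(size=2) + 1j * rng.normal(size=2)  # flat-fading gains
    s1, s2 = 1 + 1j, -1 - 1j                               # two QPSK symbols

    # Alamouti transmission over two symbol periods:
    #   period 1: antennas send (s1, s2); period 2: (-s2*, s1*)
    r1 = h1 * s1 + h2 * s2
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)

    # Linear combining: orthogonality leaves each symbol scaled by the
    # total channel energy, with no cross-term from the other symbol.
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
    s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
    ```

    With noise included, the same combiner yields the maximum-likelihood decision per symbol, which is why decoding complexity grows only linearly.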

  10. Origin and evolution of the Saturn system: Observational consequences

    NASA Technical Reports Server (NTRS)

    Pollack, J. B.

    1978-01-01

    A number of important cosmogonic questions concerning the Saturn system can be addressed with a Saturn-orbiter-dual-probe spacecraft mission. These questions include: The origin of the Saturn system; the source of Saturn's excess luminosity; the mechanism by which the irregular satellites were captured; the influence of Saturn's early luminosity on the composition of its regular satellites; and the origin of the rings. The first two topics can be studied by measurements made from an entry probe into Saturn's atmosphere, while the remaining issues can be investigated by measurements conducted from an orbiter. Background information is provided on these five questions describing the critical experiments needed to help resolve them.

  11. Defending public interests in private lands: compliance, costs and potential environmental consequences of the Brazilian Forest Code in Mato Grosso

    PubMed Central

    Stickler, Claudia M.; Nepstad, Daniel C.; Azevedo, Andrea A.; McGrath, David G.

    2013-01-01

    Land-use regulations are a critical component of forest governance and conservation strategies, but their effectiveness in shaping landholder behaviour is poorly understood. We conducted a spatial and temporal analysis of the Brazilian Forest Code (BFC) to understand the patterns of regulatory compliance over time and across changes in the policy, and the implications of these compliance patterns for the perceived costs to landholders and environmental performance of agricultural landscapes in the southern Amazon state of Mato Grosso. Landholdings tended to remain in compliance or not according to their status at the beginning of the study period. The perceived economic burden of BFC compliance on soya bean and beef producers (US$3–5.6 billion in net present value of the land) may in part explain the massive, successful campaign launched by the farm lobby to change the BFC. The ecological benefits of compliance (e.g. greater connectivity and carbon) with the BFC are diffuse and do not compete effectively with the economic benefits of non-compliance that are perceived by landholders. Volatile regulation of land-use decisions that affect billions in economic rent that could be captured is an inadequate forest governance instrument; effectiveness of such regulations may increase when implemented in tandem with positive incentives for forest conservation. PMID:23610168

  12. Error correcting coding-theory for structured light illumination systems

    NASA Astrophysics Data System (ADS)

    Porras-Aguilar, Rosario; Falaggis, Konstantinos; Ramos-Garcia, Ruben

    2017-06-01

    Intensity discrete structured light illumination systems project a series of projection patterns for the estimation of the absolute fringe order using only the temporal grey-level sequence at each pixel. This work proposes the use of error-correcting codes for pixel-wise correction of measurement errors. The use of an error-correcting code is advantageous in many ways: it reduces the effect of random intensity noise, it corrects outliers near the border of the fringe commonly present when using intensity discrete patterns, and it provides robustness in case of severe measurement errors (even for burst errors where whole frames are lost). The latter aspect is particularly interesting in environments with varying ambient light as well as in safety-critical applications, such as the monitoring of deformations of components in nuclear power plants, where high reliability is ensured even in case of short measurement disruptions. A special form of burst error is the so-called salt and pepper noise, which can largely be removed with error-correcting codes using only the information of a given pixel. The performance of this technique is evaluated using both simulations and experiments.
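
    As a sketch of the idea only (the paper's actual code construction is not reproduced here), even a trivial repetition code over the temporal pattern sequence lets every pixel outvote a corrupted frame independently, which is exactly the pixel-wise correction property described above:

    ```python
    import numpy as np

    def encode_patterns(order_bits, reps=3):
        """Repeat each temporal bit-plane `reps` times (trivial repetition ECC)."""
        return np.repeat(order_bits, reps, axis=0)

    def decode_patterns(frames, reps=3):
        """Majority vote over the repeats recovers each bit-plane per pixel."""
        grouped = frames.reshape(-1, reps, *frames.shape[1:])
        return (grouped.sum(axis=1) > reps // 2).astype(int)

    # 4 pixels whose fringe order is coded in 3 temporal bit-planes
    bits = np.array([[0, 1, 0, 1],
                     [1, 1, 0, 0],
                     [0, 0, 1, 1]])
    tx = encode_patterns(bits)
    tx[4] ^= 1                   # an entire projected frame is corrupted
    rx = decode_patterns(tx)     # majority vote restores every pixel
    ```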

  13. An engineering code to analyze hypersonic thermal management systems

    NASA Technical Reports Server (NTRS)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and as a result are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will be necessary not only to meet operational requirements, but also to fulfill weight and performance constraints in order for the vehicle to take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, the typical pervasive approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will not only be highly desirable, but advantageous in order to reduce the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total vehicle basis.

  14. System Design Considerations In Bar-Code Laser Scanning

    NASA Astrophysics Data System (ADS)

    Barkan, Eric; Swartz, Jerome

    1984-08-01

    The unified transfer function approach to the design of laser barcode scanner signal acquisition hardware is considered. The treatment of seemingly disparate system areas such as the optical train, the scanning spot, the electrical filter circuits, the effects of noise, and printing errors is presented using linear systems theory. Such important issues as determination of depth of modulation, filter specification, tolerancing of optical components, and optimization of system performance in the presence of noise are discussed. The concept of effective spot size is introduced to allow for the impact of the optical system and analog processing circuitry upon depth of modulation. Considerations are limited primarily to Gaussian spot profiles, but also apply to more general cases. Attention is paid to realistic bar-code symbol models and to implications with respect to printing tolerances.
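
    The dependence of depth of modulation on effective spot size can be sketched numerically: convolving a bar/space pattern with a normalized Gaussian spot profile shows the modulation collapsing as the spot grows relative to the bar width. Dimensions here are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def modulation_depth(bar_width, spot_sigma, n=4000):
        """Peak-to-peak swing of a bar/space pattern scanned by a Gaussian spot."""
        x = np.linspace(0, 8 * bar_width, n)
        pattern = (np.floor(x / bar_width) % 2).astype(float)  # bars and spaces
        t = np.arange(-n // 2, n // 2) * (x[1] - x[0])
        spot = np.exp(-t**2 / (2 * spot_sigma**2))
        spot /= spot.sum()                                     # unit-gain spot
        signal = np.convolve(pattern, spot, mode="same")
        mid = signal[n // 4 : 3 * n // 4]                      # ignore edges
        return mid.max() - mid.min()

    d_small = modulation_depth(bar_width=1.0, spot_sigma=0.1)  # sharp spot
    d_large = modulation_depth(bar_width=1.0, spot_sigma=0.5)  # blurred spot
    ```

    The same calculation extends to the full effective spot of the paper by cascading the optical spot with the impulse response of the analog filter chain.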

  15. [Aging of the respiratory system: anatomical changes and physiological consequences].

    PubMed

    Ketata, W; Rekik, W K; Ayadi, H; Kammoun, S

    2012-10-01

    The respiratory system undergoes progressive involution with age, resulting in anatomical and functional changes that are exerted on all levels. The rib cage stiffens and respiratory muscles weaken. Distal bronchioles have reduced diameter and tend to be collapsed. Mobilized lung volumes decrease with age while residual volume increases. Gas exchanges are modified with a linear decrease of PaO(2) up to the age of 70 years and a decreased diffusing capacity of carbon monoxide. Ventilatory responses to hypercapnia, hypoxia and exercise decrease in the elderly. Knowledge of changes in the respiratory system related to advancing age is a medical issue of great importance in order to distinguish the effects of aging from those of diseases.

  16. [Formal care systems consequences of a vision on informal caretakers].

    PubMed

    Escuredo Rodríguez, Bibiana

    2006-10-01

    Care for dependent persons falls, fundamentally, on their family members who usually perceive this situation as a problem due to its repercussions on the family group in general and on the health and quality of life for the informal caretaker in particular. The burden which an informal caretaker assumes depends on diverse variables among which the most important are considered to be social assistance and the forms of help which the caretaker has to rely on. At the same time, the resources and help available are determined by the vision which the formal system has for informal caretakers; therefore, it is important that nurses, as caretakers in the formal system, have a clear idea about the situations that are created and that nurses reflect on the alternatives which allow a dependent person to be cared for without forgetting the needs and rights of the caretakers.

  17. Corrosion consequences of microfouling in water reclamation systems

    NASA Technical Reports Server (NTRS)

    Ford, Tim; Mitchell, Ralph

    1991-01-01

    This paper examines the potential fouling and corrosion problems associated with microbial film formation throughout the water reclamation system (WRS) designed for the Space Station Freedom. It is shown that the use of advanced metal sputtering techniques combined with image analysis and FTIR spectroscopy will present realistic solutions for investigating the formation and function of biofilm on different alloys, the subsequent corrosion, and the efficiency of different treatments. These techniques, used in combination with electrochemical measurements of corrosion, will provide a powerful approach to examinations of materials considered for use in the WRS.

  18. Quantum Random Access Codes Using Single d-Level Systems

    NASA Astrophysics Data System (ADS)

    Tavakoli, Armin; Hameedi, Alley; Marques, Breno; Bourennane, Mohamed

    2015-05-01

    Random access codes (RACs) are used by a party to, with limited communication, access an arbitrary subset of information held by another party. Quantum resources are known to enable RACs that break classical limitations. Here, we study quantum and classical RACs with high-level communication. We derive average performances of classical RACs and present families of high-level quantum RACs. Our results show that high-level quantum systems can significantly increase the advantage of quantum RACs over their classical counterparts. We demonstrate our findings in an experimental realization of a quantum RAC with four-level communication.
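
    The qubit (d = 2) case already shows the quantum advantage the paper generalizes: encoding two classical bits into one qubit on the Bloch equator and measuring in the Z or X basis retrieves either bit with probability cos²(π/8) ≈ 0.854, beating the best classical 2→1 RAC value of 0.75. A short numerical check of this textbook example (not the high-level constructions of the paper):

    ```python
    import numpy as np

    def qrac_success():
        """Average success probability of the 2->1 qubit RAC with
        encoding states at 45-degree angles on the Bloch equator."""
        probs = []
        for a in (0, 1):
            for b in (0, 1):
                # Bloch vector (x, z components) of the encoding state
                nx, nz = (-1) ** b / np.sqrt(2), (-1) ** a / np.sqrt(2)
                p_a = (1 + (-1) ** a * nz) / 2   # measure Z to retrieve bit a
                p_b = (1 + (-1) ** b * nx) / 2   # measure X to retrieve bit b
                probs += [p_a, p_b]
        return np.mean(probs)

    p_quantum = qrac_success()       # equals cos^2(pi/8)
    p_classical = 0.75               # optimal classical 2->1 RAC
    ```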

  19. A LONE code for the sparse control of quantum systems

    NASA Astrophysics Data System (ADS)

    Ciaramella, G.; Borzì, A.

    2016-03-01

    In many applications with quantum spin systems, control functions with a sparse and pulse-shaped structure are often required. These controls can be obtained by solving quantum optimal control problems with L1-penalized cost functionals. In this paper, the MATLAB package LONE is presented, aimed at solving L1-penalized optimal control problems governed by unitary-operator quantum spin models. This package implements a new strategy that includes a globalized semi-smooth Krylov-Newton scheme and a continuation procedure. Results of numerical experiments demonstrate the ability of the LONE code to compute accurate sparse optimal control solutions.
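
    LONE itself is a MATLAB package using a semi-smooth Newton scheme; as a language-neutral sketch of why an L1 penalty yields sparse, pulse-shaped controls, the simpler proximal-gradient (ISTA) iteration with its soft-thresholding step can be written as follows. A generic least-squares term stands in for the quantum spin dynamics, and all problem data are synthetic:

    ```python
    import numpy as np

    def soft_threshold(v, t):
        """Proximal operator of the L1 norm: shrinks and zeroes small entries."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, y, lam, steps=3000):
        """Minimize 0.5*||A u - y||^2 + lam*||u||_1 by proximal gradient."""
        L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
        u = np.zeros(A.shape[1])
        for _ in range(steps):
            u = soft_threshold(u - (A.T @ (A @ u - y)) / L, lam / L)
        return u

    rng = np.random.default_rng(1)
    A = rng.normal(size=(30, 60))           # stand-in for the control-to-state map
    u_true = np.zeros(60)
    u_true[[5, 17, 42]] = [2.0, -1.5, 1.0]  # a sparse, pulse-like control
    y = A @ u_true
    u = ista(A, y, lam=0.1)                 # recovers the sparse pulse pattern
    ```

    The soft-thresholding step is what drives most control amplitudes exactly to zero, leaving the pulse-shaped structure the abstract describes.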

  20. A new balanced modulation code for a phase-image-based holographic data storage system

    NASA Astrophysics Data System (ADS)

    John, Renu; Joseph, Joby; Singh, Kehar

    2005-08-01

    We propose a new balanced modulation code for coding data pages for phase-image-based holographic data storage systems. The new code addresses the coding subtleties associated with phase-based systems while performing a content-based search in a holographic database. The new code, which is a balanced modulation code, is a modification of the existing 8:12 modulation code, and removes the false hits that occur in phase-based content-addressable systems due to phase-pixel subtractions. We demonstrate the better performance of the new code using simulations and experiments in terms of discrimination ratio while content addressing through a holographic memory. The new code is compared with the conventional coding scheme to analyse the false hits due to subtraction of phase pixels.
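
    A balanced 8:12 modulation code maps each 8-bit data byte to a 12-pixel word containing exactly six ON pixels, so that every codeword carries the same total intensity. A quick count confirms there are more than enough balanced words for such a mapping (this checks only the capacity, not the specific codebook of the paper):

    ```python
    from itertools import combinations

    # All 12-pixel words with exactly six ON pixels (the balanced words)
    balanced = [frozenset(c) for c in combinations(range(12), 6)]
    n_balanced = len(balanced)      # C(12,6) = 924, comfortably >= 2**8 = 256
    ```

    Because 924 >= 256, a code can select a subset of balanced words chosen to avoid the phase-pixel-subtraction false hits discussed in the abstract.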

  1. 76 FR 4113 - Federal Procurement Data System Product Service Code Manual Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ... ADMINISTRATION Federal Procurement Data System Product Service Code Manual Update AGENCY: Office of... the Products and Services Code (PSC) Manual, which provides codes to describe products, services, and... pat.brooks@gsa.gov. SUPPLEMENTARY INFORMATION: The Products and Services Code (PSC) Manual...

  2. [Data coding in the Israeli healthcare system - do choices provide the answers to our system's needs?].

    PubMed

    Zelingher, Julian; Ash, Nachman

    2013-05-01

    The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors of the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding systems for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR.
We reviewed the characteristics, strengths and weaknesses of these two coding

  3. YALINA analytical benchmark analyses using the deterministic ERANOS code system.

    SciTech Connect

    Gohar, Y.; Aliberti, G.; Nuclear Engineering Division

    2009-08-31

    The growing stockpile of nuclear waste constitutes a severe challenge for mankind for more than a hundred thousand years. To reduce the radiotoxicity of the nuclear waste, the Accelerator Driven System (ADS) has been proposed. One of the most important issues of ADS technology is the choice of the appropriate neutron spectrum for the transmutation of Minor Actinides (MA) and Long Lived Fission Products (LLFP). This report presents the analytical analyses obtained with the deterministic ERANOS code system for the YALINA facility within: (a) the collaboration between Argonne National Laboratory (ANL) of the USA and the Joint Institute for Power and Nuclear Research (JIPNR) Sosny of Belarus; and (b) the IAEA coordinated research projects for accelerator driven systems (ADS). This activity is conducted as a part of the Russian Research Reactor Fuel Return (RRRFR) Program and the Global Threat Reduction Initiative (GTRI) of DOE/NNSA.

  4. System for Processing Coded OFDM Under Doppler and Fading

    NASA Technical Reports Server (NTRS)

    Tsou, Haiping; Darden, Scott; Lee, Dennis; Yan, Tsun-Yee

    2005-01-01

    An advanced communication system has been proposed for transmitting and receiving coded digital data conveyed as a form of quadrature amplitude modulation (QAM) on orthogonal frequency-division multiplexing (OFDM) signals in the presence of such adverse propagation-channel effects as large dynamic Doppler shifts and frequency-selective multipath fading. Such adverse channel effects are typical of data communications between mobile units or between mobile and stationary units (e.g., telemetric transmissions from aircraft to ground stations). The proposed system incorporates novel signal processing techniques intended to reduce the losses associated with adverse channel effects while maintaining compatibility with the high-speed physical layer specifications defined for wireless local area networks (LANs) as the standard 802.11a of the Institute of Electrical and Electronics Engineers (IEEE 802.11a). OFDM is a multi-carrier modulation technique that is widely used for wireless transmission of data in LANs and in metropolitan area networks (MANs). OFDM has been adopted in IEEE 802.11a and some other industry standards because it affords robust performance under frequency-selective fading. However, its intrinsic frequency-diversity feature is highly sensitive to synchronization errors; this sensitivity poses a challenge to preserve coherence between the component subcarriers of an OFDM system in order to avoid intercarrier interference in the presence of large dynamic Doppler shifts as well as frequency-selective fading. As a result, heretofore, the use of OFDM has been limited primarily to applications involving small or zero Doppler shifts. The proposed system includes a digital coherent OFDM communication system that would utilize enhanced 802.11a-compatible signal-processing algorithms to overcome effects of frequency-selective fading and large dynamic Doppler shifts. The overall transceiver design would implement a two-frequency-channel architecture (see figure
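
    OFDM's robustness to frequency-selective multipath rests on the cyclic prefix: as long as the prefix is longer than the channel impulse response, the channel reduces to one complex gain per subcarrier, removable with a one-tap equalizer. A minimal noise-free sketch with loosely 802.11a-like parameters (64 subcarriers, 16-sample prefix); the Doppler and synchronization issues the record addresses are omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, cp = 64, 16                                    # subcarriers, cyclic prefix
    bits = rng.integers(0, 2, size=(N, 2))
    symbols = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)  # QPSK

    tx = np.fft.ifft(symbols)                         # OFDM modulation
    tx_cp = np.concatenate([tx[-cp:], tx])            # prepend cyclic prefix

    h = np.array([0.8, 0.0, 0.5 - 0.3j])              # multipath, shorter than CP
    rx = np.convolve(tx_cp, h)[: cp + N]              # channel filtering

    rx_freq = np.fft.fft(rx[cp:])                     # strip CP, back to freq domain
    H = np.fft.fft(h, N)                              # per-subcarrier channel gains
    eq = rx_freq / H                                  # one-tap equalizer per carrier
    ```

    With the prefix stripped, linear convolution becomes circular, so the FFT diagonalizes the channel exactly and `eq` matches the transmitted symbols.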

  5. FPGA based digital phase-coding quantum key distribution system

    NASA Astrophysics Data System (ADS)

    Lu, XiaoMing; Zhang, LiJun; Wang, YongGang; Chen, Wei; Huang, DaJun; Li, Deng; Wang, Shuang; He, DeYong; Yin, ZhenQiang; Zhou, Yu; Hui, Cong; Han, ZhengFu

    2015-12-01

    Quantum key distribution (QKD) is a technology with the potential capability to achieve information-theoretic security. Phase coding is an important approach to developing practical QKD systems in fiber channels. In order to improve the phase-coding modulation rate, we proposed a new digital-modulation method in this paper and constructed a compact and robust prototype of a QKD system using currently available components in our lab to demonstrate the effectiveness of the method. The system was deployed in a laboratory environment over a 50 km fiber and operated continuously for 87 h without manual interaction. The quantum bit error rate (QBER) of the system was stable with an average value of 3.22% and the secure key generation rate was 8.91 kbps. Although the modulation rate of the photons in the demo system was only 200 MHz, which was limited by the Faraday-Michelson interferometer (FMI) structure, the proposed method and the field programmable gate array (FPGA) based electronics scheme have a great potential for high speed QKD systems with gigabits-per-second modulation rates.

  6. EMPIRE: Nuclear Reaction Model Code System for Data Evaluation

    NASA Astrophysics Data System (ADS)

    Herman, M.; Capote, R.; Carlson, B. V.; Obložinský, P.; Sin, M.; Trkov, A.; Wienke, H.; Zerkin, V.

    2007-12-01

    EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy-ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (∽ keV) and goes up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one or by either a pre-equilibrium exciton model with cluster emission (PCROSS) or by another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full featured Hauser-Feshbach model with γ-cascade and width-fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia and γ-ray strength functions. The results can be converted into ENDF-6 formatted files using the

  7. Overview of Particle and Heavy Ion Transport Code System PHITS

    NASA Astrophysics Data System (ADS)

    Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit

    2014-06-01

    A general purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS such as its source, executable and data-library files are assembled in one package and then distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS, and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.

  8. Video coding for next-generation surveillance systems

    NASA Astrophysics Data System (ADS)

    Klasen, Lena M.; Fahlander, Olov

    1997-02-01

    Video is used as the recording medium in surveillance systems and, increasingly, by the Swedish Police Force. Methods for analyzing video using an image processing system have recently been introduced at the Swedish National Laboratory of Forensic Science, and new methods are the focus of a research project at Linkoping University, Image Coding Group. The accuracy of the results of those forensic investigations often depends on the quality of the video recordings, and one of the major problems when analyzing videos from crime scenes is the poor quality of the recordings. Enhancing poor image quality might add manipulative or subjective effects and does not seem to be the right way to obtain reliable analysis results. The surveillance systems in use today are mainly based on video techniques, VHS or S-VHS, and the weakest link is the video cassette recorder (VCR). Multiplexers that select one of many camera outputs for recording are another problem, as they often filter the video signal, and recording is limited to only one of the available cameras connected to the VCR. A way to get around the problem of poor recording is to simultaneously record all camera outputs digitally. It is also very important to build such a system bearing in mind that image processing analysis methods become more important as a complement to the human eye. Using one or more cameras gives a large amount of data, and the need for data compression is more than obvious. Crime scenes often involve persons or moving objects, and the available coding techniques are more or less useful. Our goal is to propose a possible system, being the best compromise with respect to what needs to be recorded, movements in the recorded scene, loss of information, resolution, etc., to secure the efficient recording of the crime and enable forensic analysis. The preventive effect of having a well-functioning surveillance system and well-established image analysis methods is not to be neglected. Aspects of

  9. Charged and neutral particle transport methods and applications: The CALOR code system

    SciTech Connect

    Gabriel, T.A.; Charlton, L.A.

    1997-04-01

    The CALOR code system, which is a complete radiation transport code system, is described with emphasis on the high-energy (> 20 MeV) nuclear collision models. Codes similar to CALOR are also briefly discussed. A current application using CALOR which deals with the development of the National Spallation Neutron Source is also given.

  10. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a)...

  11. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a)...

  12. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a)...

  13. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a)...

  14. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a)...

  15. Verification of ARES transport code system with TAKEDA benchmarks

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Zhang, Bin; Zhang, Penghe; Chen, Mengteng; Zhao, Jingchang; Zhang, Shun; Chen, Yixue

    2015-10-01

    Neutron transport modeling and simulation are central to many areas of nuclear technology, including reactor core analysis, radiation shielding and radiation detection. In this paper the series of TAKEDA benchmarks is modeled to verify the criticality calculation capability of ARES, a discrete ordinates neutral particle transport code system. The SALOME platform is coupled with ARES to provide geometry modeling and mesh generation functions. The Koch-Baker-Alcouffe parallel sweep algorithm is applied to accelerate the traditional transport calculation process. The results show that the eigenvalues calculated by ARES are in excellent agreement with the reference values presented in NEACRP-L-330, with a difference of less than 30 pcm except for the first case of model 3. Additionally, ARES provides accurate flux distributions compared to reference values, with a deviation of less than 2% for region-averaged fluxes in all cases. All of these results confirm the feasibility of the ARES-SALOME coupling and demonstrate that ARES performs well in criticality calculations.
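The 30 pcm criterion quoted above is simply a scaled eigenvalue difference; a minimal sketch of the comparison (names are illustrative, not taken from the ARES code):

```python
def pcm_difference(k_calc, k_ref):
    """Eigenvalue discrepancy in pcm (per cent mille; 1 pcm = 1e-5 in k)."""
    return (k_calc - k_ref) * 1.0e5

# A 25 pcm overestimate relative to a reference k_eff of 1.0:
delta = pcm_difference(1.00025, 1.00000)
```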

  16. Hybrid Compton camera/coded aperture imaging system

    DOEpatents

    Mihailescu, Lucian [Livermore, CA; Vetter, Kai M [Alameda, CA

    2012-04-10

    A system in one embodiment includes an array of radiation detectors; and an array of imagers positioned behind the array of detectors relative to an expected trajectory of incoming radiation. A method in another embodiment includes detecting incoming radiation with an array of radiation detectors; detecting the incoming radiation with an array of imagers positioned behind the array of detectors relative to a trajectory of the incoming radiation; and performing at least one of Compton imaging using at least the imagers and coded aperture imaging using at least the imagers. A method in yet another embodiment includes detecting incoming radiation with an array of imagers positioned behind an array of detectors relative to a trajectory of the incoming radiation; and performing Compton imaging using at least the imagers.

  17. Biometric iris image acquisition system with wavefront coding technology

    NASA Astrophysics Data System (ADS)

    Hsieh, Sheng-Hsun; Yang, Hsi-Wen; Huang, Shao-Hung; Li, Yung-Hui; Tien, Chung-Hao

    2013-09-01

    Biometric signatures for identity recognition have been practiced for centuries. Basically, the personal attributes used for a biometric identification system can be classified into two areas: one is based on physiological attributes, such as DNA, facial features, retinal vasculature, fingerprint, hand geometry and iris texture; the other depends on individual behavioral attributes, such as signature, keystroke, voice and gait style. Among these features, iris recognition is one of the most attractive approaches due to its nature of randomness, texture stability over a lifetime, high entropy density and non-invasive acquisition. While the performance of iris recognition on high quality images is well investigated, few studies have addressed how iris recognition performs on non-ideal image data, especially when the data are acquired under challenging conditions, such as long working distance, dynamic movement of subjects, uncontrolled illumination conditions and so on. There are three main contributions in this paper. First, the optical system parameters, such as magnification and field of view, were optimally designed through first-order optics. Second, the irradiance constraints were derived by the optical conservation theorem. Through the relationship between the subject and the detector, we could estimate the limitation on working distance when the camera lens and CCD sensor were known. The working distance is set to 3 m in our system, with a pupil diameter of 86 mm and a CCD irradiance of 0.3 mW/cm². Finally, we employed a hybrid scheme combining eye tracking with a pan and tilt system, wavefront coding technology, filter optimization and post signal recognition to implement a robust iris recognition system in dynamic operation. The blurred image was restored to ensure recognition accuracy over a 3 m working distance with 400 mm focal length and aperture F/6.3 optics. The simulation results as well as experiments validate the proposed code
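The first-order design step mentioned above (magnification from focal length and working distance) can be sketched with a thin-lens approximation; the function and numbers below are illustrative, not the authors' actual design calculation:

```python
def thin_lens_magnification(object_distance_m, focal_length_m):
    """First-order magnification |m| = f / (d - f) for an object located
    well outside the focal length (thin-lens approximation)."""
    return focal_length_m / (object_distance_m - focal_length_m)

# With the paper's 3 m working distance and 400 mm focal length:
m = thin_lens_magnification(3.0, 0.4)  # roughly 0.15
```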

  18. RFSYS: an inventory code for RF system parameters

    SciTech Connect

    Treadwell, E.A.

    1983-03-01

    RFSYS is a program which maintains an inventory of rf system parameters associated with the 200 MeV Linear Accelerator at Fermi National Accelerator Laboratory. The program, written by Elliott Treadwell of the Linac group, offers five modes of operation: (1) Allocates memory space for additional rf systems (data arrays). (2) Prints a total or partial list of old tube parameters on an ADM-3 terminal. (3) Changes tube data stored in the master array. If the number of systems increases, this mode permits the user to enter new data. (4) Computes the average time of operation for a given tube and system. (5) Stops program execution. There is an exit option: (a) create one output data file, or (b) create three output files, one of which contains column headers and coded comments. All output files are stored on the CYBER-175 disc, and eventually on high density (6250 B.P.I.) magnetic tapes. This arrangement eliminates the necessity for online data buffers.

  19. A simple model of optimal population coding for sensory systems.

    PubMed

    Doi, Eizaburo; Lewicki, Michael S

    2014-08-01

    A fundamental task of a sensory system is to infer information about the environment. It has long been suggested that an important goal of the first stage of this process is to encode the raw sensory signal efficiently by reducing its redundancy in the neural representation. Some redundancy, however, would be expected because it can provide robustness to noise inherent in the system. Encoding the raw sensory signal itself is also problematic, because it contains distortion and noise. The optimal solution would be constrained further by limited biological resources. Here, we analyze a simple theoretical model that incorporates these key aspects of sensory coding, and apply it to conditions in the retina. The model specifies the optimal way to incorporate redundancy in a population of noisy neurons, while also optimally compensating for sensory distortion and noise. Importantly, it allows an arbitrary input-to-output cell ratio between sensory units (photoreceptors) and encoding units (retinal ganglion cells), providing predictions of retinal codes at different eccentricities. Compared to earlier models based on redundancy reduction, the proposed model conveys more information about the original signal. Interestingly, redundancy reduction can be near-optimal when the number of encoding units is limited, such as in the peripheral retina. We show that there exist multiple, equally optimal solutions whose receptive field structure and organization vary significantly. Among these, the one which maximizes the spatial locality of the computation, but not the sparsity of either synaptic weights or neural responses, is consistent with known basic properties of retinal receptive fields. The model further predicts that receptive field structure changes less with light adaptation at higher input-to-output cell ratios, such as in the periphery.

  20. Comparison of PSF maxima and minima of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems

    NASA Astrophysics Data System (ADS)

    Ratnam, Challa; Lakshmana Rao, Vadlamudi; Lachaa Goud, Sivagouni

    2006-10-01

    In the present paper, and in a series of papers to follow, the Fourier analytical properties of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems are investigated. First, the transmission function for MACA and CMACA is derived using Fourier methods and, based on the Fresnel-Kirchhoff diffraction theory, the formulae for the point spread function (PSF) are derived. The PSF maxima and minima are calculated for both the MACA and CMACA systems. The dependence of these properties on the number of zones is studied and reported in this paper.

  1. Performance Analysis of a CDMA VSAT System With Convolutional and Reed-Solomon Coding

    DTIC Science & Technology

    2002-09-01

    Error Correction (FEC), Walsh codes and PN sequences are used to generate a CDMA system, and FEC is used to further improve the performance. Convolutional and block coding methods are examined and the results are obtained for each different case, including concatenated use of the codes. The performance of the system is given in terms of Bit Error Rate (BER). As observed from the results, the performance is mainly affected by the number of users and the code
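Walsh codes like those mentioned above are commonly generated as the rows of a Sylvester-Hadamard matrix; a minimal sketch (illustrative, not taken from the report):

```python
def walsh_codes(order):
    """Rows of a 2**order x 2**order Sylvester-Hadamard matrix (entries
    +1/-1). Any two distinct rows are orthogonal, which is what lets a
    synchronous CDMA receiver separate users by correlation."""
    H = [[1]]
    for _ in range(order):
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H
```

Orthogonality check: the dot product of any two distinct rows is zero, while a row correlated with itself gives the code length.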

  2. Variable weight Khazani-Syed code using hybrid fixed-dynamic technique for optical code division multiple access system

    NASA Astrophysics Data System (ADS)

    Anas, Siti Barirah Ahmad; Seyedzadeh, Saleh; Mokhtar, Makhfudzah; Sahbudin, Ratna Kalos Zakiah

    2016-10-01

    Future Internet consists of a wide spectrum of applications with different bit rates and quality of service (QoS) requirements. Prioritizing the services is essential to ensure that the delivery of information is at its best. Existing technologies have demonstrated how service differentiation techniques can be implemented in optical networks using data link and network layer operations. However, a physical layer approach can further improve system performance at a prescribed received signal quality by applying control at the bit level. This paper proposes a coding algorithm to support optical domain service differentiation using spectral amplitude coding techniques within an optical code division multiple access (OCDMA) scenario. A particular user or service has a varying weight applied to obtain the desired signal quality. The properties of the new code are compared with other OCDMA codes proposed for service differentiation. In addition, a mathematical model is developed for performance evaluation of the proposed code using two different detection techniques, namely direct decoding and complementary subtraction.

  3. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    SciTech Connect

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-07-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options, as well as combinations of technical options, beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. These concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; applying the concepts of the System Based Code to design will lead to an entire change of the practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design, both in terms of reliability and cost effectiveness, by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  4. Time-Dependent, Parallel Neutral Particle Transport Code System.

    SciTech Connect

    BAKER, RANDAL S.

    2009-09-10

    Version 00 PARTISN (PARallel, TIme-Dependent SN) is the evolutionary successor to the CCC-547/DANTSYS code system package. The PARTISN code package is a modular computer program package designed to solve the time-independent or time-dependent multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post-processing (or edit) functions into distinct code modules: the Input Module, the Solver Module, and the Edit Module, respectively. The Input and Edit Modules in PARTISN are very similar to those in DANTSYS. However, unlike DANTSYS, the Solver Module in PARTISN contains one-, two-, and three-dimensional solvers in a single module. In addition to the diamond-differencing method, the Solver Module also has Adaptive Weighted Diamond-Differencing (AWDD), Linear Discontinuous (LD), and Exponential Discontinuous (ED) spatial differencing methods. The spatial mesh may consist of either a standard orthogonal mesh or a block-adaptive orthogonal mesh. The Solver Module may be run in parallel for two- and three-dimensional problems. One can now run 1-D problems in parallel using Energy Domain Decomposition (triggered by Block 5 input keyword npeg>0). EDD can also be used in 2-D/3-D with or without our standard Spatial Domain Decomposition. Both the static (fixed source or eigenvalue) and time-dependent forms of the transport equation are solved in forward or adjoint mode. In addition, PARTISN now has a probabilistic mode for Probability of Initiation (static) and Probability of Survival (dynamic) calculations. Vacuum, reflective, periodic, white, or inhomogeneous boundary conditions are solved. General anisotropic scattering and inhomogeneous sources are permitted. 
PARTISN solves the transport equation on orthogonal (single level or block-structured AMR) grids in 1-D (slab, two
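The diamond-differencing scheme named above can be illustrated for a single discrete direction in a 1-D slab; a simplified sketch under assumed notation, not PARTISN's actual solver:

```python
def diamond_difference_sweep(psi_in, mu, dx, sigma_t, source):
    """Sweep one discrete direction (mu > 0) across a uniform 1-D slab.
    Cell balance:   mu*(psi_out - psi_in)/dx + sigma_t*psi_mid = q,
    with the diamond closure  psi_mid = (psi_in + psi_out) / 2."""
    psi_mid = []
    for q in source:
        a = mu / dx
        psi_out = (q + (a - 0.5 * sigma_t) * psi_in) / (a + 0.5 * sigma_t)
        psi_mid.append(0.5 * (psi_in + psi_out))
        psi_in = psi_out  # outflow of this cell feeds the next cell
    return psi_mid

# Pure attenuation (no source): the angular flux decays monotonically.
flux = diamond_difference_sweep(1.0, mu=1.0, dx=0.1, sigma_t=0.5, source=[0.0] * 10)
```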

  5. Web- and system-code based, interactive, nuclear power plant simulators

    SciTech Connect

    Kim, K. D.; Jain, P.; Rizwan, U.

    2006-07-01

    Using two different approaches, on-line, web- and system-code based graphical user interfaces have been developed for reactor system analysis. Both are LabVIEW (graphical programming language developed by National Instruments) based systems that allow local users as well as those at remote sites to run, interact and view the results of the system code in a web browser. In the first approach, only the data written by the system code in a tab separated ASCII output file is accessed and displayed graphically. In the second approach, LabVIEW virtual instruments are coupled with the system code as dynamic link libraries (DLL). RELAP5 is used as the system code to demonstrate the capabilities of these approaches. From collaborative projects between teams in geographically remote locations to providing system code experience to distance education students, these tools can be very beneficial in many areas of teaching and R and D. (authors)

  6. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    PubMed

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
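Since the tags described above simply encode a link to the QR Code Identity website, the payload can be sketched as a URL; the base URL and field name below are hypothetical, as the abstract does not give the real ones:

```python
from urllib.parse import urlencode

BASE_URL = "https://example.org/qr-identity"  # placeholder, not the real site

def identity_tag_payload(member_id):
    """URL a QR Code Identity Tag could encode; scanning it opens the
    member's record page, with access level enforced server-side so that
    authorized personnel see more detail than the average smartphone user."""
    return f"{BASE_URL}?{urlencode({'id': member_id})}"
```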

  7. Multiparticle Monte Carlo Code System for Shielding and Criticality Use.

    SciTech Connect

    2015-06-01

    Version 00 COG is a modern, full-featured Monte Carlo radiation transport code that provides accurate answers to complex shielding, criticality, and activation problems.COG was written to be state-of-the-art and free of physics approximations and compromises found in earlier codes. COG is fully 3-D, uses point-wise cross sections and exact angular scattering, and allows a full range of biasing options to speed up solutions for deep penetration problems. Additionally, a criticality option is available for computing Keff for assemblies of fissile materials. ENDL or ENDFB cross section libraries may be used. COG home page: http://cog.llnl.gov. Cross section libraries are included in the package. COG can use either the LLNL ENDL-90 cross section set or the ENDFB/VI set. Analytic surfaces are used to describe geometric boundaries. Parts (volumes) are described by a method of Constructive Solid Geometry. Surface types include surfaces of up to fourth order, and pseudo-surfaces such as boxes, finite cylinders, and figures of revolution. Repeated assemblies need be defined only once. Parts are visualized in cross-section and perspective picture views. A lattice feature simplifies the specification of regular arrays of parts. Parallel processing under MPI is supported for multi-CPU systems. Source and random-walk biasing techniques may be selected to improve solution statistics. These include source angular biasing, importance weighting, particle splitting and Russian roulette, pathlength stretching, point detectors, scattered direction biasing, and forced collisions. Criticality – For a fissioning system, COG will compute Keff by transporting batches of neutrons through the system. Activation – COG can compute gamma-ray doses due to neutron-activated materials, starting with just a neutron source. Coupled Problems – COG can solve coupled problems involving neutrons, photons, and electrons. COG 11.1 is an updated version of COG11.1 BETA 2 (RSICC C00777MNYCP02). New
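Of the variance-reduction options listed above, Russian roulette is the simplest to sketch; an illustrative implementation (not COG's), whose key property is that the expected particle weight is preserved:

```python
import random

def russian_roulette(weight, cutoff=0.1, survival_weight=0.5, rng=random):
    """Probabilistically terminate low-weight particles. A particle below
    `cutoff` survives with probability weight/survival_weight and is boosted
    to `survival_weight`; otherwise it is killed. Expected weight is
    unchanged: (weight/survival_weight) * survival_weight == weight."""
    if weight >= cutoff:
        return weight            # heavy enough, no roulette
    if rng.random() < weight / survival_weight:
        return survival_weight   # survivor continues with boosted weight
    return 0.0                   # terminated
```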

  8. Cause-consequence analysis of a generic Space Station computer system

    NASA Astrophysics Data System (ADS)

    Pauperas, John

    This paper reviews the application of a cause-consequence analysis technique to summarize the safety concerns and proposed safeguards for a generic Space Station computer system. The cause-consequence diagram included in this paper presents a summary of causal factors for the initiating event. The diagram also identifies the inherent safety features of the computer system, both hardware and software, that preclude unwanted command and control functions. Additional safeguards needed to prevent or minimize the occurrence of the noted safety critical hazards are also shown in the event tree portion of the diagram. A complex safety analysis of a computer system application is summarized on a single page for management review.
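The event-tree portion of a cause-consequence diagram propagates probabilities along branches; a toy sketch with made-up numbers, not the paper's actual analysis:

```python
def unmitigated_probability(p_initiator, safeguard_success_probs):
    """Probability of the worst event-tree branch: the initiating event
    occurs AND every safeguard fails, assuming independent branches."""
    p = p_initiator
    for p_ok in safeguard_success_probs:
        p *= (1.0 - p_ok)
    return p

# Initiator at 1e-2 per mission, two safeguards at 90% and 99% reliability:
p_worst = unmitigated_probability(1.0e-2, [0.90, 0.99])
```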

  9. Prototype demonstration of radiation therapy planning code system

    SciTech Connect

    Little, R.C.; Adams, K.J.; Estes, G.P.; Hughes, L.S. III; Waters, L.S.

    1996-09-01

    This is the final report of a one-year, Laboratory-Directed Research and Development project at the Los Alamos National Laboratory (LANL). Radiation therapy planning is the process by which a radiation oncologist plans a treatment protocol for a patient preparing to undergo radiation therapy. The objective is to develop a protocol that delivers sufficient radiation dose to the entire tumor volume, while minimizing dose to healthy tissue. Radiation therapy planning, as currently practiced in the field, suffers from inaccuracies made in modeling patient anatomy and radiation transport. This project investigated the ability to automatically model patient-specific, three-dimensional (3-D) geometries in advanced Los Alamos radiation transport codes (such as MCNP), and to efficiently generate accurate radiation dose profiles in these geometries via sophisticated physics modeling. Modern scientific visualization techniques were utilized. The long-term goal is that such a system could be used by a non-expert in a distributed computing environment to help plan the treatment protocol for any candidate radiation source. The improved accuracy offered by such a system promises increased efficacy and reduced costs for this important aspect of health care.

  10. A dual-sided coded-aperture radiation detection system

    NASA Astrophysics Data System (ADS)

    Penny, R. D.; Hood, W. E.; Polichar, R. M.; Cardone, F. H.; Chavez, L. G.; Grubbs, S. G.; Huntley, B. P.; Kuharski, R. A.; Shyffer, R. T.; Fabris, L.; Ziock, K. P.; Labov, S. E.; Nelson, K.

    2011-10-01

    We report the development of a large-area, mobile, coded-aperture radiation imaging system for localizing compact radioactive sources in three dimensions while rejecting distributed background. The 3D Stand-Off Radiation Detection System (SORDS-3D) has been tested at speeds up to 95 km/h and has detected and located sources in the millicurie range at distances of over 100 m. Radiation data are imaged to a geospatially mapped world grid with a nominal 1.25- to 2.5-m pixel pitch at distances out to 120 m on either side of the platform. Source elevation is also extracted. Imaged radiation alarms are superimposed on a side-facing video log that can be played back for direct localization of sources in buildings in urban environments. The system utilizes a 37-element array of 5×5×50 cm³ cesium iodide (sodium) detectors. Scintillation light is collected by a pair of photomultiplier tubes placed at either end of each detector, with the detectors achieving an energy resolution of 6.15% FWHM (662 keV) and a position resolution along their length of 5 cm FWHM. The imaging system generates a dual-sided two-dimensional image allowing users to efficiently survey a large area. Imaged radiation data and raw spectra are forwarded to the RadioNuclide Analysis Kit (RNAK), developed by our collaborators, for isotope ID. An intuitive real-time display aids users in performing searches. Detector calibration is dynamically maintained by monitoring the potassium-40 peak and digitally adjusting individual detector gains. We have recently realized improvements, both in isotope identification and in distinguishing compact sources from background, through the installation of optimal-filter reconstruction kernels.
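The K-40 gain stabilization described above amounts to a feedback loop on each detector's digital gain; a hedged sketch in which the update rule and damping factor are illustrative assumptions, not the SORDS-3D implementation:

```python
K40_LINE_KEV = 1460.8  # potassium-40 gamma line used as the reference

def update_gain(gain, measured_peak_kev, damping=0.2):
    """Move a detector's digital gain a fraction `damping` of the way
    toward placing its measured K-40 peak at the true line energy."""
    target_ratio = K40_LINE_KEV / measured_peak_kev
    return gain * (1.0 + damping * (target_ratio - 1.0))
```

Iterating this update converges the reconstructed peak to 1460.8 keV, provided the measured peak position scales linearly with the applied gain.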

  11. The Unintended Consequences of the Adoption of Electronic Medical Record Systems on Healthcare Costs

    ERIC Educational Resources Information Center

    Ganju, Kartik K.

    2016-01-01

    In my dissertation, I study unintended consequences of the adoption of EMR systems. In my three essays, I examine how the adoption of EMR systems affects neighboring hospitals (spillover effects), can be used by hospitals to further its objectives in an unconventional manner ("upcoding" of patient case mix data), and how EMR adoption may…

  12. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  13. Environmental performance of green building code and certification systems.

    PubMed

    Suh, Sangwon; Tomar, Shivira; Leighton, Matthew; Kneifel, Joshua

    2014-01-01

    We examined the potential life-cycle environmental impact reduction of three green building code and certification (GBCC) systems: LEED, ASHRAE 189.1, and IgCC. A recently completed whole-building life cycle assessment (LCA) database of NIST was applied to a prototype building model specification by NREL. TRACI 2.0 of EPA was used for life cycle impact assessment (LCIA). The results showed that the baseline building model generates about 18 thousand metric tons of CO2-equivalent greenhouse gases (GHGs) and consumes 6 terajoules (TJ) of primary energy and 328 million liters of water over its life cycle. Overall, GBCC-compliant building models generated 0% to 25% less environmental impact than the baseline case (average 14% reduction). The largest reductions were associated with acidification (25%), human health-respiratory (24%), and global warming (GW) (22%), while no reductions were observed for ozone layer depletion (OD) and land use (LU). The performances of the three GBCC-compliant building models measured in life-cycle impact reduction were comparable. A sensitivity analysis showed that the comparative results were reasonably robust, although some results were relatively sensitive to the behavioral parameters, including employee transportation and purchased electricity during the occupancy phase (average sensitivity coefficients 0.26-0.29).

  14. Error correction coding for frequency-hopping multiple-access spread spectrum communication systems

    NASA Technical Reports Server (NTRS)

    Healy, T. J.

    1982-01-01

    A communication system which would effect channel coding for frequency-hopped multiple-access is described. It is shown that in theory coding can increase the spectrum utilization efficiency of a system with mutual interference to 100 percent. Various coding strategies are discussed and some initial comparisons are given. Some of the problems associated with implementing the type of system described here are discussed.

  15. TFaNS Tone Fan Noise Design/Prediction System. Volume 3; Evaluation of System Codes

    NASA Technical Reports Server (NTRS)

    Topol, David A.

    1999-01-01

    TFANS is the Tone Fan Noise Design/Prediction System developed by Pratt & Whitney under contract to NASA Lewis (presently NASA Glenn). The purpose of this system is to predict tone noise emanating from a fan stage, including the effects of reflection and transmission by the rotor and stator and by the duct inlet and nozzle. These effects have been added to an existing annular duct/isolated stator noise prediction capability. TFANS consists of three parts: the codes that compute the acoustic properties (reflection and transmission coefficients) of the various elements and write them to files; CUP3D, the Fan Noise Coupling Code, which reads these files, solves the coupling problem, and outputs the desired noise predictions; and AWAKEN, the CFD/Measured Wake Postprocessor, which reformats CFD wake predictions and/or measured wake data so they can be used by the system. This volume of the report evaluates TFANS against full-scale and ADP 22" rig data using the semi-empirical wake modelling in the system. This report is divided into three volumes: Volume I: System Description, CUP3D Technical Documentation, and Manual for Code Developers; Volume II: User's Manual, TFANS Version 1.4; Volume III: Evaluation of System Codes.

  16. Channel coding and data compression system considerations for efficient communication of planetary imaging data

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1974-01-01

    End-to-end system considerations involving channel coding and data compression which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft are presented.

  17. Modifying real convolutional codes for protecting digital filtering systems

    NASA Technical Reports Server (NTRS)

    Redinbo, G. R.; Zagar, Bernhard

    1993-01-01

    A novel method is proposed for protecting digital filters from temporary and permanent failures that are not easily detected by conventional fault-tolerant computer design principles, on the basis of the error-detecting properties of real convolutional codes. Erroneous behavior is detected by externally comparing the calculated and regenerated parity samples. Great simplifications are obtainable by modifying the code structure to yield simplified parity channels with finite impulse response structures. A matrix equation involving the original parity values of the code and the polynomial of the digital filter's transfer function is formed, and row manipulations separate this equation into a set of homogeneous equations constraining the modifying scaling coefficients and another set which defines the code parity values' implementation.

  18. Analysis of the KUCA MEU experiments using the ANL code system

    SciTech Connect

    Shiroya, S.; Hayashi, M.; Kanda, K.; Shibata, T.; Woodruff, W.L.; Matos, J.E.

    1982-01-01

    This paper provides some preliminary results on the analysis of the KUCA critical experiments using the ANL code system. Since this system was employed in the earlier neutronics calculations for the KUHFR, it is important to assess its capabilities for the KUHFR. The KUHFR has a unique core configuration which is difficult to model precisely with current diffusion theory codes. This paper also provides some results from a finite-element diffusion code (2D-FEM-KUR), which was developed in a cooperative research program between KURRI and JAERI. This code provides the capability to mock up a complex core configuration such as that of the KUHFR. Using the same group constants generated by the EPRI-CELL code, the results of the 2D-FEM-KUR code are compared with those of the finite-difference diffusion code DIF3D (2D), which is mainly employed in this analysis.

  19. Three narrative-based coding systems: Innovative moments, ambivalence and ambivalence resolution.

    PubMed

    Gonçalves, Miguel M; Ribeiro, António P; Mendes, Inês; Alves, Daniela; Silva, Joana; Rosa, Catarina; Braga, Cátia; Batista, João; Fernández-Navarro, Pablo; Oliveira, João Tiago

    2016-11-18

    Narrative and dialogical perspectives suggest that personal meaning systems' flexibility is an important resource for change in psychotherapy. Drawing on these theoretical backgrounds, a research program focused on the identification of Innovative Moments (IMs)-exceptions to the inflexible meaning systems present in psychopathological suffering-has been carried out. For this purpose, three process-oriented coding systems were developed: The IMs Coding System, the Ambivalence Coding System, and the Ambivalence Resolution Coding System. They allow, respectively, for the study of change, ambivalence, and ambivalence resolution in therapy. This paper presents these coding systems, the main findings that resulted from their application to different samples and therapeutic models, the main current and future lines of research, as well as the clinical applications of this research program.

  20. Optical CDMA system using 2-D run-length limited code

    NASA Astrophysics Data System (ADS)

    Liu, Maw-Yang; Jiang, Joe-Air

    2010-10-01

    In this paper, a time-spreading wavelength-hopping optical CDMA system using a 2-D run-length limited code is investigated. The run-length limited code used here is predicated upon a spatial coding scheme, which can improve system performance significantly. In our proposed system, we employ the carrier-hopping prime code and its shifted versions as signature sequences. Based on the zero auto-correlation sidelobe property of the signature sequences, we propose a two-state trellis coding architecture that utilizes a 2-D parallel detection scheme. The proposed scheme is compact and simple and can be applied to more complicated trellises to further enhance system performance. Multiple access interference is the main deterioration factor in an optical CDMA system, affecting performance adversely. Aside from multiple access interference, other impairments are also taken into consideration, including thermal noise, shot noise, relative intensity noise, and beat noise.
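The carrier-hopping prime codes mentioned above can be illustrated with a short sketch. The construction below, assuming the common form w_i(t) = (i*t) mod p for a prime p, is a generic illustration rather than the exact code family used in the paper; it demonstrates the low cross-correlation property that limits multiple access interference:

```python
# Prime-hop sequences: user i transmits on wavelength w_i(t) in time slot t.
# Assumed construction: w_i(t) = (i * t) mod p, p prime (illustrative only).
p = 7  # a prime; yields p - 1 hopping sequences of length p
seqs = {i: [(i * t) % p for t in range(p)] for i in range(1, p)}

def coincidences(a, b):
    """Number of time slots in which two hopping patterns collide."""
    return sum(1 for x, y in zip(a, b) if x == y)

# Any two distinct sequences collide in at most one slot, bounding the
# multiple access interference between any pair of users.
worst = max(coincidences(seqs[i], seqs[j])
            for i in seqs for j in seqs if i != j)
```

Since (i - j)*t ≡ 0 (mod p) only at t = 0 when p is prime and i ≠ j, the worst-case coincidence count here is exactly one.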

  1. Development of A Monte Carlo Radiation Transport Code System For HEDS: Status Update

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Gabriel, Tony A.; Miller, Thomas M.

    2003-01-01

    Modifications of the Monte Carlo radiation transport code HETC are underway to extend the code to include transport of energetic heavy ions, such as are found in the galactic cosmic ray spectrum in space. The new HETC code will be available for use in radiation shielding applications associated with missions, such as the proposed manned mission to Mars. In this work the current status of code modification is described. Methods used to develop the required nuclear reaction models, including total, elastic and nuclear breakup processes, and their associated databases are also presented. Finally, plans for future work on the extended HETC code system and for its validation are described.

  2. Development of System Based Code: Methodologies for Life-Cycle Margin Evaluation

    SciTech Connect

    Masaki Morishita; Tai Asayama; Masanori Tashimo

    2006-07-01

    The late Professor Emeritus Yasuhide Asada proposed the System Based Code concept, which aims to optimize the design of nuclear plants through margin exchange among a variety of technical options that are not allowed by current codes and standards. The key technology of the System Based Code is the margin exchange evaluation methodology. This paper describes recent progress on margin exchange methodologies in Japan. (authors)

  3. Improving the performance of BICM-ID and MLC systems with different FEC codes

    NASA Astrophysics Data System (ADS)

    Arafa, T.; Sauer-Greff, W.; Urbansky, R.

    2013-07-01

    In bandwidth limited communication systems, the high data rate transmission with performance close to capacity limits is achieved by applying multilevel modulation schemes in association with powerful forward error correction (FEC) coding, i.e. coded modulation systems. The most important practical approaches to coded modulation systems are multilevel coding with multistage decoding (MLC/MSD) and bit interleaved coded modulation with iterative demapping and decoding (BICM-ID). Multilevel modulation formats such as M-QAM, which can be used as a part of coded modulation systems, have the capability of multilevel protection. Based on this fact, we investigate the methods to improve the performance of BICM-ID using multiple interleavers with different binary channel coding schemes such as convolutional codes, turbo codes and low-density parity-check (LDPC) codes. Moreover, an MLC system with parallel decoding on levels (PDL) at the receiver is considered. In our contribution, we propose to design the individual coding schemes using the extrinsic information transfer (EXIT) charts for individual bit levels in the constellation. Our simulation results show that the BICM-ID systems, taking into account different bit-level protections, can provide an improvement of 0.65 dB, 1.2 dB and 1.5 dB for 256-QAM with turbo, LDPC and convolutional codes, respectively. On the other hand, MLC systems with PDL designed using EXIT charts for individual bit levels can slightly improve the performance and eliminate the error floor compared to the systems with MSD.

  4. Channel coding for underwater acoustic single-carrier CDMA communication system

    NASA Astrophysics Data System (ADS)

    Liu, Lanjun; Zhang, Yonglei; Zhang, Pengcheng; Zhou, Lin; Niu, Jiong

    2017-01-01

    CDMA is an effective multiple access protocol for underwater acoustic networks, and channel coding can effectively reduce the bit error rate (BER) of an underwater acoustic communication system. To meet the requirements of underwater acoustic mobile networks based on CDMA, an underwater acoustic single-carrier CDMA communication system (UWA/SCCDMA) based on the direct-sequence spread spectrum is proposed, and its channel coding scheme is studied based on convolutional, RA, Turbo and LDPC coding, respectively. The implementation steps of the Viterbi algorithm for convolutional coding, the BP and minimum-sum algorithms for RA coding, the Log-MAP and SOVA algorithms for Turbo coding, and the sum-product algorithm for LDPC coding are given. A UWA/SCCDMA simulation system based on Matlab is designed. Simulation results show that the UWA/SCCDMA systems based on RA, Turbo and LDPC coding perform well: the communication BER is less than 10^-6 in an underwater acoustic channel with a low signal-to-noise ratio (SNR) from -12 dB to -10 dB, about two orders of magnitude lower than that of convolutional coding. The system based on Turbo coding with the Log-MAP algorithm has the best performance.
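As a concrete reference point for the convolutional option above, here is a minimal rate-1/2, constraint-length-3 encoder with the textbook (7, 5) octal generators; this is a generic sketch, not the specific code parameters used in the UWA/SCCDMA study:

```python
def conv_encode(bits, tail=True):
    """Rate-1/2 convolutional encoder, constraint length 3,
    generators (7, 5) octal: g1 = 1 + D + D^2, g2 = 1 + D^2."""
    s1 = s2 = 0  # shift-register contents (the previous two input bits)
    if tail:
        bits = bits + [0, 0]  # flush the register back to the all-zero state
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # tap pattern 111 (octal 7)
        out.append(b ^ s2)       # tap pattern 101 (octal 5)
        s1, s2 = b, s1
    return out
```

For example, conv_encode([1, 0, 1, 1]) produces the classic output pairs 11 10 00 01 01 11; a Viterbi decoder would recover the input by maximum-likelihood search over the same trellis.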

  5. Reliability of ICD-10 external cause of death codes in the National Coroners Information System.

    PubMed

    Bugeja, Lyndal; Clapperton, Angela J; Killian, Jessica J; Stephan, Karen L; Ozanne-Smith, Joan

    2010-01-01

    Availability of ICD-10 cause of death codes in the National Coroners Information System (NCIS) strengthens its value as a public health surveillance tool. This study quantified the completeness of external cause ICD-10 codes in the NCIS for Victorian deaths (as assigned by the Australian Bureau of Statistics (ABS) in the yearly Cause of Death data). It also examined the concordance between external cause ICD-10 codes contained in the NCIS and a re-code of the same deaths conducted by an independent coder. Of 7,400 NCIS external cause deaths included in this study, 961 (13.0%) did not contain an ABS assigned ICD-10 code and 225 (3.0%) contained only a natural cause code. Where an ABS assigned external cause ICD-10 code was present (n=6,214), 4,397 (70.8%) matched exactly with the independently assigned ICD-10 code. Coding disparity primarily related to differences in assignment of intent and specificity. However, in a small number of deaths (n=49, 0.8%) there was coding disparity for both intent and external cause category. NCIS users should be aware of the limitations of relying only on ICD-10 codes contained within the NCIS for deaths prior to 2007 and consider using these in combination with the other NCIS data fields and code sets to ensure optimum case identification.

  6. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. The code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.

  7. SEACC: the systems engineering and analysis computer code for small wind systems

    SciTech Connect

    Tu, P.K.C.; Kertesz, V.

    1983-03-01

    The systems engineering and analysis (SEA) computer program (code) evaluates complete horizontal-axis SWECS performance. Rotor power output as a function of wind speed and energy production at various wind regions are predicted by the code. Efficiencies of components such as gearboxes, electric generators, rectifiers, electronic inverters, and batteries can be included in the evaluation process to reflect the complete system performance. Parametric studies can be carried out for blade design characteristics such as airfoil series, taper rate, twist (degrees), and pitch setting; and for geometry such as rotor radius, hub radius, number of blades, coning angle, rotor rpm, etc. Design tradeoffs can also be performed to optimize system configurations for constant-rpm, constant-tip-speed-ratio, and rpm-specific rotors. SWECS energy supply as compared to the load demand for each hour of the day and during each season of the year can be assessed by the code if the diurnal wind and load distributions are known. Also available during each run of the code is blade aerodynamic loading information.

  8. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  9. Proposing a Web-Based Tutorial System to Teach Malay Language Braille Code to the Sighted

    ERIC Educational Resources Information Center

    Wah, Lee Lay; Keong, Foo Kok

    2010-01-01

    The "e-KodBrailleBM Tutorial System" is a web-based tutorial system which is specially designed to teach, facilitate and support the learning of Malay Language Braille Code to individuals who are sighted. The targeted group includes special education teachers, pre-service teachers, and parents. Learning Braille code involves memorisation…

  10. De novo, systemic, deleterious amino acid substitutions are common in large cytoskeleton-related protein coding regions

    PubMed Central

    Stoll, Rebecca J.; Thompson, Grace R.; Samy, Mohammad D.; Blanck, George

    2017-01-01

    Human mutagenesis is largely random, thus large coding regions, simply on the basis of probability, represent relatively large mutagenesis targets. Thus, we considered the possibility that large cytoskeletal-protein related coding regions (CPCRs), including extra-cellular matrix (ECM) coding regions, would have systemic nucleotide variants that are not present in common SNP databases. Presumably, such variants arose recently in development or in recent, preceding generations. Using matched breast cancer and blood-derived normal datasets from the cancer genome atlas, CPCR single nucleotide variants (SNVs) not present in the All SNPs(142) or 1000 Genomes databases were identified. Using the Protein Variation Effect Analyzer internet-based tool, it was discovered that apparent, systemic mutations (not shared among others in the analysis group) in the CPCRs, represented numerous deleterious amino acid substitutions. However, no such deleterious variants were identified among the (cancer blood-matched) variants shared by other members of the analysis group. These data indicate that private SNVs, which potentially have a medical consequence, occur de novo with significant frequency in the larger, human coding regions that collectively impact the cytoskeleton and ECM. PMID:28357075
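The filtering step described above is, at its core, a set difference between observed variants and public SNP catalogues. A minimal sketch with hypothetical variant tuples (chromosome, position, reference allele, alternate allele); the coordinates and alleles below are invented for illustration, and real pipelines would read these records from VCF files:

```python
# Hypothetical SNVs called in both the tumor and matched-normal samples
# (i.e., candidate systemic variants). All values are invented.
observed_snvs = {
    ("chr2", 179_400_001, "G", "A"),
    ("chr12", 6_645_000, "C", "T"),
    ("chr7", 55_200_000, "T", "C"),
}

# Variants already catalogued in public databases such as All SNPs(142)
# or 1000 Genomes (again, invented entries).
known_snps = {
    ("chr12", 6_645_000, "C", "T"),
}

# "Private" SNVs absent from the databases: candidates for recent,
# de novo substitutions of the kind the study characterizes.
private_snvs = observed_snvs - known_snps
```

Each private SNV would then be passed to an effect predictor (the study used the Protein Variation Effect Analyzer) to classify the amino acid substitution as deleterious or tolerated.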

  11. Internal Corrosion Control of Water Supply Systems Code of Practice

    EPA Science Inventory

    This Code of Practice is part of a series of publications by the IWA Specialist Group on Metals and Related Substances in Drinking Water. It complements the following IWA Specialist Group publications: 1. Best Practice Guide on the Control of Lead in Drinking Water 2. Best Prac...

  12. Accuracy and time requirements of a bar-code inventory system for medical supplies.

    PubMed

    Hanson, L B; Weinswig, M H; De Muth, J E

    1988-02-01

    The effects of implementing a bar-code system for issuing medical supplies to nursing units at a university teaching hospital were evaluated. Data on the time required to issue medical supplies to three nursing units at a 480-bed, tertiary-care teaching hospital were collected (1) before the bar-code system was implemented (i.e., when the manual system was in use), (2) one month after implementation, and (3) four months after implementation. At the same times, the accuracy of the central supply perpetual inventory was monitored using 15 selected items. One-way analysis of variance tests were done to determine any significant differences between the bar-code and manual systems. Using the bar-code system took longer than using the manual system because of a significant difference in the time required for order entry into the computer. Multiple-use requirements of the central supply computer system made entering bar-code data a much slower process. There was, however, a significant improvement in the accuracy of the perpetual inventory. Using the bar-code system for issuing medical supplies to the nursing units takes longer than using the manual system. However, the accuracy of the perpetual inventory was significantly improved with the implementation of the bar-code system.
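The one-way ANOVA used in the study reduces to a ratio of between-group to within-group mean squares. A self-contained sketch of the F statistic follows; the issue-time figures are invented for illustration, not the study's data:

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic: between-group mean square divided
    by within-group mean square."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (n - k))

# Invented order-entry times (minutes) at the three measurement points.
manual = [12.1, 11.8, 12.6, 12.0]
month1 = [14.2, 13.9, 14.5, 14.1]
month4 = [13.6, 13.3, 13.9, 13.5]
F = f_oneway(manual, month1, month4)
```

The resulting F would be compared against the critical value of the F distribution with (k - 1, n - k) degrees of freedom to decide whether the timing difference between systems is significant.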

  13. Experience of development of the national surgical interventions coding system in Russia.

    PubMed

    Shtevnina, Julia I; Rauzina, Svetlana E; Shvyrev, Sergey L; Zarubina, Tatyana V

    2014-01-01

    The paper discusses issues in the development of surgical procedure coding systems for use at the national and international levels within health information systems. The work was carried out drawing on Russian and foreign experience, including the international standard ISO/FDIS 1828:2012. The developed system's structure contains basic categories of medical entities (axes): surgical deed, surgical subdeed, objects, site, and interventional equipment. Abdominal surgeries (528 procedures) were entered into the coding system database and structured according to the defined categories.

  14. DANTSYS: A diffusion accelerated neutral particle transport code system

    SciTech Connect

    Alcouffe, R.E.; Baker, R.S.; Brinkley, F.W.; Marr, D.R.; O`Dell, R.D.; Walters, W.F.

    1995-06-01

    The DANTSYS code package includes the following transport codes: ONEDANT, TWODANT, TWODANT/GQ, TWOHEX, and THREEDANT. The DANTSYS code package is a modular computer program package designed to solve the time-independent, multigroup discrete ordinates form of the Boltzmann transport equation in several different geometries. The modular construction of the package separates the input processing, the transport equation solving, and the post processing (or edit) functions into distinct code modules: the Input Module, one or more Solver Modules, and the Edit Module, respectively. The Input and Edit Modules are very general in nature and are common to all the Solver Modules. The ONEDANT Solver Module contains a one-dimensional (slab, cylinder, and sphere), time-independent transport equation solver using the standard diamond-differencing method for space/angle discretization. Also included in the package are Solver Modules named TWODANT, TWODANT/GQ, THREEDANT, and TWOHEX. The TWODANT Solver Module solves the time-independent two-dimensional transport equation using the diamond-differencing method for space/angle discretization. The authors have also introduced an adaptive weighted diamond differencing (AWDD) method for the spatial and angular discretization into TWODANT as an option. The TWOHEX Solver Module solves the time-independent two-dimensional transport equation on an equilateral triangle spatial mesh. The THREEDANT Solver Module solves the time-independent, three-dimensional transport equation for XYZ and RZ{Theta} symmetries using both diamond differencing with set-to-zero fixup and the AWDD method. The TWODANT/GQ Solver Module solves the 2-D transport equation in XY and RZ symmetries using a spatial mesh of arbitrary quadrilaterals. The spatial differencing method is based upon the diamond differencing method with set-to-zero fixup, with changes to accommodate the generalized spatial meshing.
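The diamond-difference scheme with set-to-zero fixup mentioned above can be illustrated in one dimension. This is a sketch of a one-group, single-direction slab sweep, not DANTSYS itself; the mesh, cross section, and source values are invented:

```python
def dd_sweep(psi_in, mu, sigma_t, q, h, cells):
    """1-D diamond-difference transport sweep for one angle mu > 0.
    Per-cell balance: mu*(psi_out - psi_in)/h + sigma_t*psi_bar = q,
    closed with the diamond relation psi_bar = (psi_in + psi_out)/2.
    A negative outflow triggers the set-to-zero fixup."""
    flux = []
    for _ in range(cells):
        a = mu / h
        psi_out = ((a - sigma_t / 2) * psi_in + q) / (a + sigma_t / 2)
        if psi_out < 0:
            # Set-to-zero fixup: clamp the outflow and re-derive the
            # cell-average flux from the balance equation alone.
            psi_out = 0.0
            psi_bar = (q + a * psi_in) / sigma_t
        else:
            psi_bar = 0.5 * (psi_in + psi_out)
        flux.append(psi_bar)
        psi_in = psi_out  # outflow of this cell feeds the next
    return flux, psi_in

# Pure attenuation: unit incoming flux, no source, three unit cells.
avg, exiting = dd_sweep(psi_in=1.0, mu=1.0, sigma_t=1.0, q=0.0, h=1.0, cells=3)
```

With no source, each unit cell attenuates the angular flux by the same factor; on optically thick cells the unfixed scheme can go negative, which is exactly the case the fixup branch handles.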

  15. Overview of the arthritis Cost Consequence Evaluation System (ACCES): a pharmacoeconomic model for celecoxib.

    PubMed

    Pettitt, D; Goldstein, J L; McGuire, A; Schwartz, J S; Burke, T; Maniadakis, N

    2000-12-01

    Pharmacoeconomic analyses have become useful and essential tools for health care decision makers who increasingly require such analyses prior to placing a drug on a national, regional or hospital formulary. Previous health economic models of non-steroidal anti-inflammatory drugs (NSAIDs) have been restricted to evaluating a narrow range of agents within specific health care delivery systems using medical information derived from homogeneous clinical trial data. This paper summarizes the Arthritis Cost Consequence Evaluation System (ACCES)--a pharmacoeconomic model that has been developed to predict and evaluate the costs and consequences associated with the use of celecoxib in patients with arthritis, compared with other NSAIDs and NSAIDs plus gastroprotective agents. The advantage of this model is that it can be customized to reflect local practice patterns, resource utilization and costs, as well as provide context-specific health economic information to a variety of providers and/or decision makers.

  16. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system

    SciTech Connect

    Shapiro, A.; Huria, H.C.; Cho, K.W. )

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  17. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    SciTech Connect

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  18. Tracking "Large" or "Small": Boundaries and their Consequences for Veterinary Students within the Tracking System

    NASA Astrophysics Data System (ADS)

    Vermilya, Jenny R.

    In this dissertation, I use 42 in-depth qualitative interviews with veterinary medical students to explore the experience of being in an educational program that tracks students based on the species of non-human animals that they wish to treat. Specifically, I examine how tracking produces multiple boundaries for veterinary students. The boundaries between different animal species produce consequences for the treatment of those animals; this has been well documented. Using a symbolic interactionist perspective, my research extends the body of knowledge on species boundaries by revealing other consequences of this boundary work. For example, I analyze the symbolic boundaries involved in the gendering of animals, practitioners, and professions. I also examine how boundaries influence the collective identity of students entering an occupation segmented into various specialties. The collective identity of veterinarian is one characterized by care, thus students have to construct different definitions of care to access and maintain the collective identity. The tracking system additionally produces consequences for the knowledge created and reproduced in different areas of animal medicine, creating a system of power and inequality based on whose knowledge is privileged, how, and why. Finally, socially constructed boundaries generated from tracking inevitably lead to cases that do not fit. In particular, horses serve as a "border species" for veterinary students who struggle to place them into the tracking system. I argue that border species, like other metaphorical borders, have the potential to challenge discourses and lead to social change.

  19. Turbo product codes and their application in the fourth-generation mobile communication system

    NASA Astrophysics Data System (ADS)

    He, Yejun; Zhu, Guangxi; Liu, Ying Zhuang; Liu, Jian

    2004-04-01

    In this paper, we first present turbo product codes (TPCs) for forward error correction (FEC) coding, including the TPC encoding process and decoding principle, and then compare TPCs with turbo convolutional codes (TCCs) as an error coding solution. The performance of TPCs is shown to be closer to the Shannon limit than that of TCCs. Secondly, we introduce the application of TPCs in the 4th generation (4G) mobile communication system, which is being developed in our country at present. The concept of a TPC-OFDM system, which promises a higher code rate than conventional OFDM, is then presented. Finally, simulation results show that the simplified 4G uplink systems offer a bit error rate of nearly 0 over the IMT-2000 channel at Eb/N0 > 15 dB.
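A product code's two-dimensional structure can be sketched compactly. Practical TPCs typically use extended Hamming or BCH component codes; the sketch below substitutes single-parity-check components purely to show the row/column encoding pattern:

```python
def spc_product_encode(bits, rows, cols):
    """Encode rows*cols data bits as a 2-D product code whose row and
    column component codes are single-parity checks (an illustrative
    stand-in for the extended Hamming components of practical TPCs)."""
    assert len(bits) == rows * cols
    grid = [list(bits[r * cols:(r + 1) * cols]) for r in range(rows)]
    for row in grid:                      # append one parity bit per row
        row.append(sum(row) % 2)
    # Append one parity bit per column (including the row-parity column,
    # so the corner bit checks the checks).
    grid.append([sum(col) % 2 for col in zip(*grid)])
    return grid

code = spc_product_encode([1, 0, 1, 1], rows=2, cols=2)
```

Every row and every column of the resulting array has even parity, which is what lets an iterative TPC decoder alternate between row-wise and column-wise component decoding.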

  20. Radiation Transport Calculations of a Simple Structure Using the Vehicle Code System with 69-Group Cross Sections and the Monte-Carlo Neutron and Photon Code

    DTIC Science & Technology

    1989-08-01

    Vehicle Code System (VCS) User's Manual, Oak Ridge National Laboratory, ORNL-TM-4648 (1974). (UNCLASSIFIED) 3. F.R. Mynatt, F.J. Muckenthaler and P.N…

  1. TVENT1: a computer code for analyzing tornado-induced flow in ventilation systems

    SciTech Connect

    Andrae, R.W.; Tang, P.K.; Gregory, W.S.

    1983-07-01

    TVENT1 is a new version of the TVENT computer code, which was designed to predict the flows and pressures in a ventilation system subjected to a tornado. TVENT1 is essentially the same code but has added features for turning blowers off and on, changing blower speeds, and changing the resistance of dampers and filters. These features make it possible to depict a sequence of events during a single run. Other features also have been added to make the code more versatile. Example problems are included to demonstrate the code's applications.

  2. Comparison of procedure coding systems for level 1 and 2 hospitals in South Africa.

    PubMed

    Montewa, Lebogang; Hanmer, Lyn; Reagon, Gavin

    2013-01-01

    The ability of three procedure coding systems to reflect the procedure concepts extracted from patient records from six hospitals was compared, in order to inform decision making about a procedure coding standard for South Africa. A convenience sample of 126 procedure concepts was extracted from patient records at three level 1 hospitals and three level 2 hospitals. Each procedure concept was coded using ICPC-2, ICD-9-CM, and CCSA-2001. The extent to which each code assigned actually reflected the procedure concept was evaluated (between 'no match' and 'complete match'). For the study sample, CCSA-2001 was found to reflect the procedure concepts most completely, followed by ICD-9-CM and then ICPC-2. In practice, decision making about procedure coding standards would depend on multiple factors in addition to coding accuracy.

  3. Demonstration of Vibrational Braille Code Display Using Large Displacement Micro-Electro-Mechanical Systems Actuators

    NASA Astrophysics Data System (ADS)

    Watanabe, Junpei; Ishikawa, Hiroaki; Arouette, Xavier; Matsumoto, Yasuaki; Miki, Norihisa

    2012-06-01

    In this paper, we present a vibrational Braille code display based on large-displacement micro-electro-mechanical systems (MEMS) actuator arrays. Tactile receptors are more sensitive to vibrational stimuli than to static ones; therefore, when each cell of the Braille code vibrates at an optimal frequency, subjects can recognize the codes more efficiently. We fabricated a vibrational Braille code display whose cells are actuators consisting of piezoelectric actuators combined with a hydraulic displacement amplification mechanism (HDAM). The HDAM, which encapsulates an incompressible liquid in microchambers bounded by two flexible polymer membranes, amplifies the displacement of the MEMS actuator. We investigated the voltage required for subjects to recognize Braille codes when each cell, i.e., the large-displacement MEMS actuator, vibrated at various frequencies. Lower voltages were required at vibration frequencies above 50 Hz than at frequencies below 50 Hz, verifying that the proposed vibrational Braille code display is efficient because it successfully exploits the characteristics of human tactile receptors.

  4. A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems

    NASA Astrophysics Data System (ADS)

    Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge

    Mobile agents are software entities consisting of code, data, state, and itinerary that can migrate autonomously from host to host, executing their code. Despite these benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against attacks by malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have barely been studied in the literature. This paper presents a mechanism based on code passing that avoids collusion attacks. Our proposal relies on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing each host with the information needed to extract its own variant, and for taking trusted timestamps that are used to verify time coherence.

  5. The Marriage of Residential Energy Codes and Rating Systems: Conflict Resolution or Just Conflict?

    SciTech Connect

    Taylor, Zachary T.; Mendon, Vrushali V.

    2014-08-21

    After three decades of coexistence at a distance, model residential energy codes and residential energy rating systems have come together in the 2015 International Energy Conservation Code (IECC). At the International Code Council’s October 2013 Public Comment Hearing, a new compliance path based on an Energy Rating Index was added to the IECC. Although not specifically named in the code, RESNET’s HERS rating system is the likely candidate Index for most jurisdictions. While HERS has been a mainstay of various beyond-code programs for many years, its direct incorporation into the most popular model energy code raises questions about the equivalence of a HERS-based compliance path and the traditional IECC performance compliance path, especially because the two approaches use different efficiency metrics, are governed by different simulation rules, and have different scopes with regard to energy-impacting house features. A detailed simulation analysis of more than 15,000 house configurations reveals a very large range of HERS Index values that achieve equivalence with the IECC’s performance path. This paper summarizes the results of that analysis and evaluates them against the specific Energy Rating Index values required by the 2015 IECC. Based on the home characteristics most likely to produce disparities between HERS-based compliance and performance-path compliance, the paper discusses potential impacts on the compliance process, on state and local adoption of the new code, on energy efficiency in the next generation of homes subject to the new code, and on the future evolution of model code formats.

  6. Health Physics Code System for Evaluating Accidents Involving Radioactive Materials.

    SciTech Connect

    2014-10-01

    Version 03 The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculational tool for evaluating accidents involving radioactive materials. The HOTSPOT codes provide a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. The developer's website is: http://www.llnl.gov/nhi/hotspot/. Four general programs, PLUME, EXPLOSION, FIRE, and RESUSPENSION, calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, an explosive release, a fuel fire, or an area contamination event. Additional programs deal specifically with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. The FIDLER program can calibrate radiation survey instruments for ground survey measurements and for initial screening of personnel for possible plutonium uptake in the lung. The HOTSPOT codes are fast, portable, easy to use, and fully documented in electronic help files. HOTSPOT supports color high-resolution monitors and printers for concentration plots and contours. The codes have been extensively used by the DOE community since 1985. Tables and graphical output can be directed to the computer screen, a printer, or a disk file. The graphical output consists of dose and ground contamination as a function of plume centerline downwind distance, and of radiation dose and ground contamination contours. Users have the option of displaying scenario text on the plots. HOTSPOT 3.0.1 fixes three significant Windows 7 issues: (1) the executable now installs properly under "Program Files/HotSpot 3.0"; (2) the installation package is smaller, having removed a dependency on older Windows DLL files; and (3) forms now scale properly based on DPI rather than font for users who set their screen resolution to something other than 100%, a more common practice in Windows 7.
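
    The downwind assessments in codes of this kind rest on the standard Gaussian plume equation. The sketch below is illustrative only: the dispersion-coefficient fits and all numeric values are assumptions for demonstration, not HOTSPOT's actual coefficients.

    ```python
    import math

    def sigma_y(x_m):
        """Horizontal dispersion coefficient (m); rough power-law fit (assumed)."""
        return 0.08 * x_m / math.sqrt(1 + 0.0001 * x_m)

    def sigma_z(x_m):
        """Vertical dispersion coefficient (m); rough power-law fit (assumed)."""
        return 0.06 * x_m / math.sqrt(1 + 0.0015 * x_m)

    def centerline_concentration(Q_bq, u_ms, x_m, release_height_m=0.0):
        """Ground-level, plume-centerline air concentration (Bq/m^3) from the
        standard Gaussian plume equation with full ground reflection."""
        sy, sz = sigma_y(x_m), sigma_z(x_m)
        return (Q_bq / (math.pi * u_ms * sy * sz)) * math.exp(
            -release_height_m**2 / (2 * sz**2))

    # Example: 1 GBq/s ground release, 3 m/s wind, 1 km downwind
    c_1km = centerline_concentration(1e9, 3.0, 1000.0)
    ```

    As expected of a first-order model, the concentration scales linearly with the source term and falls off with downwind distance as the plume spreads.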

  7. Vision Aided Inertial Navigation System Augmented with a Coded Aperture

    DTIC Science & Technology

    2011-03-24

    (Symbol glossary, garbled in extraction: diameter of focal blur for a clear aperture; Laplacian of Gaussian of the image over x and y; Fourier transform of the image in polar coordinates; focal length of the lens; image in polar coordinates.) ...captures a Fourier transform of each image at various angles rather than low-resolution images [38]. Multiple coded images have also been used, with

  8. Health consequences and health systems response to the Pacific U.S. Nuclear Weapons Testing Program.

    PubMed

    Palafox, Neal A; Riklon, Sheldon; Alik, Wilfred; Hixon, Allen L

    2007-03-01

    Between 1946 and 1958, the United States detonated 67 thermonuclear devices in the Pacific as part of its U.S. Nuclear Weapons Testing Program (USNWTP). The aggregate explosive power was equal to 7,200 Hiroshima atomic bombs. Recent documents released by the U.S. government suggest that the deleterious effects of the nuclear testing were greater and extended farther than previously known. The Republic of the Marshall Islands (RMI) government and affected communities have sought redress through diplomatic routes with the U.S. government; however, existing medical programs and financial reparations have not adequately addressed many of the health consequences of the USNWTP. Because radiation-induced cancers may have a long latency, a healthcare infrastructure is needed to address both cancer and related health issues. This article reviews the health consequences of the Pacific USNWTP and the current health system's ability to respond.

  9. The sympathetic nervous system and the physiologic consequences of spaceflight: a hypothesis

    NASA Technical Reports Server (NTRS)

    Robertson, D.; Convertino, V. A.; Vernikos, J.

    1994-01-01

    Many of the physiologic consequences of weightlessness and the cardiovascular abnormalities on return from space could be due, at least in part, to alterations in the regulation of the autonomic nervous system. In this article, the authors review the rationale and evidence for an autonomic mediation of diverse changes that occur with spaceflight, including the anemia and hypovolemia of weightlessness and the tachycardia and orthostatic intolerance on return from space. This hypothesis is supported by studies of two groups of persons known to have low catecholamine levels: persons subjected to prolonged bedrest and persons with syndromes characterized by low circulating catecholamines (Bradbury-Eggleston syndrome and dopamine beta-hydroxylase deficiency). Both groups exhibit the symptoms mentioned. The increasing evidence that autonomic mechanisms underlie many of the physiologic consequences of weightlessness suggests that new pharmacologic approaches (such as administration of beta-blockers and/or sympathomimetic amines) based on these findings may attenuate these unwanted effects.

  10. Lattice physics capabilities of the SCALE code system using TRITON

    SciTech Connect

    DeHart, M. D.

    2006-07-01

    This paper describes ongoing calculations used to validate the TRITON depletion module in SCALE for light water reactor (LWR) fuel lattices. TRITON has been developed to provide improved resolution in lattice physics calculations for mixed-oxide fuel assemblies as programs to burn such fuel in the United States begin to come online. Results are provided for coupled TRITON/PARCS analyses of an LWR core, in which TRITON was employed to generate appropriately weighted few-group nodal cross-section sets for use in core-level calculations with PARCS. Additional results are provided for code-to-code comparisons between TRITON and a suite of other depletion packages in the modeling of a conceptual next-generation boiling water reactor fuel assembly design. Results indicate that the set of SCALE functional modules used within TRITON provides an accurate means for lattice physics calculations. Because the transport solution within TRITON provides a generalized-geometry capability, it is extensible to a wide variety of non-traditional and advanced fuel assembly designs. (authors)

  11. A user's manual for MASH 1. 0: A Monte Carlo Adjoint Shielding Code System

    SciTech Connect

    Johnson, J.O.

    1992-03-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. MASH is the successor to the Vehicle Code System (VCS) initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, the distance from the source, and the energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem (input data and selected output edits) for each code.
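
    The "fold" the coupling code performs is, in essence, an inner product of the coupling-surface fluence with the adjoint dose importance over surface patches and energy groups. A minimal sketch, with an invented discretization, invented units, and random stand-in data:

    ```python
    import numpy as np

    # Hypothetical discretization: fluence on the coupling surface from the
    # forward discrete ordinates calculation, indexed by (surface patch,
    # energy group), and the adjoint "dose importance" on the same grid.
    rng = np.random.default_rng(0)
    n_patches, n_groups = 24, 8
    fluence = rng.uniform(0.0, 1e4, size=(n_patches, n_groups))      # particles/cm^2 per source particle
    importance = rng.uniform(0.0, 1e-9, size=(n_patches, n_groups))  # dose per unit fluence

    # The fold: an inner product over patches and energy groups,
    # weighted by patch area (assumed uniform here).
    patch_area = 0.5  # m^2, hypothetical
    dose_response = patch_area * np.sum(fluence * importance)
    ```

    Repeating the fold for fluence fields computed at several orientations and distances yields the dose response as a function of those variables, as described above.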

  12. Nurses' attitudes toward the use of the bar-coding medication administration system.

    PubMed

    Marini, Sana Daya; Hasman, Arie; Huijer, Huda Abu-Saad; Dimassi, Hani

    2010-01-01

    This study determines nurses' attitudes toward the use of a bar-coding medication administration system. Factors underlying the successful use of bar-coding medication administration systems, viewed as a connotative indicator of users' attitudes, were used to gather data describing the attitudinal basis for system adoption and use decisions in terms of subjective satisfaction. Sixty-seven nurses in the United States responded to the e-questionnaire posted on the CARING list server during June and July 2007. Participants rated their satisfaction with bar-coding medication administration system use based on system functionality, usability, and the system's positive or negative impact on nursing practice. Results showed a somewhat positive attitude overall, but the image profile draws attention to nurses' concerns about improving certain system characteristics. Nurses with high bar-coding medication administration skills perceived the system more negatively. The reasons underlying skillful users' dissatisfaction are an important source of knowledge for both system development and system deployment. Strengthening system usability by magnifying its ability to eliminate medication errors and their contributing factors, maximizing system functionality by establishing its value as an extra eye in the medication administration process, and ensuring that the system benefits clinical nursing practice, by being helpful to nurses, speeding up the medication administration process, and being user-friendly, can together create a congenial setting for establishing the positive attitudes that lead to successful system use.

  13. Surety of human elements of high consequence systems: An organic model

    SciTech Connect

    FORSYTHE,JAMES C.; WENNER,CAREN A.

    2000-04-25

    Despite extensive safety analysis and application of safety measures, there is a frequent lament, "Why do we continue to have accidents?" Two breakdowns are prevalent in risk management and prevention: first, accidents result from human actions that engineers, analysts, and management never envisioned; second, controls intended to preclude or mitigate accident sequences prove inadequate. This paper addresses the first breakdown, the inability to anticipate scenarios involving human action or inaction. The failure of controls has been addressed in a previous publication (Forsythe and Grose, 1998). Specifically, this paper presents an approach referred to as surety, whose objective is to provide high levels of assurance in situations where potential system failure paths cannot be fully characterized. With regard to the human elements of complex systems, traditional approaches to human reliability are not sufficient to attain surety. Consequently, an Organic Model has been developed to account for the organic properties that engineered systems exhibit as a result of human involvement in those systems.

  14. Economic consequences of aviation system disruptions: A reduced-form computable general equilibrium analysis

    SciTech Connect

    Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin; Chatterjee, Samrat

    2017-01-01

    The state-of-the-art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into the computerized systems used by policy analysts to estimate the economic impacts of various types of transportation system failures due to natural hazards, human-related attacks, or technological accidents. This paper presents a reduced-form approach that simplifies the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as the magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the resulting “synthetic data” in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and used along with Monte Carlo simulation methods to propagate uncertainties in economic consequences. Although our demonstration and discussion focus on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
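
    The reduced-form procedure, Latin Hypercube sampling of key parameters, running the model at each sample, then fitting a linear regression to the synthetic data, can be sketched as follows. The `cge_model` function and its parameter names are a toy stand-in for the full CGE model, invented purely for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def latin_hypercube(n_samples, n_params):
        """Simple Latin Hypercube sample on [0, 1]^d:
        one point per stratum in each dimension, shuffled per column."""
        cut = (np.arange(n_samples) + rng.uniform(size=(n_params, n_samples))) / n_samples
        for row in cut:
            rng.shuffle(row)
        return cut.T  # shape (n_samples, n_params)

    # Toy stand-in for the CGE model: loss as a function of shock magnitude,
    # duration, and a resilience factor (all names illustrative).
    def cge_model(shock, duration, resilience):
        return 100.0 * shock * duration * (1.0 - 0.6 * resilience)

    X = latin_hypercube(100, 3)            # 100 runs, as in the paper's design
    y = cge_model(X[:, 0], X[:, 1], X[:, 2])

    # Reduced form: ordinary least squares on the synthetic data.
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    predict = lambda x: coef[0] + x @ coef[1:]
    ```

    The fitted linear equation can then be evaluated cheaply inside a Monte Carlo loop to propagate parameter uncertainty, which is the point of replacing the full model with a reduced form.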

  15. Comprehensive transportation risk assessment system based on unit-consequence factors

    SciTech Connect

    Biwer, B.M.; Monette, F.A.; LePoire, D.J.; Chen, S.Y.

    1994-02-01

    The U.S. Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement requires a comprehensive transportation risk analysis of radioactive waste shipments for large shipping campaigns. Thousands of unique shipments involving truck and rail transport must be analyzed; such a comprehensive risk analysis is impossible with currently available methods. Argonne National Laboratory developed a modular transportation model that can handle the demands imposed by such an analysis. The modular design of the model facilitates the simple addition and updating of transportation routes and waste inventories, as required, and reduces the overhead associated with file maintenance and quality assurance. The model incorporates unit-consequence factors generated with the RADTRAN 4 transportation risk analysis code, combined with an easy-to-use, menu-driven interface on IBM-compatible computers running under DOS. User selection of multiple origin/destination site pairs for the shipment of multiple radioactive waste inventories is permitted from pop-up lists. Over 800 predefined routes are available among more than 30 DOE sites, and the waste inventories include high-level waste, spent nuclear fuel, transuranic waste, low-level waste, low-level mixed waste, and greater-than-Class C waste.
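
    The unit-consequence-factor approach amounts to precomputing a per-shipment, per-kilometer risk factor for each waste type and mode with a detailed code (RADTRAN 4 in this work), then scaling by route length and shipment count for each campaign. A sketch with invented factor values and waste-type labels:

    ```python
    # Unit-consequence factors: risk per shipment-kilometer, precomputed once
    # by a detailed transportation risk code. All numbers here are invented
    # for illustration, not actual RADTRAN 4 output.
    unit_factors = {
        ("LLW", "truck"): 2.0e-9,   # low-level waste by truck
        ("LLW", "rail"):  1.1e-9,
        ("SNF", "truck"): 8.5e-9,   # spent nuclear fuel by truck
    }

    def campaign_risk(waste, mode, route_km, n_shipments):
        """Total campaign risk: unit factor scaled by route length and shipment count."""
        return unit_factors[(waste, mode)] * route_km * n_shipments

    # Combine campaigns over a shared 1200-km route
    total = (campaign_risk("LLW", "truck", 1200.0, 500)
             + campaign_risk("SNF", "truck", 1200.0, 20))
    ```

    Because the expensive calculation happens once per factor, thousands of origin/destination and inventory combinations reduce to cheap multiplications, which is what makes the comprehensive analysis tractable.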

  16. A systematic literature review of automated clinical coding and classification systems.

    PubMed

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  17. Code division multiple-access techniques in optical fiber networks. II - Systems performance analysis

    NASA Astrophysics Data System (ADS)

    Salehi, Jawad A.; Brackett, Charles A.

    1989-08-01

    A technique based on optical orthogonal codes was presented by Salehi (1989) to establish a fiber-optic code-division multiple-access (FO-CDMA) communications system. Those results are used here to derive the bit error rate of the proposed FO-CDMA system as a function of data rate, code length, code weight, number of users, and receiver threshold. The performance characteristics for a variety of system parameters are discussed. A means of reducing the effective multiple-access interference signal by placing an optical hard-limiter at the front end of the desired optical correlator is presented. Performance calculations are shown for the FO-CDMA with an ideal optical hard-limiter, and it is shown that using an optical hard-limiter would, in general, improve system performance.
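
    The idea can be illustrated with a toy chip-synchronous example: each user is assigned a sparse 0/1 optical orthogonal code with low auto- and cross-correlation, the receiver correlates the received chip stream with its own code and thresholds the result, and an ideal hard-limiter clips the summed optical intensity to {0, 1} before correlation. The length-32, weight-4 sequences below are hand-picked for illustration, not an optimal OOC construction:

    ```python
    import numpy as np

    # Two length-32, weight-4 binary sequences with low cross-correlation
    c1 = np.zeros(32, dtype=int); c1[[0, 5, 13, 22]] = 1
    c2 = np.zeros(32, dtype=int); c2[[2, 9, 18, 29]] = 1

    def periodic_crosscorr(a, b):
        """Maximum periodic correlation of a with all cyclic shifts of b."""
        return max(int(np.dot(a, np.roll(b, s))) for s in range(len(a)))

    def detect(received, code, threshold):
        """Correlate the received chip stream with the user's code and threshold."""
        return int(np.dot(received, code)) >= threshold

    def hard_limit(x):
        """Ideal optical hard-limiter: clip summed intensity to {0, 1} per chip,
        suppressing strong multiple-access interference before correlation."""
        return np.minimum(x, 1)

    received = c1 + c2   # user 1 sends "1" while user 2 interferes
    assert detect(hard_limit(received), c1, threshold=4)
    ```

    With the code weight as the threshold, the desired user's own chips always reach it, while the limiter caps how much any single interferer's overlap can contribute.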

  18. Reporting Codes and Fuel Pathways for the EPA Moderated Transaction System (EMTS)

    EPA Pesticide Factsheets

    Users should reference this document for a complete list of all reporting codes and all possible fuel pathways for Renewable Fuel Standard (RFS) and Fuels Averaging, Banking and Trading (ABT) users of the EPA Moderated Transaction System (EMTS).

  19. Use of generalized curvilinear coordinate systems in electromagnetic and hybrid codes

    SciTech Connect

    Swift, D.W.

    1995-07-01

    The author develops a code to simulate the dynamics of the magnetosphere system. The calculation involves a single-level, structured, curvilinear 2D mesh. The mesh density is varied to support regions that demand higher resolution.

  20. Feasibility and Reliability of a Coding System to Capture In-Session Group Behavior in Adolescents.

    PubMed

    Ladd, Benjamin O; Tomlinson, Kristin; Myers, Mark G; Anderson, Kristen G

    2016-01-01

    Limited research has explored the role of in-session behavior during motivational enhancement (ME) in group formats. The current study presents initial feasibility of assessing behavior of high school students (N = 425) attending Project Options, a voluntary secondary drug and alcohol prevention program utilizing ME techniques. Building on previous research exploring client language supporting/opposing health behavior, student group behavior was coded live at the specific utterance and global level; group leader behavior was also coded globally. Interrater reliability of the coding system was assessed, and preliminary validity of the coding system was examined by exploring associations between characteristics of group members and in-session group behavior. Initial reliability estimates were excellent for the specific behavior codes. Reliability of the global codes was mixed, with raters demonstrating good reliability on support for unhealthy behavior, opposition to unhealthy behavior, and support for healthy behavior. Reliability of the group leader codes was fair to poor. Greater percent healthy talk was associated with a lower percentage of group members reporting lifetime alcohol use. The results of the current study suggest that some in-session behavior at the group level can be coded reliably via live observation and that in-session behavior at the group level is associated with alcohol use prior to attending the program. Future research is needed to explore the utility of in-session behavior in terms of predicting future behavior at the group and individual level.
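
    Interrater reliability for categorical utterance codes of this kind is commonly summarized with Cohen's kappa; the study's exact reliability statistic may differ. A minimal sketch with hypothetical ratings from two raters:

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters assigning categorical codes:
        chance-corrected agreement (observed - expected) / (1 - expected)."""
        assert len(r1) == len(r2)
        n = len(r1)
        observed = sum(a == b for a, b in zip(r1, r2)) / n
        f1, f2 = Counter(r1), Counter(r2)
        expected = sum(f1[c] * f2[c] for c in set(r1) | set(r2)) / n**2
        return (observed - expected) / (1 - expected)

    # Hypothetical utterance codes from two raters
    rater1 = ["healthy", "unhealthy", "healthy", "neutral", "healthy", "unhealthy"]
    rater2 = ["healthy", "unhealthy", "neutral", "neutral", "healthy", "healthy"]
    k = cohens_kappa(rater1, rater2)
    ```

    Values near 1 indicate agreement well beyond chance; perfect agreement yields exactly 1.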

  1. Feasibility and Reliability of a Coding System to Capture In-Session Group Behavior in Adolescents

    PubMed Central

    Tomlinson, Kristin; Myers, Mark G.; Anderson, Kristen G.

    2016-01-01

    Limited research has explored the role of in-session behavior during motivational enhancement (ME) in group formats. The current study presents initial feasibility of assessing behavior of high school students (N=425) attending Project Options, a voluntary secondary drug and alcohol prevention program utilizing ME techniques. Building on previous research exploring client language supporting/opposing health behavior, student group behavior was coded live at the specific utterance and global level; group leader behavior was also coded globally. Interrater reliability of the coding system was assessed, and preliminary validity of the coding system was examined by exploring associations between characteristics of group members and in-session group behavior. Initial reliability estimates were excellent for the specific behavior codes. Reliability of the global codes was mixed, with raters demonstrating good reliability on support for unhealthy behavior, opposition to unhealthy behavior, and support for healthy behavior. Reliability of the group leader codes was fair to poor. Greater percent healthy talk was associated with a lower percentage of group members reporting lifetime alcohol use. The results of the current study suggest that some in-session behavior at the group level can be coded reliably via live observation and that in-session behavior at the group level is associated with alcohol use prior to attending the program. Future research is needed to explore the utility of in-session behavior in terms of predicting future behavior at the group and individual level. PMID:26271299

  2. Power optimization of wireless media systems with space-time block codes.

    PubMed

    Yousefi'zadeh, Homayoun; Jafarkhani, Hamid; Moshfeghi, Mehran

    2004-07-01

    We present analytical and numerical solutions to the problem of power control in wireless media systems with multiple antennas. We formulate a set of optimization problems aimed at minimizing the total power consumption of wireless media systems subject to a given level of QoS and an available bit rate. Our formulation takes into consideration the power consumption related to source coding, channel coding, and transmission over multiple transmit antennas. In our study, we consider Gauss-Markov and video source models, and Rayleigh fading channels along with the Bernoulli/Gilbert-Elliott loss models, and space-time block codes.
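
    The canonical space-time block code for two transmit antennas is the Alamouti scheme. A noise-free sketch of its encoding and linear combining at a single receive antenna (the paper's system model may differ in detail):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def alamouti_encode(s1, s2):
        """Alamouti code: two symbols sent from two antennas over two time slots.
        Rows are time slots; columns are transmit antennas."""
        return np.array([[s1, s2],
                         [-np.conj(s2), np.conj(s1)]])

    def alamouti_decode(r1, r2, h1, h2):
        """Linear combining at one receive antenna; the channel (h1, h2) is
        assumed static over the two slots. Returns estimates of s1 and s2."""
        s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
        s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
        norm = abs(h1)**2 + abs(h2)**2
        return s1_hat / norm, s2_hat / norm

    # Noise-free check over a random Rayleigh-fading channel
    h1, h2 = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)
    s1, s2 = 1 + 1j, -1 + 1j              # example QPSK symbols
    X = alamouti_encode(s1, s2)
    r1 = h1 * X[0, 0] + h2 * X[0, 1]      # received in slot 1
    r2 = h1 * X[1, 0] + h2 * X[1, 1]      # received in slot 2
    est1, est2 = alamouti_decode(r1, r2, h1, h2)
    ```

    The combining step decouples the two symbols exactly, so in the absence of noise the estimates recover the transmitted symbols; with noise, each symbol enjoys full two-branch diversity.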

  3. Systems guide to MCNP (Monte Carlo Neutron and Photon Transport Code)

    SciTech Connect

    Kirk, B.L.; West, J.T.

    1984-06-01

    The subject of this report is the implementation of the Los Alamos National Laboratory Monte Carlo Neutron and Photon Transport Code - Version 3 (MCNP) on the different types of computer systems, especially the IBM MVS system. The report supplements the documentation of the RSIC computer code package CCC-200/MCNP. Details of the procedure to follow in executing MCNP on the IBM computers, either in batch mode or interactive mode, are provided.

  4. A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

    SciTech Connect

    C. O. Slater; J. M. Barnes; J. O. Johnson; J.D. Drischler

    1998-10-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.

  5. The health system consequences of agency nursing and moonlighting in South Africa.

    PubMed

    Rispel, Laetitia C; Blaauw, Duane

    2015-01-01

    Background: Worldwide, there is an increased reliance on casual staff in the health sector. Recent policy attention in South Africa has focused on the interrelated challenges of agency nursing and moonlighting in the health sector. Objective: This paper examines the potential health system consequences of agency nursing and moonlighting among South African nurses. Methods: During 2010, a cluster random sample of 80 hospitals was selected in four South African provinces. On the survey day, all nurses providing clinical care completed a self-administered questionnaire after giving informed consent. The questionnaire obtained information on socio-demographics, involvement in agency nursing and moonlighting, and self-reported indicators of potential health system consequences of agency nursing and moonlighting. A weighted analysis was done using STATA® 13. Results: In the survey, 40.7% of nurses reported moonlighting or working for an agency in the preceding year. Of all participants, 51.5% reported feeling too tired to work, 11.5% paid less attention to nursing work on duty, and 10.9% took sick leave when not actually sick in the preceding year. Among the moonlighters, 11.9% had taken vacation leave to do agency work or moonlighting, and 9.8% reported conflicting schedules between their primary and secondary jobs. In the bivariate analysis, moonlighting nurses were significantly more likely than non-moonlighters to take sick leave when not sick (p=0.011) and to pay less attention to nursing work on duty (p=0.035). However, in a multiple logistic regression analysis, the differences between moonlighters and non-moonlighters did not remain statistically significant after adjusting for other socio-demographic variables. Conclusion: Although moonlighting did not emerge as a statistically significant predictor, the reported health system consequences are serious. A combination of strong nursing leadership, effective management, and consultation with and buy-in from front

  6. Forest fire management to avoid unintended consequences: a case study of Portugal using system dynamics.

    PubMed

    Collins, Ross D; de Neufville, Richard; Claro, João; Oliveira, Tiago; Pacheco, Abílio P

    2013-11-30

    Forest fires are a serious management challenge in many regions, complicating the appropriate allocation of resources between suppression and prevention efforts. Using a System Dynamics (SD) model, this paper explores how interactions between physical and political systems in forest fire management affect the effectiveness of different allocations. A core issue is that apparently sound management can have unintended consequences. An instinctive management response to periods of worsening fire severity is to increase fire suppression capacity, an approach with immediate appeal because it directly treats the symptom of devastating fires and appeases the public. However, the SD analysis indicates that a policy emphasizing suppression can degrade the long-run effectiveness of forest fire management: by crowding out preventive fuel removal, it exacerbates fuel loads and leads to greater fires, which further balloon suppression budgets. The business management literature refers to this problem as the firefighting trap, wherein a focus on fixing problems diverts attention from preventing them and thus leads to inferior outcomes. The paper illustrates these phenomena through a case study of Portugal, showing that a balanced approach to suppression and prevention efforts can mitigate the self-reinforcing consequences of this trap and better manage long-term fire damages. These insights can help policymakers and fire managers better appreciate the interconnected systems in which their authorities reside and the dynamics that may undermine seemingly rational management decisions.
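
    The firefighting trap can be illustrated with a toy stock-and-flow simulation in which fire damage drives the suppression share of a fixed budget, crowding out preventive fuel removal. Every parameter below is invented for illustration and is far simpler than the paper's SD model:

    ```python
    def simulate(prevention_floor, years=30):
        """Toy stock-and-flow loop: fuel is the stock; damage grows with fuel;
        damage pushes budget toward suppression, shrinking preventive removal."""
        fuel, damages = 50.0, []
        for _ in range(years):
            damage = 0.02 * fuel                          # fires burn more when fuel is high
            suppression_share = min(0.95, 0.5 + damage / 10.0)
            prevention = max(prevention_floor, 1.0 - suppression_share)
            fuel += 5.0 - 40.0 * prevention * 0.1         # fuel growth minus preventive removal
            fuel = max(fuel, 0.0)
            damages.append(damage)
        return sum(damages)

    reactive = simulate(prevention_floor=0.0)   # suppression fully crowds out prevention
    balanced = simulate(prevention_floor=0.4)   # a protected prevention budget
    ```

    In this sketch the reactive policy accumulates more fuel each year, so cumulative damage under `reactive` exceeds `balanced`, the reinforcing loop the paper describes.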

  7. Financial system loss as an example of high consequence, high frequency events

    SciTech Connect

    McGovern, D.E.

    1996-07-01

    Much work has been devoted to high consequence events with low frequency of occurrence. Characteristic of these events are bridge failure (such as that of the Tacoma Narrows), building failure (such as the collapse of a walkway at a Kansas City hotel), or compromise of a major chemical containment system (such as at Bhopal, India). Such events, although rare, have an extreme personal, societal, and financial impact. An interesting variation is demonstrated by financial losses due to fraud and abuse in the money management system. The impact can be huge, entailing very high aggregate costs, but these are a result of the contribution of many small attacks and not the result of a single (or few) massive events. Public awareness is raised through publicized events such as the junk bond fraud perpetrated by Milken or gross mismanagement in the failure of the Barings Bank through unsupervised trading activities by Leeson in Singapore. These events, although seemingly large (financial losses may be on the order of several billion dollars), are but small contributors to the estimated $114 billion loss to all types of financial fraud in 1993. This paper explores the magnitude of financial system losses and identifies new areas for analysis of high consequence events, including the potential effect of malevolent intent.

  8. Computationally Efficient Blind Code Synchronization for Asynchronous DS-CDMA Systems with Adaptive Antenna Arrays

    NASA Astrophysics Data System (ADS)

    Hu, Chia-Chang

    2005-12-01

    A novel space-time adaptive near-far robust code-synchronization array detector for asynchronous DS-CDMA systems is developed in this paper. It has the same basic requirements as the conventional matched filter of an asynchronous DS-CDMA system. For real-time applicability, a computationally efficient architecture of the proposed detector is developed based on the concept of the multistage Wiener filter (MWF) of Goldstein and Reed. This multistage technique results in a self-synchronizing detection criterion that requires no inversion or eigendecomposition of a covariance matrix. As a consequence, the detector achieves a complexity that is only a linear function of the antenna-array size, the rank of the MWF, the system processing gain, and the number of samples in a chip interval. The complexity of the equivalent detector based on the minimum mean-squared error (MMSE) criterion or on subspace-based eigenstructure analysis is a much higher-order function of these parameters, owing to the covariance-matrix inversion or eigendecomposition it requires. Moreover, the multistage scheme provides rapid adaptive convergence under limited observation-data support. Simulations are conducted to evaluate the performance and convergence behavior of the proposed detector with respect to the antenna-array size, the amount of sample support, and the rank of the MWF. The performance advantage of the proposed detector over other DS-CDMA detectors is investigated as well.

  9. Code regenerative clean-up loop transponder for a mu-type ranging system

    NASA Technical Reports Server (NTRS)

    Hurd, W. J. (Inventor)

    1973-01-01

    A loop transponder for regenerating the code of a mu-type ranging system is disclosed. It includes a phase locked loop, a code generator, and a loop detector. The function of the phase locked loop is to provide phase lock between a received component w(k) of the range signal and a replica ŵ(k) of the received component, provided by the code generator. The code generator also provides a replica ŵ(k+1) of the next component. The loop detector responds to w(k), ŵ(k), and ŵ(k+1) to determine when the next component w(k+1) is received, and controls the code generator to supply ŵ(k+1) to the phase locked loop and to generate a replica ŵ(k+2) of the next component.

  10. Coded acoustic wave sensors and system using time diversity

    NASA Technical Reports Server (NTRS)

    Solie, Leland P. (Inventor); Hines, Jacqueline H. (Inventor)

    2012-01-01

    An apparatus and method for distinguishing between sensors that are to be wirelessly detected is provided. An interrogator device uses different, distinct time delays in the sensing signals when interrogating the sensors. The sensors are provided with different distinct pedestal delays. Sensors that have the same pedestal delay as the delay selected by the interrogator are detected by the interrogator whereas other sensors with different pedestal delays are not sensed. Multiple sensors with a given pedestal delay are provided with different codes so as to be distinguished from one another by the interrogator. The interrogator uses a signal that is transmitted to the sensor and returned by the sensor for combination and integration with the reference signal that has been processed by a function. The sensor may be a surface acoustic wave device having a differential impulse response with a power spectral density consisting of lobes. The power spectral density of the differential response is used to determine the value of the sensed parameter or parameters.
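    The combination of pedestal delays and per-sensor codes can be sketched with a toy correlation model. The delays, code length, and waveforms below are hypothetical illustrations, not values from the patent:

    ```python
    import numpy as np

    # Toy model of time-diversity interrogation: each sensor returns the
    # interrogation chip sequence delayed by its pedestal delay and multiplied
    # by its own code; the interrogator correlates the return against a
    # reference shifted by the pedestal delay (and code) it selects.
    rng = np.random.default_rng(0)
    chips = rng.choice([-1.0, 1.0], size=64)          # interrogation sequence

    def sensor_return(pedestal_delay, code, n=256):
        """Sensor echo: chips delayed by the pedestal delay, coded."""
        out = np.zeros(n)
        out[pedestal_delay:pedestal_delay + len(chips)] = chips * code
        return out

    def interrogate(signal, pedestal_delay, code, n=256):
        """Correlation of the received signal with the selected reference."""
        ref = np.zeros(n)
        ref[pedestal_delay:pedestal_delay + len(chips)] = chips * code
        return abs(float(np.dot(signal, ref)))

    code_a = rng.choice([-1.0, 1.0], size=64)
    code_b = rng.choice([-1.0, 1.0], size=64)
    rx = sensor_return(10, code_a) + sensor_return(80, code_b)  # two sensors

    match = interrogate(rx, 10, code_a)      # matched pedestal delay and code
    wrong_dly = interrogate(rx, 40, code_a)  # no sensor at this pedestal delay
    ```

    The matched delay-and-code pair produces a full-height correlation peak, while a reference at an unused pedestal delay sees only partial, code-scrambled overlap and correlates weakly, which is how sensors outside the selected delay are not sensed.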

  11. History by history statistical estimators in the BEAM code system.

    PubMed

    Walters, B R B; Kawrakow, I; Rogers, D W O

    2002-12-01

    A history by history method for estimating uncertainties has been implemented in the BEAMnrc and DOSXYZnrc codes, replacing the method of statistical batches. This method groups scored quantities (e.g., dose) by primary history. When phase-space sources are used, this method groups incident particles according to the primary histories that generated them. This necessitated adding markers (negative energy) to phase-space files to indicate the first particle generated by a new primary history. The new method greatly reduces the uncertainty in the uncertainty estimate. It also eliminates one dimension (which kept the results for each batch) from all scoring arrays, decreasing the memory requirement by a factor of 2. Correlations between particles in phase-space sources are taken into account. The only correlations with any significant impact on uncertainty are those introduced by particle recycling. Failure to account for these correlations can result in a significant underestimate of the uncertainty. The previous method of accounting for correlations due to recycling, by placing all recycled particles in the same batch, did work. Neither the new method nor the batch method takes into account correlations between incident particles when a phase-space source is restarted, so one must avoid restarts.
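    The estimator described above can be sketched as follows. This is a minimal illustration of grouping scores by primary history and computing the variance of the mean over histories, not the BEAMnrc implementation:

    ```python
    import random

    # History-by-history uncertainty estimate (sketch): scores from all
    # particles descending from one primary history are summed into a single
    # entry x_i, then  s_xbar^2 = ( <x^2> - <x>^2 ) / (N - 1)  over N histories.

    def history_by_history_uncertainty(scores_by_history):
        n = len(scores_by_history)
        x = [sum(s) for s in scores_by_history]  # one aggregate per history
        mean = sum(x) / n
        mean_sq = sum(v * v for v in x) / n
        var_of_mean = (mean_sq - mean * mean) / (n - 1)
        return mean, var_of_mean ** 0.5

    random.seed(1)
    # e.g. dose contributions from the particles of each primary history
    histories = [[random.random() for _ in range(random.randint(1, 4))]
                 for _ in range(1000)]
    mean, sigma = history_by_history_uncertainty(histories)
    ```

    Because all particles descending from one primary history are summed before the variance is taken, correlations within a history (such as recycled phase-space particles assigned to their parent history) no longer bias the uncertainty estimate, which is the point of the method.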

  12. An 8-PSK TDMA uplink modulation and coding system

    NASA Technical Reports Server (NTRS)

    Ames, S. A.

    1992-01-01

    The combination of 8-phase shift keying (8PSK) modulation and a spectral efficiency greater than 2 bits/sec/Hz drove the design of the Nyquist filter to a specified rolloff factor of 0.2. When built and tested, this filter was found to produce too much intersymbol interference and was abandoned for a design with a rolloff factor of 0.4. The preamble is limited to 100 bit periods of the uncoded bit period of 5 ns, for a maximum preamble length of 500 ns, or 40 8PSK symbol times at 12.5 ns per symbol. For 8PSK modulation, the required maximum degradation of 1 dB in -20 dB cochannel interference (CCI) drove the requirement for forward error correction coding. In this contract, the funding was not sufficient to develop the proposed codec, so the codec was limited to a paper design during the preliminary design phase. The mechanization of the demodulator is digital, starting from the output of the analog-to-digital converters which quantize the outputs of the quadrature phase detectors. This approach is amenable to an application-specific integrated circuit (ASIC) replacement in the next phase of development.

  13. Propulsion stability codes for liquid propellant propulsion systems developed for use on a PC computer

    NASA Technical Reports Server (NTRS)

    Doane, George B., III; Armstrong, Wilbur C.

    1991-01-01

    Research into component modeling and system synthesis leading to the analysis of the major types of propulsion system instabilities, together with the characterization of various component characteristics, is presented. Last year, several programs designed to run on a PC were developed for Marshall Space Flight Center. These codes covered the low, intermediate, and high frequency modes of oscillation of a liquid rocket propulsion system. No graphics were built into these programs, and only simple piping layouts were supported. This year's effort was to add run-time graphics to the low and intermediate frequency codes, allow new types of piping elements (accumulators, pumps, and split pipes) in the low frequency code, and develop a new code for the PC to generate Nyquist plots.

  14. Code System to Calculate Tornado-Induced Flow Material Transport.

    SciTech Connect

    ANDRAE, R. W.

    1999-11-18

    Version: 00 TORAC models tornado-induced flows, pressures, and material transport within structures. Its use is directed toward nuclear fuel cycle facilities and their primary release pathway, the ventilation system. However, it is applicable to other structures and can model other airflow pathways within a facility. In a nuclear facility, this network system could include process cells, canyons, laboratory offices, corridors, and offgas systems. TORAC predicts flow through a network system that also includes ventilation system components such as filters, dampers, ducts, and blowers. These ventilation system components are connected to the rooms and corridors of the facility to form a complete network for moving air through the structure and, perhaps, maintaining pressure levels in certain areas. The material transport capability in TORAC is very basic and includes convection, depletion, entrainment, and filtration of material.

  15. The next big challenge for EPs: The transition to ICD-10-CM coding system.

    PubMed

    2015-08-01

    The long-delayed transition to the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) administrative codes is set to take place in October, presenting a host of challenges for EPs. A new analysis suggests roughly a quarter of the clinical encounters that take place in the ED will involve complexity in the transition to the new system. Further, experts anticipate workflow challenges as well as new considerations when making planning decisions and reporting to public health departments. The number of codes available to providers will jump from 14,000 to 80,000 with the transition to the new coding system. Investigators found that 23% of the visits, or 27% of the codes, emergency medicine physicians use are complex. The new coding system requires much more specificity, but there are also instances in which definitions have been altered or blended together, essentially changing the concepts described. While all EPs will face some challenges with the new coding system, analysts are particularly concerned about smaller EDs and physician groups because these practices typically don't have the ICD-10-CM implementation teams that larger systems have.
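    The simple-versus-complex distinction behind these figures can be sketched with a toy mapping table. The entries below are a tiny illustrative stand-in for the official General Equivalence Mappings, not authoritative codes:

    ```python
    # Sketch of why a share of ED codes are "complex" in the ICD-9 -> ICD-10
    # transition: a code is simple when it maps 1:1 and complex when it maps to
    # several candidates requiring extra documentation. The table is a tiny
    # hypothetical subset, not the official General Equivalence Mappings (GEMs).
    gem_subset = {
        "250.00": ["E11.9"],                          # 1:1    -> simple
        "845.00": ["S93.401", "S93.402", "S93.409"],  # 1:many -> complex
        "786.50": ["R07.9"],                          # 1:1    -> simple
        "823.82": ["S82.201", "S82.202", "S82.209"],  # 1:many -> complex
    }

    def classify(icd9_code):
        return "simple" if len(gem_subset[icd9_code]) == 1 else "complex"

    complex_share = sum(classify(c) == "complex" for c in gem_subset) / len(gem_subset)
    ```

    Counting the one-to-many rows over a real mapping file is essentially how an analysis arrives at a "share of complex codes" figure for a specialty.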

  16. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    SciTech Connect

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  17. Magnetic Resonance Image Phantom Code System to Calibrate in vivo Measurement Systems.

    SciTech Connect

    HICKMAN, DAVE

    1997-07-17

    Version 00 MRIPP provides relative calibration factors for the in vivo measurement of internally deposited photon-emitting radionuclides within the human body. The code includes a database of human anthropometric structures (phantoms) constructed from whole-body Magnetic Resonance Images. The database contains a large variety of human images with varying anatomical structure. Correction factors are obtained using Monte Carlo transport of photons through the voxel geometry of the phantom. Correction factors provided by MRIPP allow users of in vivo measurement systems (e.g., whole body counters) to calibrate these systems with simple sources and obtain subject-specific calibrations. Note that the capability to format MRI data for use with this system is not included; therefore, one must use the phantom data included in this package. MRIPP provides a simple interface to perform Monte Carlo simulation of photon transport through the human body. MRIPP also provides anthropometric information (e.g., height, weight, etc.) for the individuals used to generate the phantom database. A modified Voxel version of the Los Alamos National Laboratory MCNP4A code is used for the Monte Carlo simulation. The Voxel version Fortran patch to MCNP4 and MCNP4A (Monte Carlo N-Particle transport simulation) and the MCNP executable are included in this distribution, but the MCNP Fortran source is not. MCNP4A was distributed by RSICC as CCC-200 but is now superseded by the current release, MCNP4B.

  18. Racial disparity and the legitimacy of the criminal justice system: exploring consequences for deterrence.

    PubMed

    Taxman, Faye S; Byrne, James M; Pattavina, April

    2005-11-01

    Minority (over)representation in the criminal justice system remains a puzzle, both from a policy and an intervention perspective. Cross-sectional reviews of the policies and practices of the criminal justice system often find differential rates of involvement in the criminal justice system that are associated with the nature of the criminal charge/act or characteristics of the offender; however, longitudinal reviews of the race effect often show it to be confounded by procedural and extralegal variables. This review focuses on how the cumulative policies and practices of the criminal justice system contribute to churning, or the recycling of individuals through the system. In conducting our review, we describe how the same criminal justice processes and practices adversely affect select communities. The consequences of policies and procedures that contribute to churning may affect the legitimacy of the criminal justice system as a deterrent to criminal behavior. A research agenda on issues related to legitimacy of the criminal justice system aimed at a better understanding of how this affects individual and community behavior is presented.

  19. Development of New Generation of Multibody System Computer Codes

    DTIC Science & Technology

    2012-11-02

    Cited: Roberson, R.E., and Schwertassek, R., 1988, Dynamics of Multibody Systems, Springer-Verlag, Berlin. Indexed excerpt: ...vehicle, machine, aerospace, biomechanics, and biological system components such as tires, belt drives, rubber chains, soil, cables, ligaments, soft...

  20. Wave-front coded optical readout for the MEMS-based uncooled infrared imaging system

    NASA Astrophysics Data System (ADS)

    Li, Tian; Zhao, Yuejin; Dong, Liquan; Liu, Xiaohua; Jia, Wei; Hui, Mei; Yu, Xiaomei; Gong, Cheng; Liu, Weiyu

    2012-11-01

    In the space-limited MEMS-based uncooled infrared imaging system, adjusting the optical readout part is inconvenient. This paper proposes a wave-front coding method to extend the depth of focus/field of the optical readout system, solving the problem above and reducing the demand for precision in the fabrication and assembly of the optical readout system itself. The wave-front coded imaging system consists of optical coding and digital decoding. By adding a CPM (Cubic Phase Mask) on the pupil plane, the system becomes insensitive to defocus within an extended range: its PSFs remain similar, so almost equally blurred intermediate images are obtained. Sharp images can then be recovered by image restoration algorithms using the common PSF as the decoding kernel. For comparison, we studied a conventional optical imaging system with the same optical performance as the wave-front coded one. Analogue imaging experiments were carried out, and one PSF was used as a simple direct inverse filter for image restoration. Relatively sharp restored images were obtained, whereas the analogue defocused images from the conventional system were badly degraded. Using the decrease of the MTF as a criterion, we found that the depth of focus/field of the wave-front coding system was extended significantly.
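    The defocus invariance that a cubic phase mask (CPM) buys can be sketched numerically: build a pupil with a defocus term and a cubic phase term, form the PSF by Fourier transform, and compare MTFs. Pupil sampling, mask strength, and defocus below are illustrative values, not the parameters of the paper's optical readout system:

    ```python
    import numpy as np

    # Numerical sketch of wave-front coding with a cubic phase mask (CPM).
    # All parameters are illustrative, not those of the paper's system.
    N = 256
    x = np.linspace(-1.0, 1.0, N)
    X, Y = np.meshgrid(x, x)
    aperture = ((X**2 + Y**2) <= 1.0).astype(float)

    def mtf(defocus_waves, alpha_waves):
        """MTF for a pupil with defocus W20*rho^2 plus cubic mask alpha*(x^3+y^3)."""
        opd = defocus_waves * (X**2 + Y**2) + alpha_waves * (X**3 + Y**3)
        pupil = aperture * np.exp(2j * np.pi * opd)
        psf = np.abs(np.fft.fft2(pupil))**2
        psf /= psf.sum()
        return np.abs(np.fft.fft2(psf))   # OTF magnitude; DC term equals 1

    def similarity(a, b):
        return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

    # Without the CPM, 3 waves of defocus change the MTF drastically; with a
    # strong CPM (10 waves) the MTF stays nearly defocus-invariant.
    plain = similarity(mtf(0.0, 0.0), mtf(3.0, 0.0))
    coded = similarity(mtf(0.0, 10.0), mtf(3.0, 10.0))
    ```

    The near-invariant coded MTF is what allows a single decoding kernel (one PSF used as an inverse filter, as in the experiments above) to restore images across the whole extended range.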

  1. Code System for Three-Dimensional Hydraulic Reactor Core Analysis.

    SciTech Connect

    BENEDETTI, ROBERT L.

    2001-03-05

    Version 00 SCORE-EVET was developed to study multidimensional transient fluid flow in nuclear reactor fuel rod arrays. The conservation equations used were derived by volume averaging the transient compressible three-dimensional local continuum equations in Cartesian coordinates. No assumptions associated with subchannel flow have been incorporated into the derivation of the conservation equations. In addition to the three-dimensional fluid flow equations, the SCORE-EVET code contains a one-dimensional steady state solution scheme to initialize the flow field, steady state and transient fuel rod conduction models, and comprehensive correlation packages to describe fluid-to-fuel rod interfacial energy and momentum exchange. Velocity and pressure boundary conditions can be specified as a function of time and space to model reactor transient conditions, such as a hypothesized loss-of-coolant accident (LOCA) or flow blockage. The basic volume-averaged transient three-dimensional equations for flow in porous media are solved in their general form with constitutive relationships and boundary conditions tailored to define the porous medium as a matrix of fuel rods. By retaining generality in the form of the conservation equations, a wide range of fluid flow problem configurations, from computational regions representing a single fuel rod subchannel to multichannels, or even regions without a fuel rod, can be modeled without restrictive assumptions. The completeness of the conservation equations has allowed SCORE-EVET to be used, with modification to the constitutive relationships, to calculate three-dimensional laminar boundary layer development, flow fields in large bodies of water, and, with the addition of a turbulence model, turbulent flow in pipe expansions and tees.

  2. International Space Station Electric Power System Performance Code-SPACE

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey; McKissock, David; Fincannon, James; Green, Robert; Kerslake, Thomas; Delleur, Ann; Follo, Jeffrey; Trudell, Jeffrey; Hoffman, David J.; Jannette, Anthony; Rodriguez, Carlos

    2005-01-01

    The System Power Analysis for Capability Evaluation (SPACE) software analyzes and predicts the minute-by-minute state of the International Space Station (ISS) electrical power system (EPS) for upcoming missions, as well as EPS power generation capacity as a function of ISS configuration and orbital conditions. In order to complete the Certification of Flight Readiness (CoFR) process, in which the mission is certified for flight, each ISS system must thoroughly assess every proposed mission to verify that the system will support the planned mission operations; SPACE is the sole tool used to conduct these assessments for power system capability. SPACE is an integrated power system model that incorporates a variety of modules tied together with integration routines and graphical output. The modules include orbit mechanics; solar array pointing, shadowing, thermal, and electrical performance; battery performance; and power management and distribution performance. These modules are tightly integrated within a flexible architecture featuring data-file-driven configurations, source- or load-driven operation, and event scripting. SPACE also predicts the amount of power available for a given system configuration, spacecraft orientation, solar-array-pointing conditions, orbit, and the like. In the source-driven mode, the model must ensure that energy balance is achieved, meaning that energy removed from the batteries must be restored (or balanced) each and every orbit. This entails an optimization scheme to ensure that energy balance is maintained without violating any other constraints.
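    The per-orbit energy-balance constraint in source-driven mode can be sketched as a one-line solve: the battery energy removed during eclipse must be restored during the sunlit arc. All numbers below are hypothetical, not ISS values:

    ```python
    # Toy orbit-average energy-balance solve in the spirit of a source-driven
    # power-system model (hypothetical numbers, not ISS values).
    def max_balanced_load(array_power_w, orbit_min=92.0, eclipse_min=36.0,
                          charge_eff=0.9, discharge_eff=0.9):
        """Largest continuous load (W) for which the battery energy removed in
        eclipse is fully restored during the sunlit arc of every orbit.
        Balance: (array - load) * sun * charge_eff == load * eclipse / discharge_eff
        """
        sun_min = orbit_min - eclipse_min
        return array_power_w * sun_min * charge_eff / (
            sun_min * charge_eff + eclipse_min / discharge_eff)

    load = max_balanced_load(10000.0)   # continuous load that closes the balance
    ```

    Any load above this value leaves the batteries slightly more depleted each orbit, which is exactly the condition an energy-balance assessment must flag.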

  3. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants.

    PubMed

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders-attention to detail and systemizing-may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  4. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  5. Intensity-Invariant Coding in the Auditory System

    PubMed Central

    Barbour, Dennis L.

    2011-01-01

    The auditory system faithfully represents sufficient details from sound sources such that downstream cognitive processes are capable of acting upon this information effectively even in the face of signal uncertainty, degradation or interference. This robust sound source representation leads to an invariance in perception vital for animals to interact effectively with their environment. Due to unique nonlinearities in the cochlea, sound representations early in the auditory system exhibit a large amount of variability as a function of stimulus intensity. In other words, changes in stimulus intensity, such as for sound sources at differing distances, create a unique challenge for the auditory system to encode sounds invariantly across the intensity dimension. This challenge and some strategies available to sensory systems to eliminate intensity as an encoding variable are discussed, with a special emphasis upon sound encoding. PMID:21540053

  6. Novel secure and bandwidth efficient optical code division multiplexed system for future access networks

    NASA Astrophysics Data System (ADS)

    Singh, Simranjit

    2016-12-01

    In this paper, a spectrally coded optical code division multiple access (OCDMA) system using a hybrid modulation scheme has been investigated. The idea is to propose an effective approach for simultaneous improvement of the system capacity and security. The data formats NRZ (non-return to zero), DQPSK (differential quadrature phase shift keying), and PolSK (polarisation shift keying) are used to obtain the orthogonally modulated signal. It is observed that the proposed hybrid modulation provides efficient utilisation of bandwidth, increases the data capacity, and enhances data confidentiality over existing OCDMA systems. Further, the proposed system's performance is compared with current state-of-the-art OCDMA schemes.

  7. Field-based tests of geochemical modeling codes: New Zealand hydrothermal systems

    SciTech Connect

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1993-12-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal field suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions.
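    The affinity plotted on such affinity-temperature diagrams is, in one common convention, A = 2.303·R·T·log10(Q/K) for the precipitation direction, where Q is the ion activity product and K the equilibrium constant. A minimal sketch with illustrative values (not EQ3/6 output):

    ```python
    R = 8.314  # gas constant, J/(mol*K)

    def affinity_kj(log10_Q, log10_K, temp_c):
        """Thermodynamic affinity (kJ/mol) of the precipitation direction,
        A = 2.303*R*T*log10(Q/K): positive means the fluid is supersaturated
        with the mineral, negative means undersaturated. One common sign
        convention; values here are illustrative."""
        T = temp_c + 273.15
        return 2.303 * R * T * (log10_Q - log10_K) / 1000.0

    # Illustrative only: quartz dissolution SiO2(qtz) = SiO2(aq) at 200 C with
    # an assumed log10 K of -2.43 and a measured silica activity of 10**-2.0.
    a = affinity_kj(log10_Q=-2.0, log10_K=-2.43, temp_c=200.0)
    ```

    Plotting A against T for each candidate mineral is what builds the diagram: phases whose affinity curves sit near zero across the sampled temperatures remain plausible equilibrium assemblages even when uncertainties in the thermodynamic data shift any single-temperature prediction.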

  8. Field-based tests of geochemical modeling codes using New Zealand hydrothermal systems

    SciTech Connect

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1994-06-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal field suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions.

  9. EBT reactor systems analysis and cost code: description and users guide (Version 1)

    SciTech Connect

    Santoro, R.T.; Uckan, N.A.; Barnes, J.M.; Driemeyer, D.E.

    1984-06-01

    An ELMO Bumpy Torus (EBT) reactor systems analysis and cost code that incorporates the most recent advances in EBT physics has been written. The code determines a set of reactors that fall within an allowed operating window determined from the coupling of ring and core plasma properties and the self-consistent treatment of the coupled ring-core stability and power balance requirements. The essential elements of the systems analysis and cost code are described, along with the calculational sequences leading to the specification of the reactor options and their associated costs. The input parameters, the constraints imposed upon them, and the operating range over which the code provides valid results are discussed. A sample problem and the interpretation of the results are also presented.

  10. Auto Code Generation for Simulink-Based Attitude Determination Control System

    NASA Technical Reports Server (NTRS)

    MolinaFraticelli, Jose Carlos

    2012-01-01

    This paper details the work done to auto-generate C code from a Simulink-based Attitude Determination Control System (ADCS) for use on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for hardware-in-the-loop testing of satellite components in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as-is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can introduce new complications: the execution order of the models can change as a result. Great care must be taken to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, the process can be considered a success, since all the output requirements are met. Based on these results, it can be argued that the generated C code can be used effectively by any desired platform as long as it follows the specific memory requirements established in the Simulink model.

  11. Modelling of Be Disks in Binary Systems Using the Hydrodynamic Code PLUTO

    NASA Astrophysics Data System (ADS)

    Cyr, I. H.; Panoglou, D.; Jones, C. E.; Carciofi, A. C.

    2016-11-01

    The study of the gas structure and dynamics of Be star disks is critical to our understanding of the Be star phenomenon. The central star is the major force driving the evolution of these disks; however, other external forces, such as the gravitational torque produced in a close binary system, may also affect the formation of the disk. We are interested in understanding the gravitational effects of a low-mass binary companion on the formation and growth of a disk in a close binary system. To study these effects, we used the grid-based hydrodynamic code PLUTO. Because this code has not been used to study such systems before, we compared our simulations against codes used in previous work on binary systems. We were able to simulate the formation of a disk in both an isolated and a binary system. Our current results suggest that PLUTO is in fact a well-suited tool for studying the dynamics of Be disks.

  12. Medical terminology coding systems and medicolegal death investigation data: searching for a standardized method of electronic coding at a statewide medical examiner's office.

    PubMed

    Lathrop, Sarah L; Davis, Wayland L; Nolte, Kurt B

    2009-01-01

    Medical examiner and coroner reports are a rich source of data for epidemiologic research. To maximize the utility of this information, medicolegal death investigation data need to be electronically coded. In order to determine the best option for coding, we evaluated four different options (Current Procedural Terminology [CPT], International Classification of Disease [ICD] coding, Systematized Nomenclature of Medicine Clinical Terms [SNOMED CT], and an in-house system), then conducted internal and external needs assessments to determine which system best met the needs of a centralized, statewide medical examiner's office. Although all four systems offer distinct advantages and disadvantages, SNOMED CT is the most accurate for coding pathologic diagnoses, with ICD-10 the best option for classifying the cause of death. For New Mexico's Office of the Medical Investigator, the most feasible coding option is an upgrade of an in-house coding system, followed by linkage to ICD codes for cause of death from the New Mexico Bureau of Vital Records and Health Statistics, and ideally, SNOMED classification of pathologic diagnoses.
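    The linkage strategy described above, an in-house coding system joined to ICD codes for cause of death, can be sketched as a simple lookup table. All labels, codes, the fallback value, and the function name below are illustrative assumptions, not the OMI's actual system:

```python
# Illustrative linkage sketch: in-house diagnosis labels mapped to ICD-10
# cause-of-death codes. The labels, the mapping, and the fallback code
# are examples for illustration, not an authoritative crosswalk.
IN_HOUSE_TO_ICD10 = {
    "acute myocardial infarction": "I21",
    "assault by handgun discharge": "X93",
    "assault by blunt object": "Y00",
}

def icd10_for(diagnosis: str) -> str:
    """Return the linked ICD-10 code, or R99 (ill-defined cause) if unmapped."""
    return IN_HOUSE_TO_ICD10.get(diagnosis.lower(), "R99")
```

In practice the join would run against records from the Bureau of Vital Records rather than a hard-coded table; the sketch only shows the shape of the linkage.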

  13. Central nervous system-immune system interactions: psychoneuroendocrinology of stress and its immune consequences.

    PubMed Central

    Black, P H

    1994-01-01

    Psychoneuroimmunology is a relatively new discipline which deals with CNS-immune system interactions. The evidence for such interactions is reviewed, as is the neuroendocrinologic response to stress. Recent evidence indicates that the behavioral, nervous system, and neuroendocrine responses to stress are mediated by hypothalamic CRF, which acts on both the sympathetic nervous system and the HPA axis, resulting in increased levels of corticosteroids, catecholamines, and certain opiates, substances which are generally immunosuppressive. Concentrations of growth hormone and prolactin, which are immunoenhancing, are elevated early during the response to stress but are later suppressed. Although several other neuromediators may also be released with stress, the net effect of a variety of acute stressors is downregulation of immune system function. In this minireview, I consider whether stress alters the resistance of the host to infection, as well as the immunomodulatory effects of released immune system mediators on the brain. PMID:8141561

  14. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    PubMed Central

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  15. Efficient Frequency Sharing of Baseband and Subcarrier Coding UHF RFID Systems

    NASA Astrophysics Data System (ADS)

    Mitsugi, Jin; Kawakita, Yuusuke

    UHF band passive RFID systems are being steadily adopted by industries because of their capability for long-range automatic identification with passive tags. For an application which demands a large number of readers located in a limited geographical area, referred to as dense reader mode, interference rejection among readers is important. The coding method in the tag-to-reader communication link, baseband or subcarrier coding, has a significant influence on interference rejection performance. This paper examines the frequency sharing of baseband and subcarrier coding UHF RFID systems from the perspective of their transmission delay, using a media access control (MAC) simulator. The validity of the numerical simulation was verified by an experiment. It is revealed that, in mixed operation of baseband and subcarrier systems, the general principle for efficient frequency sharing is to assign as many channels as possible to the baseband system, provided those channels do not overlap the subcarrier channels. This frequency sharing principle benefits both baseband and subcarrier coding systems; otherwise, mixed operation fundamentally increases the transmission delay of subcarrier coding systems.
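    The frequency-sharing principle above (give baseband readers every channel that subcarrier-coding readers do not occupy) can be sketched as a simple allocation function. Channel numbers and the function name are hypothetical, not taken from the paper:

```python
def assign_channels(all_channels, subcarrier_channels):
    """Sketch of the frequency-sharing principle: baseband readers take
    every channel that subcarrier-coding readers do not occupy.
    Channel numbering here is hypothetical."""
    sub = set(subcarrier_channels)
    baseband = [ch for ch in all_channels if ch not in sub]
    return baseband, sorted(sub)

# Nine channels, two reserved for subcarrier readers; the rest go to baseband.
baseband, subcarrier = assign_channels(range(1, 10), {2, 5})
```

The point of the principle is that baseband readers tolerate adjacent activity poorly, so packing them into the non-subcarrier channels keeps the subcarrier systems' transmission delay from growing.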

  16. The development of efficient coding for an electronic mail system

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1983-01-01

    Techniques for efficiently representing scanned electronic documents were investigated. Major results include the definition and preliminary performance results of a Universal System for Efficient Electronic Mail (USEEM), offering a potential order of magnitude improvement over standard facsimile techniques for representing textual material.

  17. Risk from a pressurized toxic gas system: Part 2, Dispersal consequences

    SciTech Connect

    Brereton, S.J.; Martin, D.; Lane, S.G.; Altenbach, T.J.

    1995-04-01

    During the preparation of a Safety Analysis Report at the Lawrence Livermore National Laboratory, we studied the release of chlorine from a compressed gas experimental apparatus. This paper presents the second part in a series of two papers on this topic. The first paper focuses on the frequency of an unmitigated release from the system; this paper discusses the consequences of the release. The release of chlorine from the experimental apparatus was modeled as an unmitigated blowdown through a 0.25 inch (0.0064 m) outside diameter tube. The physical properties of chlorine were considered as were the dynamics of the fluid flow problem. The calculated release rate was used as input for the consequence assessment. Downwind concentrations as a function of time were evaluated and then compared to suggested guidelines. For comparison purposes, a typical water treatment plant was briefly studied. The lower hazard presented by the LLNL operation becomes evident when its release is compared to the release of material from a water treatment plant, a hazard which is generally accepted by the public.
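    The paper's dispersion model is not reproduced here, but the downwind-concentration step it describes can be sketched with a minimal ground-level Gaussian plume, assuming crude linear dispersion coefficients (all parameter values and the function name are placeholders, not the authors' model):

```python
import math

def ground_level_concentration(q, u, x):
    """Centerline, ground-level concentration (kg/m^3) downwind of a
    continuous ground release: C = Q / (pi * u * sigma_y * sigma_z).
    The linear growth of sigma_y and sigma_z with distance x is a crude
    placeholder, not the paper's dispersion model.
    q: release rate (kg/s); u: wind speed (m/s); x: downwind distance (m)."""
    sigma_y = 0.08 * x   # horizontal dispersion (m), hypothetical coefficient
    sigma_z = 0.06 * x   # vertical dispersion (m), hypothetical coefficient
    return q / (math.pi * u * sigma_y * sigma_z)
```

Comparing such estimates at successive distances against exposure guidelines, as the paper does, then reduces to evaluating this function over a range of x.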

  18. Risk from a pressurized toxic gas system: Part 2, Dispersal consequences

    SciTech Connect

    Brereton, S.J.; Altenbach, T.J.; Lane, S.G.; Martin, D.W.

    1995-02-01

    During the preparation of a Safety Analysis Report at the Lawrence Livermore National Laboratory, we studied the release of chlorine from a compressed gas experimental apparatus. This paper presents the second part in a series of two papers on this topic. The first paper focuses on the frequency of an unmitigated release from the system; this paper focuses on the consequences of the release. The release of chlorine from the experimental apparatus was modeled as an unmitigated blowdown through a 0.25 inch (0.0064 m) outside diameter tube. The physical properties of chlorine were considered, as were the dynamics of the fluid flow problem. The calculated release rate was used as input for the consequence assessment. Downwind concentrations as a function of time were evaluated and then compared to suggested guidelines. For comparison purposes, a typical water treatment plant was briefly studied. The lower hazard presented by the LLNL operation becomes evident when its release is compared to the release of material from a water treatment plant, a hazard which is generally accepted by the public.

  19. The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions

    SciTech Connect

    Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.

    2016-01-15

    A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates the various nuclear reaction models needed to describe nucleon, light charged nuclei (up to alpha-particle), and photon induced reactions. The code is written in the C++ programming language using object-oriented technology. At first, it was applied to neutron-induced reaction data on actinides, which were compiled into the JENDL Actinide File 2008 and JENDL-4.0. It has been extensively used in various nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded for nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, including the concept and design of the code and its inputs. Details of the formulation for modeling the direct, pre-equilibrium, and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions, and β-delayed neutron emission are discussed.

  20. The CCONE Code System and its Application to Nuclear Data Evaluation for Fission and Other Reactions

    NASA Astrophysics Data System (ADS)

    Iwamoto, O.; Iwamoto, N.; Kunieda, S.; Minato, F.; Shibata, K.

    2016-01-01

    A computer code system, CCONE, was developed for nuclear data evaluation within the JENDL project. The CCONE code system integrates the various nuclear reaction models needed to describe nucleon, light charged nuclei (up to alpha-particle), and photon induced reactions. The code is written in the C++ programming language using object-oriented technology. At first, it was applied to neutron-induced reaction data on actinides, which were compiled into the JENDL Actinide File 2008 and JENDL-4.0. It has been extensively used in various nuclear data evaluations for both actinide and non-actinide nuclei. The CCONE code has been upgraded for nuclear data evaluation at higher incident energies for neutron-, proton-, and photon-induced reactions. It was also used for estimating β-delayed neutron emission. This paper describes the CCONE code system, including the concept and design of the code and its inputs. Details of the formulation for modeling the direct, pre-equilibrium, and compound reactions are presented. Applications to nuclear data evaluations such as neutron-induced reactions on actinides and medium-heavy nuclei, high-energy nucleon-induced reactions, photonuclear reactions, and β-delayed neutron emission are discussed.

  1. Object-adaptive depth compensated inter prediction for depth video coding in 3D video system

    NASA Astrophysics Data System (ADS)

    Kang, Min-Koo; Lee, Jaejoon; Lim, Ilsoon; Ho, Yo-Sung

    2011-01-01

    Nowadays, the 3D video system using the MVD (multi-view video plus depth) data format is being actively studied. The system has many advantages with respect to virtual view synthesis, such as auto-stereoscopic functionality, but compression of the huge input data remains a problem. Efficient 3D data compression is therefore extremely important in the system, and the problems of low temporal consistency and viewpoint correlation should be resolved for efficient depth video coding. In this paper, we propose an object-adaptive depth compensated inter prediction method to resolve these problems, in which the object-adaptive mean-depth difference between the current block to be coded and a reference block is compensated during inter prediction. In addition, unique properties of depth video are exploited to reduce the side information required to signal the decoder to conduct the same process. To evaluate the coding performance, we implemented the proposed method in the MVC (multiview video coding) reference software, JMVC 8.2. Experimental results demonstrate that our proposed method is especially efficient for depth videos estimated by DERS (depth estimation reference software), discussed in the MPEG 3DV coding group. The coding gain was up to 11.69% bit-saving, and it increased further when evaluated on synthesized views of virtual viewpoints.
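    The core compensation step, removing the mean-depth difference between the current and reference blocks before forming the residual, can be sketched as follows. The block values and helper name are illustrative, not the authors' exact scheme:

```python
def mean_depth_compensated_residual(current, reference):
    """Sketch of depth-compensated inter prediction: the mean-depth
    difference between the current and reference blocks is removed
    before the residual is formed (an illustration of the idea, not
    the authors' exact scheme)."""
    offset = sum(current) / len(current) - sum(reference) / len(reference)
    compensated = [r + offset for r in reference]
    residual = [c - p for c, p in zip(current, compensated)]
    return residual, offset

# A block whose depths differ from the reference by a constant shift
# compresses to an all-zero residual plus one offset value.
residual, offset = mean_depth_compensated_residual([10, 12, 14, 16], [2, 4, 6, 8])
```

A decoder performing the same mean computation can reconstruct the block from the residual and the (small) offset, which is why the scheme reduces the side information that must be signaled.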

  2. Hierarchical security system using real-valued data and orthogonal code in Fourier domain

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Jun; Seo, Dong-Hoan; Hwang, Kwang-Il; Lim, Tae-Woo

    2014-02-01

    We propose a novel hierarchical encryption scheme using orthogonal codes in the Fourier domain, with decryption based on an interferometer system. The proposed system is composed of hierarchical ciphertexts with positive real values, which can be applied to practical transmission channels such as the Internet, and decryption keys with real-valued functions that have orthogonal characteristics in the decryption system. Since the original information is encrypted on the Fourier plane, the proposed encryption is more tolerant to loss of key information by scratching or cutting than encryption in the spatial domain. The resulting image, obtained using a Fourier transform and an interferometer system with a constant phase retarder, is then decrypted by use of a ciphertext with a different security level and each of the decryption keys, which are made by multiplying an orthogonal code with a random phase code in order to enhance the level of security. We demonstrate the efficiency of the proposed method and its fault tolerance to data loss through several simulations.
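    The orthogonality property that such decryption keys rely on can be illustrated with Walsh codes: distinct codes have zero inner product, so superposed channels can be separated by correlation. This is a conceptual sketch of the mathematical idea only, not the paper's optical implementation:

```python
# Length-4 Walsh codes (rows of a Hadamard matrix); any two distinct
# rows have zero inner product.
WALSH4 = [
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
    [1, -1, -1,  1],
]

def encode(bit, code):          # bit in {+1, -1}
    return [bit * c for c in code]

def decode(signal, code):       # correlate against the key code, take the sign
    return 1 if sum(s * c for s, c in zip(signal, code)) > 0 else -1

# Two channels share the medium; each is recovered only with its own code.
mixed = [a + b for a, b in zip(encode(+1, WALSH4[1]), encode(-1, WALSH4[2]))]
```

Correlating `mixed` with `WALSH4[1]` cancels the other channel's contribution entirely, which is the same separation property the hierarchical keys exploit.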

  3. Friendly Fire: Biological Functions and Consequences of Chromosomal Targeting by CRISPR-Cas Systems

    PubMed Central

    Heussler, Gary E.

    2016-01-01

    Clustered regularly interspaced short palindromic repeat (CRISPR)-associated (Cas) systems in bacteria and archaea target foreign elements, such as bacteriophages and conjugative plasmids, through the incorporation of short sequences (termed spacers) from the foreign element into the CRISPR array, thereby allowing sequence-specific targeting of the invader. Thus, CRISPR-Cas systems are typically considered a microbial adaptive immune system. While many of these incorporated spacers match targets on bacteriophages and plasmids, a noticeable number are derived from chromosomal DNA. While usually lethal to the self-targeting bacteria, in certain circumstances, these self-targeting spacers can have profound effects in regard to microbial biology, including functions beyond adaptive immunity. In this minireview, we discuss recent studies that focus on the functions and consequences of CRISPR-Cas self-targeting, including reshaping of the host population, group behavior modification, and the potential applications of CRISPR-Cas self-targeting as a tool in microbial biotechnology. Understanding the effects of CRISPR-Cas self-targeting is vital to fully understanding the spectrum of function of these systems. PMID:26929301
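    Conceptually, a spacer is self-targeting when its sequence occurs in the host chromosome. A toy scan (with invented sequences; real protospacer recognition also involves PAM sites and tolerates mismatches) might look like:

```python
def self_targeting_spacers(spacers, chromosome):
    """Return the spacers whose sequence occurs in the host chromosome.
    Exact substring matching only; real CRISPR targeting also requires
    a PAM site and tolerates some mismatches."""
    return [s for s in spacers if s in chromosome]

chromosome = "ATGGCGTACGTTAGCCGTAACGTT"   # invented host sequence
spacers = ["TACGTT", "GGGGGG"]            # invented spacer sequences
```

Scans of this shape (against real genomes and with PAM-aware matching) are how studies identify the chromosome-derived spacers discussed above.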

  4. "Going solid": a model of system dynamics and consequences for patient safety

    PubMed Central

    Cook, R; Rasmussen, J

    2005-01-01

    

    Rather than being a static property of hospitals and other healthcare facilities, safety is dynamic and often changes on short time scales. In the past most healthcare delivery systems were loosely coupled—that is, activities and conditions in one part of the system had only limited effect on those elsewhere. Loose coupling allowed the system to buffer many conditions such as short-term surges in demand. Modern management techniques and information systems have allowed facilities to reduce inefficiencies in operation. One side effect is the loss of buffers that previously accommodated demand surges. As a result, situations occur in which activities in one area of the hospital become critically dependent on seemingly insignificant events in seemingly distant areas. This tight coupling condition is called "going solid". Rasmussen's dynamic model of risk and safety can be used to formulate a model of patient safety dynamics that includes "going solid" and its consequences. Because the model addresses the dynamic aspects of safety, it is particularly suited to understanding current conditions in modern healthcare delivery and the way these conditions may lead to accidents. PMID:15805459

  5. Different features of work systems in Indonesia and their consequent approaches.

    PubMed

    Manuaba, A

    1997-12-01

    Indonesia, with its ultimate development goal of "developing the people and the community as a whole," is in fact facing problems in the execution of this goal. With a population of more than 200 million persons, differing in sociocultural background, educational level, and environmental conditions, it is understandable that the process and results of technological choices and transfers for various target groups will differ. A wide range of work systems is found, from the simplest man-tool system to the most complex. The conditions become even more complex, a phenomenon especially evident through studies of their sociocultural, psychological, and environmental factors. As a consequence, if Indonesia is to anticipate and understand its role in the global competition that lies ahead, a very wise approach, using local values that are often based on traditional habits and customs in a modern context, should be carried out. This approach will require an immense amount of time, dedication, and effort. Improvement endeavors carried out in different work systems, across different types of activities and industries, showed that if the improvements are to be sustained, a holistic, systemic, and interdisciplinary participatory approach should be taken, in which the technical, economic, ergonomic, sociocultural, energy, and environmental factors all play significant roles.

  6. Homebirth, freebirth and doulas: casualty and consequences of a broken maternity system.

    PubMed

    Dahlen, H G; Jackson, M; Stevens, J

    2011-03-01

    In Australia private homebirth remains unfunded and uninsured, and publicly funded homebirth models are not widely available. Doulas are increasingly hired by women for support during childbirth, and freebirth (birth intentionally unattended by a health professional) appears to be on the rise. The recently released Improving Maternity Services in Australia--The Report of the Maternity Services Review (MSR) excludes homebirth from the funding and insurance reforms proposed. Drawing on recent research we argue that homebirth has become a casualty of a broken maternity system. The recent rise in the numbers of women employing doulas and choosing to birth at home unattended by any health professional is, we argue, in part a consequence of not adequately meeting women's needs for continuity of midwifery care and non-medicalised birthing options.

  7. Implementing a bar-coded bedside medication administration system.

    PubMed

    Yates, Cindy

    2007-01-01

    Hospitals across the nation are struggling to implement electronic medication administration and reporting (eMAR) systems as part of patient safety programs. St Luke's Hospital in Chesterfield, Mo, launched its eMAR initiative in June 2003 and began program start-up in September 2004. This case study documents how the project was approached, its overall success, and what was learned along the way. Also included is a recent update highlighting the expansion of St Luke's patient safety initiative, adapting eMAR to two specialty units: dialysis and laboratory processes.

  8. Blood withdrawal affects iron store dynamics in primates with consequences on monoaminergic system function.

    PubMed

    Hyacinthe, C; De Deurwaerdere, P; Thiollier, T; Li, Q; Bezard, E; Ghorayeb, I

    2015-04-02

    Iron homeostasis is essential for the integrity of brain monoaminergic functions, and its deregulation might be involved in neurological movement disorders such as restless legs syndrome (RLS). Although iron metabolism breakdown appears concomitantly with monoaminergic system dysfunction in iron-deficient rodents and in RLS patients, the direct consequences of peripheral iron deficiency in the central nervous system (CNS) of non-human primates have received little attention. Here, we evaluated the impact of peripheral iron depletion on brain monoamine levels in macaque monkeys. After documenting circadian variations of iron and iron-related proteins (hemoglobin, ferritin and transferrin) in both serum and cerebrospinal fluid (CSF) of normal macaques, repeated blood withdrawals (RBW) were used to reduce peripheral iron-related parameter levels. Decreased serum iron levels were paradoxically associated with increased CSF iron concentrations. Despite limited consequences on tissue monoamine contents (dopamine - DA, 3,4-dihydroxyphenylacetic acid - DOPAC, homovanillic acid, L-3,4-dihydroxyphenylalanine - L-DOPA, 5-hydroxytryptamine - 5-HT, 5-hydroxyindoleacetic acid - 5-HIAA, and noradrenaline) measured with post-mortem chromatography, we found distinct and region-dependent relationships of these tissue concentrations with CSF iron and/or serum iron and/or blood hemoglobin. Additionally, striatal extracellular DA, DOPAC and 5-HIAA levels evaluated by in vivo microdialysis showed a substantial increase, suggesting an overall increase in both DA and 5-HT tones. Finally, a trending increase in general locomotor activity, measured by actimetry, was observed in the most serum iron-depleted macaques. Taken together, our data are compatible with an increase in nigrostriatal DAergic function in the event of iron deficiency and point to a specific alteration of the 5-HT/DA interaction in the CNS that is possibly involved in the etiology of RLS.

  9. Monte Carlo capabilities of the SCALE code system

    DOE PAGES

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  10. Monte Carlo capabilities of the SCALE code system

    SciTech Connect

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; Bekar, Kursat B.; Wiarda, Dorothea; Celik, Cihangir; Perfetti, Christopher M.; Ibrahim, Ahmad M.; Hart, S. W. D.; Dunn, Michael E.; Marshall, William J.

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  11. Input coding for neuro-electronic hybrid systems.

    PubMed

    George, Jude Baby; Abraham, Grace Mathew; Singh, Katyayani; Ankolekar, Shreya M; Amrutur, Bharadwaj; Sikdar, Sujit Kumar

    2014-12-01

    Liquid State Machines have been proposed as a framework to explore the computational properties of neuro-electronic hybrid systems (Maass et al., 2002). Here the neuronal culture implements a recurrent network and is followed by an array of linear discriminants implemented using perceptrons in electronics/software. In this framework, it is therefore desired that the outputs of the neuronal network, corresponding to different inputs, be linearly separable. Previous studies have demonstrated this by using only a small set of input stimulus patterns to the culture (Hafizovic et al., 2007), a large number of input electrodes (Dockendorf et al., 2009), or complex schemes to post-process the outputs of the neuronal culture prior to linear discrimination (Ortman et al., 2011). In this study we explore ways to temporally encode inputs into stimulus patterns using a small set of electrodes such that the neuronal culture's output can be directly decoded by simple linear discriminants based on perceptrons. We demonstrate that the network can detect the timing and order of firing of inputs on multiple electrodes. Based on this, we demonstrate that the neuronal culture can be used as a kernel to transform inputs which are not linearly separable in a low-dimensional space into outputs in a high dimension where they are linearly separable. Simple linear discriminants can thus be directly connected to the outputs of the neuronal culture, allowing the implementation of any function for such a hybrid system.
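    The kernel idea in this framework, transforming inputs that are not linearly separable into a higher-dimensional space where a perceptron can separate them, can be illustrated with the classic XOR-style example. The feature map below is a stand-in for the culture's transform, not a model of it:

```python
# XOR-style labels (same sign vs. opposite sign) are not linearly
# separable in the raw 2-D input space, but become separable after a
# simple nonlinear expansion; the product term x1*x2 stands in for the
# neuronal culture's high-dimensional transform.
def feature_map(x1, x2):
    return (x1, x2, x1 * x2)

def perceptron_readout(features, weights=(0.0, 0.0, 1.0)):
    # A fixed linear discriminant in the expanded space.
    return 1 if sum(f * w for f, w in zip(features, weights)) > 0 else -1

data = [((+1, +1), +1), ((-1, -1), +1), ((+1, -1), -1), ((-1, +1), -1)]
predictions = [perceptron_readout(feature_map(*x)) for x, _ in data]
```

No linear discriminant on the raw pairs alone can reproduce these labels; after the expansion a single fixed weight vector suffices, which is the separability property the study seeks from the culture's outputs.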

  12. Monte Carlo Capabilities of the SCALE Code System

    NASA Astrophysics Data System (ADS)

    Rearden, B. T.; Petrie, L. M.; Peplow, D. E.; Bekar, K. B.; Wiarda, D.; Celik, C.; Perfetti, C. M.; Ibrahim, A. M.; Hart, S. W. D.; Dunn, M. E.

    2014-06-01

    SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a "plug-and-play" framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE's graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2, to be released in 2014, will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  13. In Vivo Imaging Reveals Composite Coding for Diagonal Motion in the Drosophila Visual System

    PubMed Central

    Zhou, Wei; Chang, Jin

    2016-01-01

    Understanding information coding is important for resolving the functions of visual neural circuits. The motion vision system is a classic model for studying information coding, as it contains a concise and complete information-processing circuit. In Drosophila, the axon terminals of motion-detection neurons (T4 and T5) project to the lobula plate, which comprises four regions that respond to the four cardinal directions of motion. The lobula plate thus represents a topographic map on a transverse plane, which enables us to study the coding of diagonal motion by investigating its response pattern. Using in vivo two-photon calcium imaging, we found that the axon terminals of T4 and T5 cells in the lobula plate were activated during diagonal motion. Further experiments showed that, in contrast to the responses to the cardinal directions of motion, the response to diagonal motion is distributed over two regions: a diagonal-motion selective response region and a non-selective response region, which overlap with the response regions of the two vector-correlated cardinal directions of motion. Interestingly, the sizes of the non-selective response regions are linearly correlated with the angle of the diagonal motion. These results reveal that the Drosophila visual system employs a composite coding for diagonal motion that includes both independent coding and vector decomposition coding. PMID:27695103
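    The vector-decomposition component of such a composite code can be sketched by resolving a diagonal motion vector into its two cardinal components. The helper below is an illustration of the geometry only, not the authors' analysis code:

```python
import math

def cardinal_components(speed, theta_deg):
    """Resolve a first-quadrant motion vector into its rightward and
    upward cardinal components (a hypothetical helper illustrating
    vector decomposition, not the authors' analysis code)."""
    theta = math.radians(theta_deg)
    return speed * math.cos(theta), speed * math.sin(theta)

# A 45-degree diagonal motion decomposes into equal cardinal parts.
right, up = cardinal_components(1.0, 45.0)
```

Under a pure vector-decomposition code, the relative weight of each cardinal response region would track these components as the angle varies, which is consistent with the linear angle dependence reported above.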

  14. The Development of a Portable Hard Disk Encryption/Decryption System with a MEMS Coded Lock.

    PubMed

    Zhang, Weiping; Chen, Wenyuan; Tang, Jian; Xu, Peng; Li, Yibin; Li, Shengyong

    2009-01-01

    In this paper, a novel portable hard-disk encryption/decryption system with a MEMS coded lock is presented, which can authenticate the user and provide the key for the AES encryption/decryption module. The portable hard-disk encryption/decryption system is composed of the authentication module, the USB portable hard-disk interface card, the ATA protocol command decoder module, the data encryption/decryption module, the cipher key management module, the MEMS coded lock controlling circuit module, the MEMS coded lock, and the hard disk. The ATA protocol circuit, the MEMS control circuit, and the AES encryption/decryption circuit are designed and realized by FPGA (Field Programmable Gate Array). The MEMS coded lock, with two couplers and two groups of counter-meshing-gears (CMGs), was fabricated by a LIGA-like process and precision engineering methods. The whole prototype was fabricated and tested. The test results show that the user's password could be correctly discriminated by the MEMS coded lock, and the AES encryption module could get the key from the MEMS coded lock. Moreover, the data in the hard disk could be encrypted or decrypted, and the read-write speed of the dataflow could reach 17 MB/s in Ultra DMA mode.

  15. Design of Optical Systems with Extended Depth of Field: An Educational Approach to Wavefront Coding Techniques

    ERIC Educational Resources Information Center

    Ferran, C.; Bosch, S.; Carnicer, A.

    2012-01-01

    A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field in optical systems is presented. The activity is suitable for advanced undergraduate students since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image…

  16. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.

  17. An Integrated Program Structure and System of Account Codes for PPBS in Local School Districts.

    ERIC Educational Resources Information Center

    Miller, Donald R.

    This monograph presents a comprehensive but tentative matrix program structure and system of account codes that have been integrated to facilitate the implementation of PPB systems in local school districts. It is based on the results of an extensive analysis of K-12 public school district programs and management practices. In its entirety, the…

  18. [Alternative tactile system: C-fibers coding the affective aspect].

    PubMed

    Hua, Qing-Ping; Luo, Fei

    2007-10-01

    It has been accepted that human tactile sensation is mediated exclusively by large myelinated (Abeta) fibres. Nevertheless, recent studies indicated a dual mechanoceptive innervation of the skin in various mammals. Besides the known A fibers, the skin is also innervated by slow-conducting, low-threshold, small unmyelinated (C) afferents. These unmyelinated fibers respond vigorously to innocuous skin deformation, but poorly to rapid skin movement. They project to outer lamina II of spinal cord, and form synapse with the secondary sensory neurons. The latter then project to insular cortex via spinothalamic tracts. Functional magnetic resonance imaging (fMRI) studies showed that a slowly moving tactile stimulus along hairy skin produced a strong activation of the insular cortex. Pleasant touch has also been demonstrated to activate orbitofrontal cortex adjacent to areas responding to pleasant taste and smell. Overall, the response characteristics and activated brain regions suggest that they are related with the limbic system and affective aspect rather than tactile discriminative function.

  19. Symbolic coding for noninvertible systems: uniform approximation and numerical computation

    NASA Astrophysics Data System (ADS)

    Beyn, Wolf-Jürgen; Hüls, Thorsten; Schenke, Andre

    2016-11-01

    It is well known that the homoclinic theorem, which conjugates a map near a transversal homoclinic orbit to a Bernoulli subshift, extends from invertible to specific noninvertible dynamical systems. In this paper, we provide a unifying approach that combines such a result with a fully discrete analog of the conjugacy for finite but sufficiently long orbit segments. The underlying idea is to solve appropriate discrete boundary value problems in both cases, and to use the theory of exponential dichotomies to control the errors. This leads to a numerical approach that allows us to compute the conjugacy to any prescribed accuracy. The method is demonstrated for several examples where invertibility of the map fails in different ways.
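    As a self-contained toy illustration of the object being computed (the tent map here is a standard noninvertible example chosen for this sketch, not one of the paper's test cases), the finite symbolic coding of an orbit segment records which branch of the map each iterate visits:

```python
# Toy symbolic coding for a noninvertible map: the tent map has two
# branches, and the itinerary records 0 for the left branch (x < 1/2)
# and 1 for the right branch at each step of the orbit.
def tent(x):
    return 2 * x if x < 0.5 else 2 * (1 - x)

def symbolic_coding(x0, n):
    """Return the length-n 0/1 itinerary of the orbit of x0."""
    symbols, x = [], x0
    for _ in range(n):
        symbols.append(0 if x < 0.5 else 1)
        x = tent(x)
    return symbols

itinerary = symbolic_coding(0.2, 5)
```

The rigorous version of this idea, as the abstract explains, requires exponential dichotomies to control how well such a finite itinerary approximates the conjugacy to the Bernoulli subshift.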

  20. A contextual coding system for transplantation and end stage diseases.

    PubMed

    Jacquelinet, Christian; Burgun, Anita; Djabbour, Sami; Delamarre, Denis; Clerc, Patrick; Boutin, Bernard; Le Beux, Pierre

    2003-01-01

    The Établissement français des Greffes (EfG) is a state agency dealing with public health issues related to organ, tissue, and cell transplantation in France. EfG maintains a national information system (EfG-IS) for the evaluation of organ transplantation activities. The EfG-IS is moving toward a new n-tier architecture comprising a terminological server. Because this terminological server is shared by various kinds of transplant teams and dialysis centers to record patient data at different time points, contextualisation of terms appeared as a functional requirement. We report in this paper various contexts for medical terms and how they have been taken into account.

  1. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  2. SUBAERF2 - WING AND FLAP SYSTEM ANALYSIS CODE

    NASA Technical Reports Server (NTRS)

    Carlson, H. W.

    1994-01-01

    The SUBAERF2 program was developed to provide for the aerodynamic analysis and design of low speed wing flap systems. SUBAERF2 is based on a linearized theory lifting surface solution. It is particularly well suited to configurations which, because of high speed flight requirements, must employ thin wings with highly swept leading edges. The program is applicable to wings with either sharp or rounded leading edges. This program is a new and improved version of LAR-13116 and LAR-12987, which it replaces. The low speed aerodynamic analysis method used in SUBAERF2 provides estimates of wing performance which include the effects of attainable leading-edge thrust and vortex lift. This basic aerodynamic analysis method has been improved to provide for the convenient, efficient and accurate treatment of simple leading-edge and trailing-edge flap systems. The user inputs flap geometry directly. Solutions can be found for various combinations of leading and trailing edge flap deflections. The program provides for the simultaneous analysis of up to 25 pairs of leading-edge and trailing-edge flap deflection schedules. A revised attainable thrust algorithm improves accuracy at the low Mach numbers sometimes encountered in wind tunnel testing. Also added is a means of estimating the distribution of leading edge separation vortex forces. The revised program has been particularly useful in the subsonic analysis of vehicles designed for supersonic cruise. The SUBAERF2 program is written in FORTRAN V for batch execution and has been implemented on a CDC 175 computer operating under NOS 2.4 with a central memory requirement of approximately 115K (octal) of 60 bit words. This program was originally developed in 1983 and later revised in 1988.

  3. Validation of the U.S. NRC coupled code system TRITON/TRACE/PARCS with the special power excursion reactor test III (SPERT III)

    SciTech Connect

    Wang, R. C.; Xu, Y.; Downar, T.; Hudson, N.

    2012-07-01

    The Special Power Excursion Reactor Test III (SPERT III) was a series of reactivity insertion experiments conducted in the 1950's. This paper describes the validation of the U.S. NRC coupled code system TRITON/TRACE/PARCS for simulating reactivity insertion accidents (RIA) by using several of the SPERT III tests. The work here used the SPERT III E-core configuration tests, in which the RIA was initiated by ejecting a control rod. The resulting super-prompt reactivity excursion and negative reactivity feedback produced the familiar bell-shaped power increase and decrease. The energy deposition during such a power peak has important safety consequences and provides a validation basis for coupled multi-physics core codes. The transients of five separate tests are used to benchmark the PARCS/TRACE coupled code. The models were thoroughly validated using the original experiment documentation. (authors)

  4. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
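    For reference, the basic parameters of the RS(255,223) outer code mentioned above follow directly from the (n, k) definition of a Reed-Solomon code. The sketch below only computes those parameters; it is not an RS implementation:

```python
# Parameters of the interleaved RS(255,223) outer code: a Reed-Solomon
# code over GF(2^8) with n total symbols and k data symbols corrects
# t = (n - k) / 2 symbol errors per codeword.
n, k = 255, 223
rate = k / n          # code rate, about 0.875
t = (n - k) // 2      # correctable symbol errors per codeword: 16
```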

  5. Pulse Code Modulation (PCM) encoder handbook for Aydin Vector MMP-900 series system

    NASA Technical Reports Server (NTRS)

    Raphael, David

    1995-01-01

    This handbook explicates the hardware and software properties of a time division multiplex system used to sample analog and digital data. The data are then merged with frame synchronization information to produce a serial pulse code modulation (PCM) bit stream. Users need the information in this handbook to design compatible interfaces and to ensure effective utilization of this encoder system. Aydin Vector provides all of the components for these systems to Goddard Space Flight Center/Wallops Flight Facility.

  6. Security of Classic PN-Spreading Codes for Hybrid DS/FH Spread-Spectrum Systems

    SciTech Connect

    Ma, Xiao; Olama, Mohammed M; Kuruganti, Phani Teja; Smith, Stephen Fulton; Djouadi, Seddik M

    2013-01-01

    Hybrid direct sequence/frequency hopping (DS/FH) spread-spectrum communication systems have recently received considerable interest in commercial applications in addition to their use in military communications because they accommodate high data rates with high link integrity, even in the presence of significant multipath effects and interfering signals. The security of hybrid DS/FH systems strongly depends on the choice of PN-spreading code employed. In this paper, we examine the security, in terms of unicity distance, of linear maximal-length, Gold, and Kasami PN-spreading codes for DS, FH, and hybrid DS/FH spread-spectrum systems without additional encryption methods. The unicity distance is a measure of the minimum amount of ciphertext required by an eavesdropper to uniquely determine the specific key used in a cryptosystem and hence break the cipher. Numerical results are presented to compare the security of the considered PN-spreading codes under known-ciphertext attacks.
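    The unicity distance defined above can be computed from Shannon's classical estimate, U = H(K) / D, where H(K) is the key entropy in bits and D is the plaintext redundancy in bits per character. A minimal sketch with illustrative numbers (the key size and redundancy below are assumptions, not values from the paper):

```python
# Shannon's unicity distance estimate: the minimum number of ciphertext
# characters an eavesdropper needs to determine the key uniquely.
def unicity_distance(key_bits, redundancy_bits_per_char):
    return key_bits / redundancy_bits_per_char

# English plaintext carries log2(26) =~ 4.7 bits/char, of which roughly
# 3.2 bits/char is redundancy; assume a 128-bit key for illustration.
u = unicity_distance(128, 3.2)   # about 40 characters of ciphertext
```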

  7. Architectural and Algorithmic Requirements for a Next-Generation System Analysis Code

    SciTech Connect

    V.A. Mousseau

    2010-05-01

    This document presents high-level architectural and system requirements for a next-generation system analysis code (NGSAC) to support reactor safety decision-making by plant operators and others, especially in the context of light water reactor plant life extension. The capabilities of NGSAC will be different from those of current-generation codes, not only because computers have evolved significantly in the generations since the current paradigm was first implemented, but because the decision-making processes that need the support of next-generation codes are very different from the decision-making processes that drove the licensing and design of the current fleet of commercial nuclear power reactors. The implications of these newer decision-making processes for NGSAC requirements are discussed, and resulting top-level goals for the NGSAC are formulated. From these goals, the general architectural and system requirements for the NGSAC are derived.

  8. Commissioning of a motion system to investigate dosimetric consequences due to variability of respiratory waveforms.

    PubMed

    Cetnar, Ashley J; James, Joshua; Wang, Brain

    2016-01-01

    A commercially available six-dimensional (6D) motion system was assessed for accuracy and clinical use in our department. Positional accuracy and respiratory waveform reproducibility were evaluated for the motion system. The system was then used to investigate the dosimetric consequences of respiratory waveform variation when an internal target volume (ITV) approach is used for motion management. The maximum deviations are 0.3 mm and 0.22° for translation and rotation accuracy, respectively, for the tested clinical ranges. The origin reproducibility is less than ±0.1 mm. The average differences are less than 0.1 mm with a maximum standard deviation of 0.8 mm between waveforms of actual patients and replication of those waveforms by HexaMotion for three breath-hold and one free-breathing waveform. A modified gamma analysis shows greater than 98% agreement with a 0.5 mm and 100 ms threshold. The motion system was used to investigate respiratory waveform variation and showed that, as the amplitude of the treatment waveform increases above that of the simulation waveform, the periphery of the target volume receives less dose than expected. However, by using gating limits to terminate the beam outside of the simulation amplitude, the results are as expected dosimetrically. Specifically, the average dose difference in the periphery between treating with the simulation waveform and the larger amplitude waveform could be up to 12% less without gating limits, but only differed 2% or less with the gating limits in place. The general functionality of the system performs within the manufacturer's specifications and can accurately replicate patient specific waveforms. When an ITV approach is used for motion management, we found the use of gating limits that coincide with the amplitude of the patient waveform at simulation helpful to prevent the potential underdosing of the target due to changes in patient respiration. PACS numbers: 87.55.Kh, 87.55.Qr, 87.56.Fc.
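    The modified gamma analysis mentioned above can be sketched in a simplified 1D form. This is an illustrative reconstruction, not the authors' code; only the 0.5 mm and 100 ms thresholds are taken from the abstract, and the waveforms below are synthetic:

```python
import math

# Simplified 1D gamma analysis for a motion waveform: a measured sample
# passes if some reference sample lies within the combined
# position/time tolerance (gamma <= 1).
def gamma_pass(measured, reference, dpos=0.5, dt=100.0):
    """measured/reference: lists of (time_ms, position_mm) samples.
    Returns the fraction of measured points with gamma <= 1."""
    passed = 0
    for tm, pm in measured:
        gamma = min(
            math.sqrt(((tm - tr) / dt) ** 2 + ((pm - pr) / dpos) ** 2)
            for tr, pr in reference
        )
        if gamma <= 1.0:
            passed += 1
    return passed / len(measured)

# Synthetic example: a waveform offset by 0.1 mm from the reference
# stays well within the 0.5 mm / 100 ms tolerance.
ref = [(t * 10.0, math.sin(t / 10.0)) for t in range(100)]
meas = [(t * 10.0, math.sin(t / 10.0) + 0.1) for t in range(100)]
rate = gamma_pass(meas, ref)
```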

  9. Commissioning of a motion system to investigate dosimetric consequences due to variability of respiratory waveforms.

    PubMed

    Cetnar, Ashley J; James, Joshua; Wang, Brain

    2016-01-08

    A commercially available six-dimensional (6D) motion system was assessed for accuracy and clinical use in our department. Positional accuracy and respiratory waveform reproducibility were evaluated for the motion system. The system was then used to investigate the dosimetric consequences of respiratory waveform variation when an internal target volume (ITV) approach is used for motion management. The maximum deviations are 0.3 mm and 0.22° for translation and rotation accuracy, respectively, for the tested clinical ranges. The origin reproducibility is less than ±0.1 mm. The average differences are less than 0.1 mm with a maximum standard deviation of 0.8 mm between waveforms of actual patients and replication of those waveforms by HexaMotion for three breath-hold and one free-breathing waveform. A modified gamma analysis shows greater than 98% agreement with a 0.5 mm and 100 ms threshold. The motion system was used to investigate respiratory waveform variation and showed that, as the amplitude of the treatment waveform increases above that of the simulation waveform, the periphery of the target volume receives less dose than expected. However, by using gating limits to terminate the beam outside of the simulation amplitude, the results are as expected dosimetrically. Specifically, the average dose difference in the periphery between treating with the simulation waveform and the larger amplitude waveform could be up to 12% less without gating limits, but only differed 2% or less with the gating limits in place. The general functionality of the system performs within the manufacturer's specifications and can accurately replicate patient specific waveforms. When an ITV approach is used for motion management, we found the use of gating limits that coincide with the amplitude of the patient waveform at simulation helpful to prevent the potential underdosing of the target due to changes in patient respiration.

  10. BCH Codes for Coherent Star DQAM Systems with Laser Phase Noise

    NASA Astrophysics Data System (ADS)

    Leong, Miu Yoong; Larsen, Knud J.; Jacobsen, Gunnar; Zibar, Darko; Sergeyev, Sergey; Popov, Sergei

    2017-03-01

    Coherent optical systems have relatively high laser phase noise, which affects the performance of forward error correction (FEC) codes. In this paper, we propose a method for selecting Bose-Chaudhuri-Hocquenghem (BCH) codes for coherent systems with star-shaped constellations and M-ary differential quadrature amplitude modulation (DQAM). Our method supports constellations of any order M which is a power of 2, and includes differential M-ary phase shift keying as a special case. Our approach is straightforward, requiring only short pre-FEC simulations to parameterize a statistical model, based on which we select codes analytically. It is applicable to pre-FEC bit error rates (BERs) of around 10^-3. We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of 10^-5, codes selected with our method yield BERs within 2× the target. Lastly, we extend our method to systems with interleaving, which enables us to use codes with lower overhead.
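    The flavor of this analytic code selection can be sketched with the standard independent-error (binomial) bound on a t-error-correcting block code; the phase-noise statistics the paper actually models are not reproduced here, and the BCH parameters below are an illustrative assumption:

```python
import math

# Post-FEC codeword error estimate for a t-error-correcting (n, k)
# block code under independent bit errors with pre-FEC BER p:
# decoding fails when more than t of the n bits are in error.
def word_error_rate(n, t, p):
    """Probability that more than t of n bits are in error."""
    return sum(
        math.comb(n, i) * p**i * (1 - p) ** (n - i)
        for i in range(t + 1, n + 1)
    )

# Illustrative parameters: an n = 1023, t = 14 BCH code at a pre-FEC
# BER of 1e-3 (roughly the regime the abstract targets).
wer = word_error_rate(1023, 14, 1e-3)
```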

  11. Feasibility of coded vibration in a vibro-ultrasound system for tissue elasticity measurement.

    PubMed

    Zhao, Jinxin; Wang, Yuanyuan; Yu, Jinhua; Li, Tianjie; Zheng, Yong-Ping

    2016-07-01

    The ability of various methods for elasticity measurement and imaging is hampered by the vibration amplitude on biological tissues. Based on the inference that coded excitation will improve the performance of the cross-correlation function of the tissue displacement waves, the idea of exerting encoded external vibration on tested samples to measure their elasticity is proposed. It was implemented by integrating a programmable vibration generation function into a customized vibro-ultrasound system to generate Barker coded vibration for elasticity measurement. Experiments were conducted on silicone phantoms and porcine muscles. The results showed that coded excitation of the vibration enhanced the accuracy and robustness of the elasticity measurement, especially in low signal-to-noise ratio scenarios. In the phantom study, the shear modulus values measured with coded vibration had an R(2) = 0.993 linear correlation with those of the reference indentation, while for the single-cycle pulse the R(2) decreased to 0.987. In the porcine muscle study, the coded vibration also obtained shear modulus values that were more accurate than those of the single-cycle pulse by 0.16 kPa and 0.33 kPa at two different depths. These results demonstrated the feasibility and potential of coded vibration for enhancing the quality of elasticity measurement and imaging.
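    Why a Barker-coded vibration improves cross-correlation can be checked directly from the code's autocorrelation property (this toy check stands alone; the vibro-ultrasound processing chain itself is not reproduced): the Barker-13 sequence has a peak of 13 while every off-peak sidelobe has magnitude at most 1, so matched correlation localizes the coded waveform sharply even in noise.

```python
# Barker-13 sequence and its aperiodic autocorrelation: peak equals the
# code length, all sidelobe magnitudes are at most 1.
BARKER13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def autocorr(seq, lag):
    return sum(a * b for a, b in zip(seq, seq[lag:]))

peak = autocorr(BARKER13, 0)
sidelobes = [abs(autocorr(BARKER13, k)) for k in range(1, len(BARKER13))]
```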

  12. Development of a numerical computer code and circuit element models for simulation of firing systems

    SciTech Connect

    Carpenter, K.H. . Dept. of Electrical and Computer Engineering)

    1990-07-02

    Numerical simulation of firing systems requires both the appropriate circuit analysis framework and the special element models required by the application. We have modified the SPICE circuit analysis code (version 2G.6), developed originally at the Electronic Research Laboratory of the University of California, Berkeley, to allow it to be used on MSDOS-based, personal computers and to give it two additional circuit elements needed by firing systems--fuses and saturating inductances. An interactive editor and a batch driver have been written to ease the use of the SPICE program by system designers, and the interactive graphical post processor, NUTMEG, supplied by U. C. Berkeley with SPICE version 3B1, has been interfaced to the output from the modified SPICE. Documentation and installation aids have been provided to make the total software system accessible to PC users. Sample problems show that the resulting code is in agreement with the FIRESET code on which the fuse model was based (with some modifications to the dynamics of scaling fuse parameters). In order to allow for more complex simulations of firing systems, studies have been made of additional special circuit elements--switches and ferrite cored inductances. A simple switch model has been investigated which promises to give at least a first approximation to the physical effects of a non-ideal switch, and which can be added to the existing SPICE circuits without changing the SPICE code itself. The effect of fast rise time pulses on ferrites has been studied experimentally in order to provide a base for future modeling and incorporation of the dynamic effects of changes in core magnetization into the SPICE code. This report contains detailed accounts of the work on these topics performed during the period it covers, and has appendices listing all source code written and documentation produced.

  13. Ecosocial consequences and policy implications of disease management in East African agropastoral systems

    PubMed Central

    Gutierrez, Andrew Paul; Gilioli, Gianni; Baumgärtner, Johann

    2009-01-01

    International research and development efforts in Africa have brought ecological and social change, but analyzing the consequences of this change and developing policy to manage it for sustainable development has been difficult. This has been largely due to a lack of conceptual and analytical models to assess the interacting dynamics of the different components of ecosocial systems. Here, we examine the ecological and social changes resulting from an ongoing suppression of trypanosomiasis disease in cattle in an agropastoral community in southwest Ethiopia to illustrate how such problems may be addressed. The analysis combines physiologically based demographic models of pasture, cattle, and pastoralists and a bioeconomic model that includes the demographic models as dynamic constraints in the economic objective function that maximizes the utility of individual consumption under different levels of disease risk in cattle. Field data and model analysis show that suppression of trypanosomiasis leads to increased cattle and human populations and to increased agricultural development. However, in the absence of sound management, these changes will lead to a decline in pasture quality and increase the risk from tick-borne diseases in cattle and malaria in humans that would threaten system sustainability and resilience. The analysis of these conflicting outcomes of trypanosomiasis suppression is used to illustrate the need for and utility of conceptual bioeconomic models to serve as a basis for developing policy for sustainable agropastoral resource management in sub-Saharan Africa. PMID:19620722

  14. Consequences of Circadian and Sleep Disturbances for the Cardiovascular System.

    PubMed

    Alibhai, Faisal J; Tsimakouridze, Elena V; Reitz, Cristine J; Pyle, W Glen; Martino, Tami A

    2015-07-01

    Circadian rhythms play a crucial role in our cardiovascular system. Importantly, there has been a recent flurry of clinical and experimental studies revealing the profound adverse consequences of disturbing these rhythms on the cardiovascular system. For example, circadian disturbance worsens outcome after myocardial infarction with implications for patients in acute care settings. Moreover, disturbing rhythms exacerbates cardiac remodelling in heart disease models. Also, circadian dyssynchrony is a causal factor in the pathogenesis of heart disease. These discoveries have profound implications for the cardiovascular health of shift workers, individuals with circadian and sleep disorders, or anyone subjected to the 24/7 demands of society. Moreover, these studies give rise to 2 new frontiers for translational research: (1) circadian rhythms and the cardiac sarcomere, which sheds new light on our understanding of myofilament structure, signalling, and electrophysiology; and (2) knowledge translation, which includes biomarker discovery (chronobiomarkers), timing of therapies (chronotherapy), and other new promising approaches to improve the management and treatment of cardiovascular disease. Reconsidering circadian rhythms in the clinical setting benefits repair mechanisms, and offers new promise for patients.

  15. Heatwave Early Warning Systems and Adaptation Advice to Reduce Human Health Consequences of Heatwaves

    PubMed Central

    Lowe, Dianne; Ebi, Kristie L.; Forsberg, Bertil

    2011-01-01

    Introduction: With climate change, there has been an increase in the frequency, intensity and duration of heatwave events. In response to the devastating mortality and morbidity of recent heatwave events, many countries have introduced heatwave early warning systems (HEWS). HEWS are designed to reduce the avoidable human health consequences of heatwaves through timely notification of prevention measures to vulnerable populations. Objective: To identify the key characteristics of HEWS in European countries to help inform modification of current, and development of, new systems and plans. Methods: We searched the internet to identify HEWS policy or government documents for 33 European countries and requested information from relevant organizations. We translated the HEWS documents and extracted details on the trigger indicators, thresholds for action, notification strategies, message intermediaries, communication and dissemination strategies, prevention strategies recommended and specified target audiences. Findings and Conclusions: Twelve European countries have HEWS. Although there are many similarities among the HEWS, there also are differences in key characteristics that could inform improvements in heatwave early warning plans. PMID:22408593

  16. [Cannabis: Effects in the Central Nervous System. Therapeutic, societal and legal consequences].

    PubMed

    Rivera-Olmos, Víctor Manuel; Parra-Bernal, Marisela C

    2016-01-01

    The consumption of marijuana extracted from Cannabis sativa and indica plants has an important cultural impact in Mexico. Its psychological stimulatory effect is widely recognized; its biochemical and molecular components interact with CB1 and CB2 receptors (the endocannabinoid system) in various central nervous system (CNS) structures and immune cells. The psychoactive element Δ-9-tetrahydrocannabinol (THC) can be reproduced synthetically. Systematic reviews show evidence of therapeutic effectiveness of medicinal marijuana only for certain symptoms of multiple sclerosis (spasticity, spasms, and pain), despite attempts at its widespread use, including for refractory childhood epilepsy. Evidence indicates significant adverse effects of smoked marijuana on brain structure, functioning, and connectivity. Cannabis exposure during pregnancy affects fetal brain development, potentially leading to later behavioral problems in children. Neuropsychological tests and advanced imaging techniques show impairment of the learning process in adolescents with substance use. Also, marijuana increases cognitive impairment in patients with multiple sclerosis. The social and ethical consequences of legalizing marijuana for recreational use may be profoundly deleterious. Medicinal or psychoactive cannabinoids without addictive effects require controlled studies proving efficacy and safety before regulatory approval.

  17. Development of OCDMA system based on Flexible Cross Correlation (FCC) code with OFDM modulation

    NASA Astrophysics Data System (ADS)

    Aldhaibani, A. O.; Aljunid, S. A.; Anuar, M. S.; Arief, A. R.; Rashidi, C. B. M.

    2015-03-01

    The performance of OCDMA systems is governed by numerous quantitative parameters such as the data rate, the number of simultaneous users, the transmitter and receiver powers, and the type of code. This paper analyzes the performance of an OCDMA system that uses the OFDM technique to enhance the channel data rate, save power, and increase the number of users compared with the previous hybrid subcarrier multiplexing/optical spectrum code division multiplexing (SCM/OSCDM) system. The average received signal-to-noise ratio (SNR) with the nonlinearity of subcarriers is derived. The theoretical results have been evaluated in terms of BER, number of users, and amount of power saved. The proposed system gave better performance, saved around 6 dBm of power, and doubled the number of users compared to the SCM/OCDMA system. In addition, it is robust against interference and much more spectrally efficient than the SCM/OCDMA system. The system was designed based on the Flexible Cross Correlation (FCC) code, which offers simpler construction, a less complex encoder/decoder design, and a flexible in-phase cross-correlation, making it uncomplicated to implement using Fiber Bragg Gratings (FBGs) for OCDMA systems with any number of users and weights. The OCDMA-FCC_OFDM system improves the number of users (cardinality) by 108% compared to the SCM/OCDMA-FCC system.
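    Such SNR-versus-BER evaluations commonly use the Gaussian-noise mapping BER = Q(sqrt(SNR)), with Q(x) = 0.5 · erfc(x / sqrt(2)). The sketch below uses that common mapping as an assumption for illustration; it is not the paper's derived expression, which includes subcarrier nonlinearity:

```python
import math

# Gaussian-tail mapping from linear SNR to bit error rate:
# BER = Q(sqrt(SNR)), Q(x) = 0.5 * erfc(x / sqrt(2)).
def ber_from_snr(snr_linear):
    return 0.5 * math.erfc(math.sqrt(snr_linear) / math.sqrt(2))

# Example: a 15 dB SNR, converted to linear scale before the mapping.
snr_db = 15.0
ber = ber_from_snr(10 ** (snr_db / 10))
```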

  18. Application of P4 Polyphase codes pulse compression method to air-coupled ultrasonic testing systems.

    PubMed

    Li, Honggang; Zhou, Zhenggan

    2017-03-03

    Air-coupled ultrasonic testing systems are usually restricted by low signal-to-noise ratios (SNR). The use of pulse compression techniques based on P4 polyphase codes can improve the ultrasound SNR. This type of code yields a better peak-to-sidelobe (PSL) ratio and lower noise in the compressed signal. This paper proposes the use of P4 polyphase sequences to code ultrasound in an NDT system based on an air-coupled piezoelectric transducer. Furthermore, the principle of selecting the parameters of the P4 polyphase sequence to obtain the optimal pulse compression effect is also studied. Successful results are presented for a molded composite material. A hybrid signal processing method achieves an improvement in SNR of up to 12.11 dB and in time-domain resolution of about 35% when compared with the conventional pulse compression technique.
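    As a sketch of the underlying technique (not the paper's exact processing chain), the following generates a length-N P4 polyphase code and compresses it with a matched filter; the compression gain and peak-to-sidelobe ratio fall out directly. The phase definition used here is the standard P4 form, and the code length is illustrative.

```python
import numpy as np

def p4_code(N):
    # standard P4 polyphase code: phi_k = pi * k * (k - N) / N, k = 0..N-1
    k = np.arange(N)
    return np.exp(1j * np.pi * k * (k - N) / N)

def pulse_compress(rx, code):
    # matched filter: convolve with the conjugated, time-reversed code
    return np.convolve(rx, np.conj(code[::-1]))

N = 64
code = p4_code(N)
out = np.abs(pulse_compress(code, code))     # noise-free autocorrelation
peak = out.max()                             # equals N (compression gain)
sidelobe = np.delete(out, out.argmax()).max()
psl_db = 20 * np.log10(sidelobe / peak)      # peak-to-sidelobe ratio in dB
```

    For N = 64 the main-lobe height is exactly N and the PSL is well below -20 dB, which is the property that makes polyphase codes attractive for low-SNR air-coupled testing.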

  19. Recommended requirements to code officials for solar heating, cooling, and hot water systems. Model document for code officials on solar heating and cooling of buildings

    SciTech Connect

    1980-06-01

    These recommended requirements include provisions for electrical, building, mechanical, and plumbing installations for active and passive solar energy systems used for space or process heating and cooling, and domestic water heating. The provisions in these recommended requirements are intended to be used in conjunction with the existing building codes in each jurisdiction. Where a solar relevant provision is adequately covered in an existing model code, the section is referenced in the Appendix. Where a provision has been drafted because there is no counterpart in the existing model code, it is found in the body of these recommended requirements. Commentaries are included in the text explaining the coverage and intent of present model code requirements and suggesting alternatives that may, at the discretion of the building official, be considered as providing reasonable protection to the public health and safety. Also included is an Appendix which is divided into a model code cross reference section and a reference standards section. The model code cross references are a compilation of the sections in the text and their equivalent requirements in the applicable model codes. (MHR)

  20. A Modular Computer Code for Simulating Reactive Multi-Species Transport in 3-Dimensional Groundwater Systems

    SciTech Connect

    TP Clement

    1999-06-24

    RT3DV1 (Reactive Transport in 3-Dimensions) is a computer code that solves the coupled partial differential equations that describe reactive flow and transport of multiple mobile and/or immobile species in three-dimensional saturated groundwater systems. RT3D is a generalized multi-species version of the US Environmental Protection Agency (EPA) transport code, MT3D (Zheng, 1990). The current version of RT3D uses the advection and dispersion solvers from the DOD-1.5 (1997) version of MT3D. As with MT3D, RT3D also requires the groundwater flow code MODFLOW for computing spatial and temporal variations in the groundwater head distribution. The RT3D code was originally developed to support contaminant transport modeling efforts at natural attenuation demonstration sites. As a research tool, RT3D has also been used to model several laboratory and pilot-scale active bioremediation experiments. The performance of RT3D has been validated by comparing the code results against various numerical and analytical solutions. The code is currently being used to model field-scale natural attenuation at multiple sites. The RT3D code is unique in that it includes an implicit reaction solver that makes the code sufficiently flexible for simulating various types of chemical and microbial reaction kinetics. RT3D V1.0 supports seven pre-programmed reaction modules that can be used to simulate different types of reactive contaminants, including benzene-toluene-xylene mixtures (BTEX) and chlorinated solvents such as tetrachloroethene (PCE) and trichloroethene (TCE). In addition, RT3D has a user-defined reaction option that can be used to simulate any other type of user-specified reactive transport system. This report describes the mathematical details of the RT3D computer code and its input/output data structure. It is assumed that the user is familiar with the basics of groundwater flow and contaminant transport mechanics. In addition, RT3D users are expected to have some experience in

  1. Increasing average power in medical ultrasonic endoscope imaging system by coded excitation

    NASA Astrophysics Data System (ADS)

    Chen, Xiaodong; Zhou, Hao; Wen, Shijie; Yu, Daoyin

    2008-12-01

    The medical ultrasonic endoscope is the combination of the electronic endoscope and ultrasonic sensor technology. An ultrasonic endoscope sends the ultrasonic probe into the coelom through the biopsy channel of the electronic endoscope and rotates it with a micro pre-motor, which requires that the length of the ultrasonic probe be no more than 14 mm and its diameter no more than 2.2 mm. As a result, the ultrasonic excitation power is very low and it is difficult to obtain a sharp image. In order to increase the energy and SNR of the ultrasonic signal, we introduce coded excitation, widely used in radar systems, into the ultrasonic imaging system. Coded excitation drives the ultrasonic transducer with a long coded pulse, which increases the average transmitting power accordingly. In this paper, in order to avoid overlap between adjacent echoes, we used a length-4 Barker code to drive the ultrasonic transducer, modulated at the operating frequency of the transducer to improve the emission efficiency. The implementation of coded excitation is closely associated with the transient operating characteristic of the ultrasonic transducer. This paper first analyzes the transient operating characteristic of an ultrasonic transducer excited by a shock pulse δ(t), and then the excitation pulse generated by a special ultrasonic transmitting circuit composed of an MD1211 and a TC6320. In the final part of the paper, we designed an experiment to validate the coded excitation, with a transducer operating at 5 MHz and a glass vessel filled with ultrasonic coupling liquid as the object. Driven by an FPGA, the ultrasonic transmitting circuit outputs a length-4 Barker excitation pulse modulated at 5 MHz with ±20 V amplitude, consistent with the transient operating characteristic of the ultrasonic transducer after impedance matching. The echo reflected from the glass exhibits the coded character, identical to the Matlab simulation result, and its amplitude is higher.
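    A minimal numeric sketch of this excitation scheme (sampling rate and pulse parameters are assumptions, not the paper's hardware values): each chip of a length-4 Barker code gates one carrier cycle at the transducer frequency, and the received echo is pulse-compressed with a matched filter.

```python
import numpy as np

fs, fc = 100e6, 5e6                  # sampling rate (assumed) and 5 MHz carrier
barker4 = [1, 1, -1, 1]              # one of the two length-4 Barker codes
spc = int(fs // fc)                  # samples per carrier cycle

# one carrier cycle per chip, sign-flipped by the Barker code
chip = np.sin(2 * np.pi * fc * np.arange(spc) / fs)
tx = np.concatenate([b * chip for b in barker4])

# matched-filter pulse compression of an idealised, noise-free echo
echo = tx.copy()
compressed = np.correlate(echo, tx, mode='full')
peak = compressed.max()              # main lobe height = total pulse energy
```

    The compressed main lobe carries the energy of all four carrier cycles, which is precisely how a long coded pulse buys average power without sacrificing axial resolution.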

  2. Automatic construction of rule-based ICD-9-CM coding systems

    PubMed Central

    Farkas, Richárd; Szarvas, György

    2008-01-01

    Background In this paper we focus on the problem of automatically constructing ICD-9-CM coding systems for radiology reports. ICD-9-CM codes are used for billing purposes by health institutes and are assigned to clinical records manually following clinical treatment. Since this labeling task requires expert knowledge in the field of medicine, the process itself is costly and is prone to errors, as human annotators have to consider thousands of possible codes when assigning the right ICD-9-CM labels to a document. In this study we use the datasets made available for training and testing automated ICD-9-CM coding systems by the organisers of an International Challenge on Classifying Clinical Free Text Using Natural Language Processing in spring 2007. The challenge itself was dominated by entirely or partly rule-based systems that solve the coding task using a set of hand-crafted expert rules. Since the feasibility of constructing such systems for thousands of ICD codes is indeed questionable, we decided to examine the problem of automatically constructing rule sets similar to those that achieved remarkable accuracy in the shared-task challenge. Results Our results are very promising in the sense that we managed to achieve comparable results with purely hand-crafted ICD-9-CM classifiers. Our best model got a 90.26% F measure on the training dataset and an 88.93% F measure on the challenge test dataset, using the micro-averaged Fβ=1 measure, the official evaluation metric of the International Challenge on Classifying Clinical Free Text Using Natural Language Processing. This result would have placed second in the challenge, with a hand-crafted system achieving slightly better results. Conclusions Our results demonstrate that hand-crafted systems – which proved to be successful in ICD-9-CM coding – can be reproduced by replacing several laborious steps in their construction with machine learning models. These hybrid systems preserve the favourable
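    The shape of such a rule-based coder, and of the micro-averaged F-measure used to score it, can be sketched in a few lines. The rules and code assignments below are toy illustrations invented for the example, not the paper's learned rule set.

```python
import re

# Toy hand-crafted-style rules mapping report phrases to ICD-9-CM codes.
# The phrase/code pairings here are illustrative only.
RULES = [
    (re.compile(r'\bpneumonia\b', re.I), '486'),
    (re.compile(r'\bcough\b', re.I), '786.2'),
    (re.compile(r'\bvesicoureteral reflux\b', re.I), '593.70'),
]

def assign_codes(report):
    """Assign every code whose rule fires on the report text."""
    return {code for pattern, code in RULES if pattern.search(report)}

def micro_f1(golds, preds):
    """Micro-averaged F(beta=1) over all (document, code) decisions."""
    tp = sum(len(g & p) for g, p in zip(golds, preds))
    fp = sum(len(p - g) for g, p in zip(golds, preds))
    fn = sum(len(g - p) for g, p in zip(golds, preds))
    return 2 * tp / (2 * tp + fp + fn)

golds = [{'486'}, {'786.2'}]
preds = [assign_codes('Right lower lobe pneumonia present.'),
         assign_codes('Persistent cough; chest x-ray clear.')]
score = micro_f1(golds, preds)
```

    The paper's contribution is replacing the manual authoring of the RULES table with statistical learning while keeping this same transparent rule-application machinery at prediction time.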

  3. Code-division multiple-access protocol for active RFID systems

    NASA Astrophysics Data System (ADS)

    Mazurek, Gustaw; Szabatin, Jerzy

    2008-01-01

    Most Radio Frequency Identification (RFID) systems operating in the HF and UHF bands employ narrowband modulations (FSK or ASK) with Manchester coding. However, these simple transmission schemes are vulnerable to narrowband interference (NBI) generated by other radio systems working in the same frequency band, and also suffer from collision problems that require special anti-collision procedures. This becomes especially important when operating in a noisy, crowded industrial environment. In this paper we show the performance of an RFID system with DS-CDMA transmission in comparison to a standard system with the FSK modulation defined in ISO 18000-7. Our simulation results show that, without any bandwidth expansion, the immunity against NBI can be improved by 8 dB and the system capacity can be 7 times higher when using DS-CDMA transmission instead of FSK modulation with Manchester coding.
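    The NBI immunity comes from the despreading step: correlating with the chip sequence concentrates the wanted symbol while smearing a narrowband tone. A minimal baseband sketch (chip length, tone frequency, and amplitudes are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
chips = rng.choice([-1.0, 1.0], size=63)   # stand-in PN spreading sequence
bits = rng.choice([-1.0, 1.0], size=20)    # tag data symbols

tx = np.concatenate([b * chips for b in bits])     # DS-CDMA spreading
n = np.arange(tx.size)
nbi = np.sin(2 * np.pi * 0.013 * n)                # narrowband interferer,
rx = tx + nbi                                      # as strong as one chip

# despreading: correlate each symbol interval with the local chip sequence;
# the processing gain (63 chips/bit) pushes the NBI below the decision level
decided = np.sign(rx.reshape(bits.size, -1) @ chips)
```

    Every despread symbol has amplitude 63 from the wanted signal, while the tone's projection onto the pseudo-random chips stays near zero, so all bits decode correctly despite the interferer.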

  4. The Development of a Systematic Coding System for Elementary Students' Drawings of Engineers

    ERIC Educational Resources Information Center

    Weber, Nicole; Duncan, Daphne; Dyehouse, Melissa; Strobel, Johannes; Diefes-Dux, Heidi A.

    2011-01-01

    The Draw an Engineer Test (DAET) is a common measure of students' perceptions of engineers. The coding systems currently used for K-12 research are general rubrics or checklists to capture the images presented in the drawing, which leave out some of the richness of students' perceptions, currently only captured with an accompanying student…

  5. MatLab script to C code converter for embedded processors of FLASH LLRF control system

    NASA Astrophysics Data System (ADS)

    Bujnowski, K.; Siemionczyk, A.; Pucyk, P.; Szewiński, J.; Pożniak, K. T.; Romaniuk, R. S.

    2008-01-01

    The low-level RF control system (LLRF) of an FEL serves to stabilize the electromagnetic (EM) field in the superconducting niobium, resonant, microwave cavities and to control the high-power (MW) klystron. The LLRF system of the FLASH accelerator is based on FPGA technology and embedded microprocessors. The basic and auxiliary functions of the system are listed, as well as the algorithms used for superconducting cavity parameter identification. These algorithms were prepared originally in Matlab. The main part of the paper presents the implementation of the cavity parameter identification algorithm in a PowerPC processor embedded in the FPGA circuit VirtexIIPro. The construction of a very compact Matlab-script-to-C converter, referred to as M2C, is presented. The application is designed specifically for embedded systems with very confined resources. The generated code is optimized for size and should be transferable between different hardware platforms. The converter generates code for Linux and for stand-alone applications. The functional structure of the program and its mode of operation are described. The Flex and Bison tools were used to construct the converter. The paper concludes with an example of applying M2C to convert a complex identification algorithm for the superconducting cavities of the FLASH laser.

  6. Randomization of Symbol Repetition of Punch Cards with Superimposed Coding in Information-Search Systems.

    ERIC Educational Resources Information Center

    Pirovich, L. Ya

    The article shows the effect of the irregular use of individual symbols on search noise in punch cards with superimposed symbol coding in information-search systems (IPS). A binomial law for the random distribution of the repetition of each symbol is established and analyzed. A method of determining the maximum value of symbol repetition is…

  7. A System for English Vocabulary Acquisition Based on Code-Switching

    ERIC Educational Resources Information Center

    Mazur, Michal; Karolczak, Krzysztof; Rzepka, Rafal; Araki, Kenji

    2016-01-01

    Vocabulary plays an important part in second language learning and there are many existing techniques to facilitate word acquisition. One of these methods is code-switching, or mixing the vocabulary of two languages in one sentence. In this paper the authors propose an experimental system for computer-assisted English vocabulary learning in…

  8. Threshold-Based OSIC Detection Algorithm for Per-Antenna-Coded TIMO-OFDM Systems

    NASA Astrophysics Data System (ADS)

    Wang, Xinzheng; Chen, Ming; Zhu, Pengcheng

    A threshold-based ordered successive interference cancellation (OSIC) detection algorithm is proposed for per-antenna-coded (PAC) two-input multiple-output (TIMO) orthogonal frequency-division multiplexing (OFDM) systems. Successive interference cancellation (SIC) is performed selectively according to channel conditions. Compared with the conventional OSIC algorithm, the proposed algorithm reduces the complexity significantly with only a slight performance degradation.
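    To make the OSIC family concrete, here is a minimal zero-forcing OSIC detector: detect the best-conditioned stream first, slice it, cancel its contribution, and repeat. This is a generic sketch of conventional OSIC, not the paper's threshold-based variant, and the channel matrix and constellation below are invented for the example.

```python
import numpy as np

def osic_detect(H, y, constellation):
    """Zero-forcing ordered successive interference cancellation (sketch)."""
    remaining = list(range(H.shape[1]))
    s_hat = np.zeros(H.shape[1], dtype=complex)
    y = y.astype(complex).copy()
    while remaining:
        Hp = np.linalg.pinv(H[:, remaining])
        # ordering: pick the stream with the least noise amplification
        i = int(np.argmin(np.sum(np.abs(Hp) ** 2, axis=1)))
        idx = remaining[i]
        z = Hp[i] @ y                                            # nulling
        s = constellation[np.argmin(np.abs(constellation - z))]  # slicing
        s_hat[idx] = s
        y -= H[:, idx] * s                                       # cancellation
        remaining.remove(idx)
    return s_hat

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = np.array([[1.0 + 0.2j, 0.4 - 0.1j],      # illustrative 2x2 channel
              [0.3 + 0.1j, 0.9 + 0.3j]])
s = np.array([qpsk[0], qpsk[3]])
y = H @ s                                    # noiseless received vector
s_hat = osic_detect(H, y, qpsk)
```

    The threshold-based refinement in the paper skips the cancellation stage when the channel is good enough that plain nulling already decides reliably, which is where the complexity saving comes from.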

  9. Duct System Flammability and Air Sealing Fire Separation Assemblies in the International Residential Code

    SciTech Connect

    Rudd, A.; Prahl, D.

    2014-12-01

    IBACOS identified two barriers that limit the ability of builders to cost-effectively achieve higher energy efficiency levels in housing. These are the use of duct system materials that inherently achieve airtightness and are appropriately sized for low-load houses and the ability to air seal fire separation assemblies. The issues identified fall into a gray area of the codes.

  10. Duct System Flammability and Air Sealing Fire Separation Assemblies in the International Residential Code

    SciTech Connect

    Rudd, A.; Prahl, D.

    2014-12-01

    IBACOS identified two barriers that limit the ability of builders to cost-effectively achieve higher energy efficiency levels in housing. These are (1) the use of duct system materials that inherently achieve airtightness and are appropriately sized for low-load houses and (2) the ability to air seal fire separation assemblies. The issues identified fall into a gray area of the codes.

  11. Paired Learners' Verbalised Strategies for Determining Grammatical Correctness: A Turn-Based System for Coding Metatalk

    ERIC Educational Resources Information Center

    Ishii, David N.

    2011-01-01

    The purpose of this paper is to explore the use of a new coding system that incorporates the various types of metatalk that occurred during paired learners' engagement in a consciousness-raising task. On the basis of previous studies, metalanguage (e.g. with or without terminology), knowledge sources (e.g. intuition), and verbalisation strategies…

  12. Recent development for the ITS code system: Parallel processing and visualization

    SciTech Connect

    Fan, W.C.; Turner, C.D.; Halbleib, J.A. Sr.; Kensek, R.P.

    1996-03-01

    A brief overview is given for two software developments related to the ITS code system. These developments provide parallel processing and visualization capabilities and thus allow users to perform ITS calculations more efficiently. Timing results and a graphical example are presented to demonstrate these capabilities.

  13. Analyzing Multiple Dimensions of Web-based Courses: The Development and Piloting of a Coding System.

    ERIC Educational Resources Information Center

    Jiang, Mingming; Meskill, Carla

    2000-01-01

    Presents the development and piloting of a coding system for evaluation of asynchronous Web-based instruction/learning. Processes were guided by four prominent education perspectives and the extended examination of, and survey data from, 17 archived Web-based courses. These served as the bases for the development and application of the coding…

  14. [Procedure for coding the causes of death in some circulatory system diseases].

    PubMed

    Kakorina, E P; Aleksandrova, G A; Frank, G A; Mal'kov, P G; Zaĭratians, O V; Vaĭsman, D Sh

    2014-01-01

    The paper deals with the unification of requirements for coding the causes of death in circulatory system diseases, by taking into account the recently updated ICD-10 and the Consensus of the Expert Council Task Force on Pathological Anatomy, Ministry of Health of the Russian Federation (27 February 2014).

  15. Digital-coded matrix system simplifies design and construction of flow charts

    NASA Technical Reports Server (NTRS)

    Otoole, E.

    1971-01-01

    Matrix system utilizing unique digital code enables drawing block diagrams with parallel blocks. Complete freedom is obtained in laying out diagram, and it is possible to go directly from matrix to finished drawing. Need to rough out diagram is eliminated and time involved is greatly reduced.

  16. Users manual for the FORSS sensitivity and uncertainty analysis code system

    SciTech Connect

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  17. GRAVE: An Interactive Geometry Construction and Visualization Software System for the TORT Nuclear Radiation Transport Code

    SciTech Connect

    Blakeman, E.D.

    2000-05-07

    A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. Also, GRAVE will be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.

  18. GENGA: a GPU code for planet formation and planetary system evolution

    NASA Astrophysics Data System (ADS)

    Lukas Grimm, Simon; Stadel, Joachim

    2015-12-01

    We present GENGA, a GPU code designed and optimised for (exo)planetary formation and orbital evolution simulations. The use of the parallel computing power of GPUs allows GENGA to achieve a significant speedup compared to other N-body codes; GENGA runs about 30-50 times faster than the Mercury code. GENGA can be used in three different computational modes. The main mode integrates an N-body system with up to 8192 fully interacting planetesimals orbiting a central mass. The test-particle mode can include up to 1 million massless bodies in the presence of massive planets or protoplanets. The third mode allows the parallel integration of up to 100,000 samples of small exoplanetary systems with different parameters. With this functionality, GENGA can be used in a variety of applications in planetary and exoplanetary science. Possible applications of GENGA include the late stage of terrestrial planet formation, studying core accretion models for gas giants in the presence of planetesimals, simulating the evolution of asteroids and asteroid families, finding stable configurations of exoplanetary systems to constrain the detected orbital parameters, and many more. Since such simulations can often take billions of time steps to complete, or require coverage of a very large parameter space, a highly optimised code running on today's most efficient hardware is necessary. As a bonus, the use of GPUs allows real-time visualisation of the simulations on screen. In our presentation we give an overview of the capabilities of the code and discuss the newest results and applications of GENGA. The code is published as open-source software at https://bitbucket.org/sigrimm/genga.
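    The computational core of such codes is a symplectic N-body integrator over direct-summation gravity. A serial toy version (illustrative only; GENGA's actual integrator and GPU kernels are far more sophisticated) shows the structure:

```python
import numpy as np

def accel(pos, mass, G=1.0):
    """Direct-summation gravitational accelerations: the O(N^2) pair loop
    that GPU N-body codes parallelise."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += G * mass[j] * d / np.linalg.norm(d) ** 3
    return acc

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick symplectic integrator (sketch)."""
    a = accel(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * a
        pos += dt * vel
        a = accel(pos, mass)
        vel += 0.5 * dt * a
    return pos, vel

# circular two-body test: near-massless planet at r = 1 around a unit-mass star
pos = np.array([[0.0, 0.0], [1.0, 0.0]])
vel = np.array([[0.0, 0.0], [0.0, 1.0]])   # circular speed sqrt(GM/r) = 1
mass = np.array([1.0, 1e-9])
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=628)  # ~ one orbit (2*pi)
```

    After one orbital period the planet returns to its starting point with the orbital radius preserved, the long-term stability property that makes symplectic schemes the standard choice for billion-step planetary integrations.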

  19. Neuroprotective activity of thioctic acid in central nervous system lesions consequent to peripheral nerve injury.

    PubMed

    Tomassoni, Daniele; Amenta, Francesco; Di Cesare Mannelli, Lorenzo; Ghelardini, Carla; Nwankwo, Innocent E; Pacini, Alessandra; Tayebati, Seyed Khosrow

    2013-01-01

    Peripheral neuropathies are heterogeneous disorders often presenting with hyperalgesia and allodynia. This study assessed whether chronic constriction injury (CCI) of the sciatic nerve is accompanied by increased oxidative stress and central nervous system (CNS) changes, and whether these changes are sensitive to treatment with thioctic acid. Thioctic acid is a naturally occurring antioxidant existing in two optical isomers, (+)- and (-)-thioctic acid, and in the racemic form. It has been proposed for treating disorders associated with increased oxidative stress. Sciatic nerve CCI was made in spontaneously hypertensive rats (SHRs) and in normotensive reference cohorts. Rats were untreated or treated intraperitoneally for 14 days with (+/-)-, (+)-, or (-)-thioctic acid. Oxidative stress, astrogliosis, myelin sheath status, and neuronal injury in motor and sensory cerebrocortical areas were assessed. Increases in oxidative stress markers, astrogliosis, and neuronal damage, accompanied by decreased expression of neurofilament, were observed in SHRs. This phenomenon was more pronounced after CCI. Thioctic acid countered astrogliosis and neuronal damage, (+)-thioctic acid being more active than the (+/-)- or (-)-enantiomers. These findings suggest a neuroprotective activity of thioctic acid on CNS lesions consequent to CCI and that the compound may represent a therapeutic option for entrapment neuropathies.

  20. Evaluation of Possible Consequences of Zika Virus Infection in the Developing Nervous System.

    PubMed

    Walter, Lais Takata; Higa, Guilherme Shigueto Vilar; Ikebara, Juliane Midori; Vedovello, Danila; Salvador, Felipe Scassi; Takada, Silvia Honda; Kinjo, Erika Reime; Whalley, Benjamin J; Sperança, Márcia Aparecida; Kihara, Alexandre Hiroaki

    2017-02-11

    The Zika virus (ZIKV) outbreak that occurred in the northeast of Brazil in 2015 led to alarming numbers of babies born with microcephaly in this region. Since then, several studies have evaluated the relationship between ZIKV infection and development of the malformation although the specific mechanistic interaction between ZIKV and human physiological processes that ultimately manifest as microcephaly remains debated. Importantly, most current studies did not consider the specificities of the biology and life cycle of ZIKV. As a consequence, specificities of the infection on the developing central nervous system (CNS) were frequently disregarded. In order to begin to address this important gap in our knowledge, we have collated and critically reviewed the existing evidence in this area to identify any emerging consensus on this topic and thereafter describe possible mechanisms by which ZIKV infection could interfere with specific processes of CNS development, such as neuronal proliferation, and the complex interactions of immature neurons with radial glial cells. With this, we were able to present the current knowledge on this important topic in the neurobiology field.

  1. Neuroprotective Activity of Thioctic Acid in Central Nervous System Lesions Consequent to Peripheral Nerve Injury

    PubMed Central

    Ghelardini, Carla; Nwankwo, Innocent E.; Pacini, Alessandra

    2013-01-01

    Peripheral neuropathies are heterogeneous disorders often presenting with hyperalgesia and allodynia. This study assessed whether chronic constriction injury (CCI) of the sciatic nerve is accompanied by increased oxidative stress and central nervous system (CNS) changes, and whether these changes are sensitive to treatment with thioctic acid. Thioctic acid is a naturally occurring antioxidant existing in two optical isomers, (+)- and (−)-thioctic acid, and in the racemic form. It has been proposed for treating disorders associated with increased oxidative stress. Sciatic nerve CCI was made in spontaneously hypertensive rats (SHRs) and in normotensive reference cohorts. Rats were untreated or treated intraperitoneally for 14 days with (+/−)-, (+)-, or (−)-thioctic acid. Oxidative stress, astrogliosis, myelin sheath status, and neuronal injury in motor and sensory cerebrocortical areas were assessed. Increases in oxidative stress markers, astrogliosis, and neuronal damage, accompanied by decreased expression of neurofilament, were observed in SHRs. This phenomenon was more pronounced after CCI. Thioctic acid countered astrogliosis and neuronal damage, (+)-thioctic acid being more active than the (+/−)- or (−)-enantiomers. These findings suggest a neuroprotective activity of thioctic acid on CNS lesions consequent to CCI and that the compound may represent a therapeutic option for entrapment neuropathies. PMID:24527432

  2. A Low Cost Correlator Structure in the Pseudo-Noise Code Acquisition System

    NASA Astrophysics Data System (ADS)

    Lu, Weijun; Li, Ying; Yu, Dunshan; Zhang, Xing

    The critical problem in pseudo-noise (PN) code acquisition systems is the trade-off between acquisition performance and computational complexity. This paper presents a low-cost correlator (LCC) structure that can search two PN code phases in a single accumulation period by eliminating redundant computation. Compared with the part-parallel structure composed of two serial correlators (PARALLEL2), the proposed LCC structure has the same performance while saving about 22% chip area and 34% power consumption when using a carry-look-ahead (CLA) adder, and 17% chip area and 25% power consumption when using a ripple-carry (RPL) adder.
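    The task these correlator structures accelerate is code-phase search: correlating the received signal against shifted replicas of the local PN code until the peak reveals the transmitter's phase. A minimal serial-search sketch (the PN sequence, noise level, and offset are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
pn = rng.choice([-1.0, 1.0], size=127)     # stand-in PN code (127 chips)

true_phase = 40                            # offset unknown to the receiver
rx = np.roll(pn, true_phase) + 0.5 * rng.standard_normal(pn.size)

# serial search: accumulate the correlation at every candidate code phase.
# A part-parallel correlator evaluates several phases per accumulation
# period, which is the cost the LCC structure optimises.
corr = np.array([rx @ np.roll(pn, p) for p in range(pn.size)])
acquired_phase = int(corr.argmax())
```

    The in-phase correlation accumulates to the full code length (127) while misaligned phases average toward zero, so the argmax recovers the transmitted phase even in noise.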

  3. Space communication system for compressed data with a concatenated Reed-Solomon-Viterbi coding channel

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Hilbert, E. E. (Inventor)

    1976-01-01

    A space communication system incorporating a concatenated Reed-Solomon/Viterbi coding channel is discussed for transmitting compressed and uncompressed data from a spacecraft to a data processing center on Earth. Imaging (and other) data are first compressed into source blocks, which are then coded by a Reed-Solomon coder and interleaver, followed by a convolutional encoder. The received data are first decoded by a Viterbi decoder, followed by a Reed-Solomon decoder and deinterleaver. The output of the latter is then decompressed, based on the compression criteria used in compressing the data in the spacecraft. The decompressed data are processed to reconstruct an approximation of the original data-producing condition or images.
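    The interleaver between the Reed-Solomon and convolutional stages exists because a Viterbi decoder emits errors in bursts; interleaving spreads each burst across several Reed-Solomon codewords so that no single codeword exceeds its correction capacity. A minimal block-interleaver sketch (dimensions illustrative):

```python
import numpy as np

def interleave(symbols, depth, length):
    """Block interleaver: write row-by-row, read column-by-column."""
    return np.asarray(symbols).reshape(depth, length).T.ravel()

def deinterleave(symbols, depth, length):
    return np.asarray(symbols).reshape(length, depth).T.ravel()

# `depth` rows stand for `depth` Reed-Solomon codewords of `length` symbols
data = np.arange(12)
tx = interleave(data, depth=3, length=4)

rx = tx.copy()
rx[4:7] = -1                      # 3-symbol error burst on the channel
out = deinterleave(rx, depth=3, length=4)
# the burst now touches each codeword (row) only once
rows_hit = {int(i) // 4 for i in np.flatnonzero(out == -1)}
```

    Here a burst of three consecutive channel errors lands as a single symbol error in each of the three codewords, exactly the per-codeword load a Reed-Solomon decoder handles easily.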

  4. Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system

    NASA Technical Reports Server (NTRS)

    Appa, Kari; Smith, Michael J. C.

    1987-01-01

    A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.

  5. Temporal code in the vibrissal system Part II: Roughness surface discrimination

    NASA Astrophysics Data System (ADS)

    Farfán, F. D.; Albarracín, A. L.; Felice, C. J.

    2007-11-01

    Previous works have proposed hypotheses about the neural code of the tactile system in the rat. One is based on the physical characteristics of the vibrissae, such as their resonance frequency; another is based on discharge patterns in the trigeminal ganglion. In this work, the purpose is to find a temporal code by analyzing the afferent signals of two vibrissal nerves while the vibrissae sweep surfaces of different roughness. Two levels of pressure between the vibrissa and the contact surface were used. We analyzed the afferent discharge of the DELTA and GAMMA vibrissal nerves. The vibrissal movements were produced by electrical stimulation of the facial nerve. The afferent signals were analyzed using an event detection algorithm based on the Continuous Wavelet Transform (CWT). The algorithm was able to detect events of different duration. The inter-event times detected were calculated for each situation and represented in box plots. This work established the existence of a temporal code at the peripheral level.
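    A CWT-based event detector of this general kind can be sketched with a self-contained Ricker-wavelet transform: flag samples where the maximum coefficient magnitude across scales exceeds a threshold. This is a simplified stand-in for the paper's algorithm (which also characterises event duration); the signal, scales, and threshold are invented for the example.

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet, a common analysing wavelet for CWT."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_event_indices(sig, scales, thresh):
    """Toy CWT event detector: CWT via convolution at several scales,
    then threshold the maximum coefficient magnitude across scales."""
    coefs = [np.convolve(sig, ricker(10 * a, a), mode='same') for a in scales]
    energy = np.abs(np.array(coefs)).max(axis=0)
    return np.flatnonzero(energy > thresh)

# synthetic afferent record: two unit "spikes" on a silent baseline
sig = np.zeros(1000)
sig[200] = 1.0
sig[700] = 1.0
events = cwt_event_indices(sig, scales=[2, 4, 8], thresh=0.5)
```

    Analysing at several scales at once is what lets the detector pick up events of different duration with a single threshold.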

  6. [An update of the diagnostic coding system by the Spanish Society of Pediatric Emergencies].

    PubMed

    Benito Fernández, J; Luaces Cubells, C; Gelabert Colomé, G; Anso Borda, I

    2015-06-01

    The Quality Working Group of the Spanish Society of Pediatric Emergencies (SEUP) presents an update of the diagnostic coding list. The original list was prepared and published in Anales de Pediatría in 2000, being based on the International Coding system ICD-9-CM current at that time. Following the same methodology used at that time and based on the 2014 edition of the ICD-9-CM, 35 new codes have been added to the list, 15 have been updated, and a list of the most frequent trauma diagnoses in pediatrics has been provided. In the current list of diagnoses, SEUP reflects the significant changes that have taken place in Pediatric Emergency Services in the last decade.

  7. A DS-UWB Cognitive Radio System Based on Bridge Function Smart Codes

    NASA Astrophysics Data System (ADS)

    Xu, Yafei; Hong, Sheng; Zhao, Guodong; Zhang, Fengyuan; di, Jinshan; Zhang, Qishan

    This paper proposes a direct-sequence UWB cognitive radio system based on a bridge-function smart sequence matrix and the Gaussian pulse. Since the system uses the bridge-function smart code sequence as its spreading code, the zero-correlation zones (ZCZs) of the bridge-function sequences' auto-correlation functions can reduce multipath-fading interference with the pulse. The modulated signal was sent into the IEEE 802.15.3a UWB channel. We analyze the ZCZs' suppression of multipath interference (MPI), one of the main sources of system interference. The simulation in SIMULINK/MATLAB is described in detail. The results show that the system performs better than one employing a Walsh sequence square matrix, and this was verified in principle by the formula.

  8. A coded structured light system based on primary color stripe projection and monochrome imaging.

    PubMed

    Barone, Sandro; Paoli, Alessandro; Razionale, Armando Viviano

    2013-10-14

    Coded Structured Light techniques represent one of the most attractive research areas within the field of optical metrology. The coding procedures are typically based on projecting either a single pattern or a temporal sequence of patterns to provide 3D surface data. In this context, multi-slit or stripe colored patterns may be used with the aim of reducing the number of projected images. However, color imaging sensors require the use of calibration procedures to address crosstalk effects between different channels and to reduce the chromatic aberrations. In this paper, a Coded Structured Light system has been developed by integrating a color stripe projector and a monochrome camera. A discrete coding method, which combines spatial and temporal information, is generated by sequentially projecting and acquiring a small set of fringe patterns. The method allows the concurrent measurement of geometrical and chromatic data by exploiting the benefits of using a monochrome camera. The proposed methodology has been validated by measuring nominal primitive geometries and free-form shapes. The experimental results have been compared with those obtained by using a time-multiplexing gray code strategy.
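    The paper's own coding combines spatial and temporal information; as a generic illustration of the time-multiplexing gray code strategy it is compared against, the sketch below generates binary stripe patterns and decodes a column index from the bit sequence observed at one pixel. The 1024-column, 10-pattern resolution is an assumption for illustration.

```python
import numpy as np

def gray_code_patterns(width, n_bits):
    """Stack of n_bits stripe patterns; pattern[k, x] is bit k (MSB first)
    of the Gray code of projector column x."""
    x = np.arange(width)
    gray = x ^ (x >> 1)  # binary-reflected Gray code
    return np.array([(gray >> (n_bits - 1 - k)) & 1 for k in range(n_bits)])

def decode_column(bits):
    """Recover the column index from the bit sequence seen at one pixel."""
    gray = 0
    for b in bits:
        gray = (gray << 1) | int(b)
    binary, mask = gray, gray >> 1
    while mask:          # Gray -> binary conversion
        binary ^= mask
        mask >>= 1
    return binary

patterns = gray_code_patterns(width=1024, n_bits=10)
decoded = [decode_column(patterns[:, x]) for x in (0, 1, 511, 1023)]
```

Adjacent columns differ in exactly one pattern, which is what makes gray coding robust to decoding errors at stripe boundaries.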

  9. Consequences of insect herbivory on grape fine root systems with different growth rates.

    PubMed

    Bauerle, T L; Eissenstat, D M; Granett, J; Gardner, D M; Smart, D R

    2007-07-01

    Herbivory tolerance has been linked to plant growth rate, where plants with fast growth rates are hypothesized to be more tolerant of herbivory than slower-growing plants. Evidence supporting this theory has been taken primarily from observations of aboveground organs but rarely from roots. Grapevines differing in overall rates of new root production were studied in Napa Valley, California over two growing seasons in an established vineyard infested with the sucking insect grape phylloxera (Daktulosphaira vitifoliae Fitch). The experimental vineyard allowed the comparison of two root systems that differed in rates of new root tip production (a 'fast grower', Vitis berlandieri x Vitis rupestris cv. 1103P, and a slower-growing stock, Vitis riparia x Vitis rupestris cv. 101-14 Mgt). Each root system was grafted with a genetically identical shoot system (Vitis vinifera cv. Merlot). Using minirhizotrons, we did not observe any evidence of spatial or temporal avoidance of insect populations by root growth. Insect infestations were abundant throughout the soil profile, and seasonal peaks in phylloxera populations generally followed peaks in new root production closely. Our data supported the hypothesis that insect infestation was proportional to the number of growing tips, as indicated by similar per cent infestation in spite of a threefold difference in root tip production. In addition, infested roots of the fast-growing rootstock exhibited somewhat shorter median lifespans (60 d) than those of the slower-growing rootstock (85 d). Lifespans of uninfested roots were similar for the two rootstocks (200 d). As a consequence of greater mortality of younger roots, infested root populations in the fast-growing rootstock had an older age structure. While there does not seem to be a trade-off between potential growth rate and relative rate of root infestation in these cultivars, our study indicates that a fast-growing root system may more readily shed infested roots that are

  10. A novel repetition space-time coding scheme for mobile FSO systems

    NASA Astrophysics Data System (ADS)

    Li, Ming; Cao, Yang; Li, Shu-ming; Yang, Shao-wen

    2015-03-01

    Considering the influence of stronger atmospheric turbulence, worse pointing errors, and highly dynamic links on the transmission performance of mobile multiple-input multiple-output (MIMO) free-space optics (FSO) communication systems, this paper establishes a channel model for the mobile platform. Based on the combination of Alamouti space-time coding and time-hopping ultra-wideband (TH-UWB) communications, a novel repetition space-time coding (RSTC) method for mobile 2×2 free-space optical communications with pulse position modulation (PPM) is developed. In particular, two decoding methods, equal gain combining (EGC) maximum likelihood detection (MLD) and correlation matrix detection (CMD), are derived. When a quasi-static fading and weak-turbulence channel model is considered, simulation results show that whether the channel state information (CSI) is known or not, the coded system achieves significantly better symbol error rate (SER) performance than the uncoded one. In other words, transmit diversity can be achieved while conveying the information only through the time delays of the modulated signals transmitted from different antennas. CMD achieves almost the same signal-combining effect as maximal ratio combining (MRC). However, when the channel correlation increases, the SER performance of the coded 2×2 system degrades significantly.

  11. Channel coding and data compression system considerations for efficient communication of planetary imaging data

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1974-01-01

    End-to-end system considerations involving channel coding and data compression are reported which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft. In addition to presenting new and potentially significant system considerations, this report attempts to fill a need for a comprehensive tutorial which makes much of this very subject accessible to readers whose disciplines lie outside of communication theory.

  12. National Combustion Code, a Multidisciplinary Combustor Design System, Will Be Transferred to the Commercial Sector

    NASA Technical Reports Server (NTRS)

    Steele, Gynelle C.

    1999-01-01

    The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.

  13. ETRANS: an energy transport system optimization code for distributed networks of solar collectors

    SciTech Connect

    Barnhart, J.S.

    1980-09-01

    The optimization code ETRANS was developed at the Pacific Northwest Laboratory to design and estimate the costs associated with energy transport systems for distributed fields of solar collectors. The code uses frequently cited layouts for dish and trough collectors and optimizes them on a section-by-section basis. The optimal section design is that combination of pipe diameter and insulation thickness that yields the minimum annualized system-resultant cost. Among the quantities included in the costing algorithm are (1) labor and materials costs associated with initial plant construction, (2) operating expenses due to daytime and nighttime heat losses, and (3) operating expenses due to pumping power requirements. Two preliminary series of simulations were conducted to exercise the code. The results indicate that transport system costs for both dish and trough collector fields increase with field size and receiver exit temperature. Furthermore, dish collector transport systems were found to be much more expensive to build and operate than trough transport systems. ETRANS itself is stable and fast-running and shows promise of being a highly effective tool for the analysis of distributed solar thermal systems.
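    The section-by-section optimization described above, choosing the pipe diameter and insulation thickness that minimize annualized cost, can be illustrated with a toy grid search. All cost coefficients below are invented placeholders for the three cost terms named in the abstract, not ETRANS data.

```python
import numpy as np

# Illustrative design grids (not ETRANS values).
diameters = np.linspace(0.05, 0.30, 26)        # pipe inner diameter, m
insulations = np.linspace(0.01, 0.20, 20)      # insulation thickness, m
D, T = np.meshgrid(diameters, insulations, indexing="ij")

capital = 200.0 * D + 150.0 * T               # annualized construction cost
heat_loss = 5.0 / (1.0 + 40.0 * T)            # operating cost of heat losses
pumping = 1e-4 / D ** 5                       # pumping power, ~1/D^5 scaling
annualized = capital + heat_loss + pumping    # total annualized section cost

# Optimal section design: the (diameter, insulation) pair of minimum cost.
i, j = np.unravel_index(np.argmin(annualized), annualized.shape)
best = {"diameter_m": float(diameters[i]),
        "insulation_m": float(insulations[j]),
        "cost": float(annualized[i, j])}
```

The same exhaustive search would be repeated for each pipe section of the collector field layout.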

  14. Code System for Producing Pointwise and Multigroup Neutron and Photon Cross Sections from ENDF/B Data.

    SciTech Connect

    MACFARLANE, ROBERT E.

    1996-12-19

    Version 03 The NJOY nuclear data processing system is a comprehensive computer code system for producing pointwise and multigroup cross sections and related quantities from ENDF/B evaluated nuclear data in the ENDF format, including the latest US library, ENDF/B-VI. The NJOY code works with neutrons, photons, and charged particles and produces libraries for a wide variety of particle transport and reactor analysis codes.

  15. Consequence of Electron Mobility in Icy Grains on Solar System Objects

    NASA Astrophysics Data System (ADS)

    Gudipati, Murthy; Allamandola, L. J.; Cooper, J. F.; Sturner, S.; Johnson, R. E.

    2007-12-01

    Solar system ices have been shown to contain organic molecules, whether in the ice on Mars, comets such as Tempel-1 or on the surfaces of Europa, Ganymede, and Callisto. Sub-surface oceans containing ionic salts have been proposed to interpret the induced components of the local magnetic fields at these Galilean moons. Recent laboratory studies have shown that radiation processing of water-rich ices containing aromatic organic impurities readily ionizes organic molecules imbedded in an ice matrix. As a result, transient charge separation is produced more efficiently in ices containing organic impurities. This charge separation is partially stabilized by electron trapping. This could have important consequences since the icy moons of the giant planets are imbedded in both the magnetic field and trapped particle radiation environments of the planetary magnetospheres. Here we present new experimental results and theoretical modelling that deals with mobility of electrons produced by photoionization of PAHs (polycyclic aromatic hydrocarbons) in an ice matrix. We find that a small portion of the electrons (about 5% of the originally generated) are weakly trapped in the impurity-containing ices and can be made mobile at temperatures between 50 K and 125 K. Current flow of these mobile electrons could affect electrical conductivity of the irradiated surfaces and contribute to induced magnetic fields. This solid-state micro-ionospheric environment, comparable to a thin metallic conducting shell, may then need to be taken into account, along with the above-surface ionosphere, in modelling background variations affecting detection of induced magnetic fields from the sub-surface oceans. References: 1. M. S. Gudipati, L. J. Allamandola, J. F. Cooper, S. Sturner, R. E. Johnson (in preparation) 2. J. F. Cooper, R. E. Johnson, B. H. Mauk, H. B. Garrett, N. Gehrels, Icarus 149, 133 (2001). 3. M. S. Gudipati, L. J. Allamandola, Journal of Physical Chemistry A 110, 9020 (2006).

  16. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    SciTech Connect

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well, and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  17. Technique for using a geometry and visualization system to monitor and manipulate information in other codes

    NASA Technical Reports Server (NTRS)

    Dickens, Thomas P.

    1992-01-01

    A technique was developed to allow the Aero Grid and Paneling System (AGPS), a geometry and visualization system, to be used as a dynamic real-time geometry monitor, manipulator, and interrogator for other codes. This technique involves the direct connection of AGPS with one or more external codes through the use of Unix pipes. AGPS has several commands that control communication with the external program. The external program uses several special subroutines that allow simple, direct communication with AGPS. The external program creates AGPS command lines and transmits the line over the pipes or communicates on a subroutine level. AGPS executes the commands, displays graphics/geometry information, and transmits the required solutions back to the external program. The basic ideas discussed in this paper could easily be implemented in other graphics/geometry systems currently in use or under development.

  18. SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems

    SciTech Connect

    Wendelin, Tim; Dobos, Aron; Lewandowski, Allan

    2013-10-01

    SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.

  19. Image amplification based super-resolution reconstruction procedure designed for wavefront-coded imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Zong, Caihui; Wei, Jingxuan; Xie, Xiaopeng

    2016-10-01

    Wave-front coding, proposed by Dowski and Cathey in 1995, is widely known to be capable of extending the depth of focus (DOF) of incoherent imaging systems. However, benefiting from the very large point spread function (PSF) generated by a suitably designed phase mask added to the aperture plane, wave-front coding can also be used to achieve super-resolution without replacing the current sensor with one of smaller pitch size. An image-amplification-based super-resolution reconstruction procedure has been specifically designed for wave-front coded imaging systems, and its effectiveness has been tested by experiment. For instance, for a focal length of 50 mm and f-number 4.5, objects within the range [5 m, ∞] are clearly imaged with the help of wave-front coding, which indicates a DOF extension ratio of approximately 20. The proposed super-resolution reconstruction procedure produces at least 3× resolution improvement, with the quality of the reconstructed super-resolution image approaching the diffraction limit.
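    The defocus invariance underlying wave-front coding can be demonstrated numerically: with a cubic phase mask in the pupil, the PSF changes far less with defocus than the conventional PSF does. The sketch below uses a square pupil and arbitrary phase strengths (alpha = 40 rad and defocus = 8 rad are illustrative values, not the paper's system parameters).

```python
import numpy as np

def psf(alpha, defocus, n=128, pad=4):
    """Incoherent PSF of a square pupil with cubic phase alpha*(x^3+y^3)
    and defocus phase defocus*(x^2+y^2), both in radians at the pupil edge."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    pupil = np.exp(1j * (alpha * (X ** 3 + Y ** 3) + defocus * (X ** 2 + Y ** 2)))
    field = np.fft.fft2(pupil, s=(pad * n, pad * n))  # zero-padded for sampling
    p = np.abs(field) ** 2
    return p / p.sum()

def similarity(a, b):
    """Normalized cross-correlation at zero shift."""
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))

s_plain = similarity(psf(0.0, 0.0), psf(0.0, 8.0))    # conventional system
s_cubic = similarity(psf(40.0, 0.0), psf(40.0, 8.0))  # wave-front coded
```

The coded PSF is large and blurred at every depth, which is why a deconvolution or reconstruction step such as the one proposed in the paper is needed to recover a sharp image.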

  20. Context-dependent coding and gain control in the auditory system of crickets.

    PubMed

    Clemens, Jan; Rau, Florian; Hennig, R Matthias; Hildebrandt, K Jannis

    2015-10-01

    Sensory systems process stimuli that greatly vary in intensity and complexity. To maintain efficient information transmission, neural systems need to adjust their properties to these different sensory contexts, yielding adaptive or stimulus-dependent codes. Here, we demonstrated adaptive spectrotemporal tuning in a small neural network, i.e. the peripheral auditory system of the cricket. We found that tuning of cricket auditory neurons was sharper for complex multi-band than for simple single-band stimuli. Information theoretical considerations revealed that this sharpening improved information transmission by separating the neural representations of individual stimulus components. A network model inspired by the structure of the cricket auditory system suggested two putative mechanisms underlying this adaptive tuning: a saturating peripheral nonlinearity could change the spectral tuning, whereas broad feed-forward inhibition was able to reproduce the observed adaptive sharpening of temporal tuning. Our study revealed a surprisingly dynamic code usually found in more complex nervous systems and suggested that stimulus-dependent codes could be implemented using common neural computations.

  1. Rate 8/9 coded 8-PSK system for downlink applications

    NASA Technical Reports Server (NTRS)

    Fang, Russell; Kappes, Mark; Miller, Susan

    1992-01-01

    An advanced Coded Trellis Modulation (CTM) system, which achieves a 2 bit/s/Hz bandwidth efficiency at an information rate of 200 Mbit/s while minimizing satellite power requirements, was developed for downlink earth station applications. The CTM system employs a high-speed rate 8/9 convolutional code with Viterbi decoding and an 8-Phase Shift Keying (PSK) modem. The minimum Euclidean distance between the modulated waveforms corresponding to the information sequences is maximized in order to maximize the noise immunity of the system. Square-root Nyquist filters with 40 percent roll-off are used at the transmit and receive sides of the modem in order to minimize intersymbol interference, adjacent channel interference, and distortion at the nonlinear satellite power amplifier. The use of a coded system here also minimizes the effects of co-channel interference. The performance of the developed hardware system was measured to be within 1.5 dB of theory at a bit error rate of 5 x 10(exp -7) over an additive white Gaussian noise channel.
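    For reference, the minimum Euclidean distance mentioned above is easy to compute for a unit-energy 8-PSK constellation; this is a generic property of 8-PSK signal points, not the free distance of the coded system itself.

```python
import numpy as np

# Unit-energy 8-PSK constellation points on the complex plane.
points = np.exp(1j * 2 * np.pi * np.arange(8) / 8)

# Pairwise Euclidean distances; the minimum, between adjacent phases,
# is 2*sin(pi/8) ~= 0.765.
dists = [abs(points[i] - points[j]) for i in range(8) for j in range(i + 1, 8)]
d_min = min(dists)
```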

  2. MIMO Radar System for Respiratory Monitoring Using Tx and Rx Modulation with M-Sequence Codes

    NASA Astrophysics Data System (ADS)

    Miwa, Takashi; Ogiwara, Shun; Yamakoshi, Yoshiki

    The importance of respiratory monitoring systems during sleep has increased with the need for early diagnosis of sleep apnea syndrome (SAS) in the home. This paper presents a simple respiratory monitoring system suitable for home use that provides 3D ranging of targets. The range resolution and azimuth resolution are obtained by a stepped-frequency transmitted signal and by MIMO arrays doubly modulated in transmission and reception with preferred-pair M-sequence codes, respectively. Due to the use of these codes, Gold sequence codes corresponding to all the antenna combinations are equivalently modulated at the receiver. The signal-to-interchannel-interference ratio of the reconstructed image is evaluated by numerical simulations. The results of experiments on a developed prototype 3D-MIMO radar system show that the system can extract only the respiratory motion of a human subject 2 m apart from a metallic rotatable reflector. Moreover, it is found that the system can successfully measure the respiration information of sleeping human subjects for 96.6 percent of the whole measurement time, except for instances of large posture change.
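    The preferred-pair and Gold-code construction is beyond a short sketch, but the basic maximal-length sequence and the autocorrelation property such radar codes exploit can be illustrated with a small linear feedback shift register. The degree-5 register with taps 5 and 3 is an arbitrary textbook example, not the system's actual code.

```python
import numpy as np

def m_sequence(taps, n_bits):
    """Maximal-length sequence from a Fibonacci LFSR with the given
    feedback taps (1-indexed stage numbers); register seeded all-ones."""
    state = [1] * n_bits
    seq = []
    for _ in range(2 ** n_bits - 1):
        seq.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(seq)

# Degree-5 maximal LFSR (taps 5 and 3): period 2**5 - 1 = 31.
seq = m_sequence(taps=(5, 3), n_bits=5)
bipolar = 1 - 2 * seq  # map {0, 1} -> {+1, -1}

# Periodic autocorrelation of an m-sequence: N at zero lag, -1 elsewhere --
# the near-impulsive property that lets coded channels be separated.
N = len(bipolar)
acf = np.array([int(np.dot(bipolar, np.roll(bipolar, k))) for k in range(N)])
```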

  3. The EPQ Code System for Simulating the Thermal Response of Plasma-Facing Components to High-Energy Electron Impact

    SciTech Connect

    Ward, Robert Cameron; Steiner, Don

    2004-06-15

    The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Center of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the Quickest third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system is validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code with analytical solutions shows that the code with the Quickest method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN against experiments showed that QTTN's erosion tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are thus verified and validated as able to simulate the thermal response of PFCs to runaway electron impact.

  4. Integrated TIGER Series of Coupled Electron/Photon Monte Carlo Transport Codes System.

    SciTech Connect

    VALDEZ, GREG D.

    2012-11-30

    Version: 00 Distribution is restricted to US Government Agencies and Their Contractors Only. The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 95. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  5. Sensorineural hearing loss amplifies neural coding of envelope information in the central auditory system of chinchillas.

    PubMed

    Zhong, Ziwei; Henry, Kenneth S; Heinz, Michael G

    2014-03-01

    People with sensorineural hearing loss often have substantial difficulty understanding speech under challenging listening conditions. Behavioral studies suggest that reduced sensitivity to the temporal structure of sound may be responsible, but underlying neurophysiological pathologies are incompletely understood. Here, we investigate the effects of noise-induced hearing loss on coding of envelope (ENV) structure in the central auditory system of anesthetized chinchillas. ENV coding was evaluated noninvasively using auditory evoked potentials recorded from the scalp surface in response to sinusoidally amplitude modulated tones with carrier frequencies of 1, 2, 4, and 8 kHz and a modulation frequency of 140 Hz. Stimuli were presented in quiet and in three levels of white background noise. The latency of scalp-recorded ENV responses was consistent with generation in the auditory midbrain. Hearing loss amplified neural coding of ENV at carrier frequencies of 2 kHz and above. This result may reflect enhanced ENV coding from the periphery and/or an increase in the gain of central auditory neurons. In contrast to expectations, hearing loss was not associated with a stronger adverse effect of increasing masker intensity on ENV coding. The exaggerated neural representation of ENV information shown here at the level of the auditory midbrain helps to explain previous findings of enhanced sensitivity to amplitude modulation in people with hearing loss under some conditions. Furthermore, amplified ENV coding may potentially contribute to speech perception problems in people with cochlear hearing loss by acting as a distraction from more salient acoustic cues, particularly in fluctuating backgrounds.
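    The stimulus itself is simple to construct: a sinusoidally amplitude-modulated (SAM) tone at one of the abstract's carrier frequencies (4 kHz) with the 140 Hz modulation frequency. The sampling rate, duration, and full modulation depth below are assumptions for illustration.

```python
import numpy as np

def sam_tone(fc, fm, depth, dur, fs):
    """SAM tone: s(t) = [1 + m*sin(2*pi*fm*t)] * sin(2*pi*fc*t)."""
    t = np.arange(int(dur * fs)) / fs
    return (1 + depth * np.sin(2 * np.pi * fm * t)) * np.sin(2 * np.pi * fc * t)

fs = 48000.0                                  # assumed sampling rate, Hz
s = sam_tone(fc=4000.0, fm=140.0, depth=1.0, dur=0.5, fs=fs)

# Rectifying the tone (a crude envelope follower) exposes a strong spectral
# component at the 140 Hz modulation frequency -- the component that the
# scalp-recorded envelope-following response is measured at.
spec = np.abs(np.fft.rfft(np.abs(s)))
freqs = np.fft.rfftfreq(len(s), 1.0 / fs)
peak_freq = freqs[1:][np.argmax(spec[1:])]    # skip the DC bin
```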

  6. Code System for Calculating Radiation Exposure Resulting from Accidental Radioactive Releases to the Hydrosphere.

    SciTech Connect

    1982-11-18

    Version 00 LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-D) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  7. 13 CFR 121.201 - What size standards has SBA identified by North American Industry Classification System codes?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... identified by North American Industry Classification System codes? 121.201 Section 121.201 Business Credit... has SBA identified by North American Industry Classification System codes? Link to an amendment...) 221115 Wind Electric Power Generation (see footnote 1) 221116 Geothermal Electric Power Generation...

  8. 13 CFR 121.201 - What size standards has SBA identified by North American Industry Classification System codes?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... identified by North American Industry Classification System codes? 121.201 Section 121.201 Business Credit... has SBA identified by North American Industry Classification System codes? Link to an amendment... Generation (see footnote 1) 221116 Geothermal Electric Power Generation (see footnote 1) 221117...

  9. Validation of Framework Code Approach to a Life Prediction System for Fiber Reinforced Composites

    NASA Technical Reports Server (NTRS)

    Gravett, Phillip

    1997-01-01

    The grant was conducted by the MMC Life Prediction Cooperative, an industry/government collaborative team; Ohio Aerospace Institute (OAI) acted as the prime contractor on behalf of the Cooperative for this grant effort. See Figure I for the organization and responsibilities of team members. The technical effort was conducted during the period August 7, 1995 to June 30, 1996 in cooperation with Erwin Zaretsky, the LERC Program Monitor. Phil Gravett of Pratt & Whitney was the principal technical investigator. Table I documents all meeting-related coordination memos during this period. The effort under this grant was closely coordinated with an existing USAF sponsored program focused on putting into practice a life prediction system for turbine engine components made of metal matrix composites (MMC). The overall architecture of the MMC life prediction system was defined in the USAF sponsored program (prior to this grant). The efforts of this grant were focused on implementing and tailoring the life prediction system, the framework code within it, and the damage modules within it to meet the specific requirements of the Cooperative. The tailoring of the life prediction system provides the basis for pervasive and continued use of this capability by the industry/government cooperative. The outputs of this grant are: 1. Definition of the framework code to analysis modules interfaces, 2. Definition of the interface between the materials database and the finite element model, and 3. Definition of the integration of the framework code into an FEM design tool.

  10. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    PubMed

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

    The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72, P < .01). Child Facial Coding System scores were also significantly higher during the passive joint stretch than the baseline and recovery segments (P < .001). Facial activity was not significantly correlated with the developmental measures. These findings suggest that the Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy.
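    The concurrent-validity statistic reported above (r = .72) is a Pearson correlation between paired ratings. With hypothetical paired scores (invented here purely for illustration, not the study's data), it would be computed as:

```python
import numpy as np

# Hypothetical paired ratings: one Child Facial Coding System score and
# one Numerical Rating Scale pain rating per child.
cfcs = np.array([2, 5, 1, 7, 4, 6, 3, 8, 2, 5], dtype=float)
nrs = np.array([1, 4, 2, 8, 3, 7, 3, 9, 1, 6], dtype=float)

r = np.corrcoef(cfcs, nrs)[0, 1]  # Pearson correlation coefficient
```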

  11. DANDE: a linked code system for core neutronics/depletion analysis

    SciTech Connect

    LaBauve, R.J.; England, T.R.; George, D.C.; MacFarlane, R.E.; Wilson, W.B.

    1985-06-01

    This report describes DANDE - a modular neutronics, depletion code system for reactor analysis. It consists of nuclear data processing, core physics, and fuel depletion modules, and allows one to use diffusion and transport methods interchangeably in core neutronics calculations. This latter capability is especially important in the design of small modular cores. Additional unique features include the capability of updating the nuclear data file during a calculation; a detailed treatment of depletion, burnable poisons as well as fuel; and the ability to make geometric changes such as control rod repositioning and fuel relocation in the course of a calculation. The detailed treatment of reactor fuel burnup, fission-product creation and decay, as well as inventories of higher-order actinides is a necessity when predicting the behavior of reactor fuel under increased burn conditions. The operation of the code system is made clear in this report by following a sample problem.

  12. Measurements with Pinhole and Coded Aperture Gamma-Ray Imaging Systems

    SciTech Connect

    Raffo-Caiado, Ana Claudia; Solodov, Alexander A; Abdul-Jabbar, Najeb M; Hayward, Jason P; Ziock, Klaus-Peter

    2010-01-01

    From a safeguards perspective, gamma-ray imaging has the potential to reduce manpower and cost for effectively locating and monitoring special nuclear material. The purpose of this project was to investigate the performance of pinhole and coded aperture gamma-ray imaging systems at Oak Ridge National Laboratory (ORNL). With the aid of the European Commission Joint Research Centre (JRC), radiometric data will be combined with scans from a three-dimensional design information verification (3D-DIV) system. Measurements were performed at the ORNL Safeguards Laboratory using sources that model holdup in radiological facilities. They showed that for situations with moderate amounts of solid or dense U sources, the coded aperture was able to predict source location and geometry within ~7% of actual values, while the pinhole gave a broad representation of source distributions.

  13. A computer code for three-dimensional incompressible flows using nonorthogonal body-fitted coordinate systems

    NASA Technical Reports Server (NTRS)

    Chen, Y. S.

    1986-01-01

    In this report, a numerical method for solving the equations of motion of three-dimensional incompressible flows in nonorthogonal body-fitted coordinate (BFC) systems has been developed. The equations of motion are transformed to a generalized curvilinear coordinate system from which the transformed equations are discretized using finite difference approximations in the transformed domain. The hybrid scheme is used to approximate the convection terms in the governing equations. Solutions of the finite difference equations are obtained iteratively by using a pressure-velocity correction algorithm (SIMPLE-C). Numerical examples of two- and three-dimensional, laminar and turbulent flow problems are employed to evaluate the accuracy and efficiency of the present computer code. The user's guide and computer program listing of the present code are also included.
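
    The hybrid differencing scheme mentioned above (central differencing below a cell Peclet number of 2, upwind above) can be sketched on a 1-D steady convection-diffusion model problem. This is an illustrative toy, not code from the report; for brevity the boundary cells share the interior coefficients:

```python
def hybrid_convection_diffusion(n, u, gamma, length, phi0, phiL):
    """Solve steady 1-D convection-diffusion with the hybrid scheme and the
    Thomas (tridiagonal) algorithm on a uniform grid with unit density."""
    dx = length / n
    F, D = u, gamma / dx              # convective and diffusive face fluxes
    aW = max(F, D + F / 2.0, 0.0)     # hybrid scheme coefficients
    aE = max(-F, D - F / 2.0, 0.0)
    aP = aW + aE                      # continuity: Fe - Fw = 0
    a = [aP] * n                      # diagonal
    b = [-aE] * n                     # upper diagonal
    c = [-aW] * n                     # lower diagonal
    d = [0.0] * n
    d[0] += aW * phi0                 # fold boundary values into the RHS
    d[-1] += aE * phiL
    for i in range(1, n):             # forward elimination
        m = c[i] / a[i - 1]
        a[i] -= m * b[i - 1]
        d[i] -= m * d[i - 1]
    phi = [0.0] * n                   # back substitution
    phi[-1] = d[-1] / a[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - b[i] * phi[i + 1]) / a[i]
    return phi

phi = hybrid_convection_diffusion(n=50, u=1.0, gamma=0.1, length=1.0,
                                  phi0=1.0, phiL=0.0)
```

With these values the cell Peclet number is 0.2, so the scheme reduces to central differencing and the profile decays monotonically from the inlet value toward the outlet value.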

  14. Modular Python-based Code for Thomson Scattering System on NSTX-U

    NASA Astrophysics Data System (ADS)

    Horowitz, Benjamin; Diallo, Ahmed; Feibush, Eliot; Leblanc, Benoit

    2013-10-01

    Fast, accurate, and reliable measurements of electron temperature and density profiles within magnetically confined plasmas are essential for full operation of fusion devices. We detail the design and implementation of a modular Python-based code for the Thomson scattering diagnostic system of NSTX-U, which offers improvements in speed by making full use of Python's architecture, open-source module packages, and ability to be parallelized across many processors. SciPy's weave package allows the implementation of C/C++ code within our program to clear up bottlenecks in data fitting without losing the flexibility and clarity of Python, while NumPy and Matplotlib allow calculation and plotting of the processed data. Using the standard MDSplus input, we create a flexible and expandable algorithm structure which can be implemented on any fusion device utilizing a polychromator-based Thomson scattering diagnostic system. Supported by a DOE SULI Fellowship at Princeton Plasma Physics Laboratory.

  15. Reliability studies of incident coding systems in high hazard industries: A narrative review of study methodology.

    PubMed

    Olsen, Nikki S

    2013-03-01

    This paper reviews the current literature on incident coding system reliability and discusses the methods applied in the conduct and measurement of reliability. The search strategy targeted three electronic databases using a list of search terms, and the results were examined for relevance, including any additional relevant articles from the bibliographies. Twenty-five papers met the relevance criteria, and their methods are discussed. Disagreements between reliability researchers in the selection of methods are highlighted, as are the effects of method selection on the outcome of the trials. The review provides evidence that the meaningfulness of, and confidence in, results is directly affected by the methodologies employed by the researcher during the preparation, conduct and analysis of the reliability study. Furthermore, the review highlights the heterogeneity of methodologies employed by researchers measuring the reliability of incident coding techniques, reducing the ability to critically compare and appraise techniques being considered for the adoption of report coding and trend analysis by client organisations. It is recommended that future research focus on the standardisation of reliability research and measurement within the incident coding domain.
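
    A common agreement statistic in such reliability studies is Cohen's kappa, which corrects raw percent agreement for chance. A minimal sketch, with made-up incident categories and ratings, is:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders of the same reports."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    pa, pb = Counter(codes_a), Counter(codes_b)
    expected = sum(pa[c] * pb[c] for c in pa) / (n * n)  # chance agreement
    return (observed - expected) / (1.0 - expected)

# Hypothetical codings of six incident reports by two analysts
rater1 = ["slip", "lapse", "violation", "slip", "slip", "lapse"]
rater2 = ["slip", "lapse", "slip",      "slip", "slip", "violation"]
kappa = cohens_kappa(rater1, rater2)
```

Here observed agreement is 4/6 but kappa is only about 0.43, which illustrates why studies reporting raw agreement and studies reporting chance-corrected statistics are hard to compare directly.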

  16. Object-oriented Development of an All-electron Gaussian Basis DFT Code for Periodic Systems

    NASA Astrophysics Data System (ADS)

    Alford, John

    2005-03-01

    We report on the construction of an all-electron Gaussian-basis DFT code for systems periodic in one, two, and three dimensions. This is in part a reimplementation of algorithms in the serial code GTOFF, which has been successfully applied to the study of crystalline solids, surfaces, and ultra-thin films. The current development is being carried out in an object-oriented parallel framework using C++ and MPI. Some rather special aspects of this code are the use of density-fitting methodologies and the implementation of a generalized Ewald technique to do lattice summations of Coulomb integrals, which is typically more accurate than multipole methods. Important modules that have already been created will be described, for example a flexible input parser and storage class that can parse and store generically tagged data (e.g. XML), an easy-to-use processor communication mechanism, and the integrals package. Though C++ is generally inferior to F77 in terms of optimization, we show that careful redesigning has allowed us to make up the run-time performance difference in the new code. Timing comparisons and scalability features will be presented. The purpose of this reconstruction is to facilitate the inclusion of new physics. Our goal is to study orbital currents using modified Gaussian bases and external magnetic field effects in the weak and ultra-strong (~10^5 T) field regimes. This work is supported by NSF-ITR DMR-0218957.

  17. Consequence of Electron Mobility in Icy Grains on Solar System Objects

    NASA Astrophysics Data System (ADS)

    Gudipati, M. S.; Allamandola, L. J.; Cooper, J. F.; Sturner, S. J.; Johnson, R. E.

    2007-12-01

    Solar system ices have been shown to contain organic molecules, whether in the ice on Mars, comets such as Tempel-1 (from the Deep Impact mission), or on the surfaces of Europa, Ganymede, and Callisto. Sub-surface oceans containing ionic salts have been proposed to interpret the induced components of the local magnetic fields at these Galilean moons. The presence of liquid water is thought to be a requirement for potential astrobiological habitability, particularly on Europa where the putative subsurface ocean is likely closest to the outer surface. Recent laboratory studies have shown that radiation processing of water-rich ices containing aromatic organic impurities readily ionizes organic molecules embedded in an ice matrix. As a result, transient charge separation is produced more efficiently in ices containing organic impurities. This charge separation is partially stabilized by electron trapping. This could have important consequences, since the icy moons of the giant planets are embedded in both the magnetic field and trapped-particle radiation environments of the planetary magnetospheres. Internal discharges of accumulated free charges (i.e. ice lightning) could significantly affect the molecular chemistry of the irradiated outer layer beyond the direct effects of irradiation. Here we present new experimental results and theoretical modelling that deal with the mobility of electrons produced by photoionization of PAHs (polycyclic aromatic hydrocarbons) in an ice matrix. We find that a small portion of the electrons (about 5% of those originally generated) are weakly trapped in the impurity-containing ices and can be made mobile at temperatures between 50 K and 125 K. Current flow of these mobile electrons could affect the electrical conductivity of the irradiated surfaces and contribute to induced magnetic fields. This solid-state micro-ionospheric environment, comparable to a thin metallic conducting shell, may then need to be taken into account, along with the above

  18. Documentation generator for VHDL and MatLab source codes for photonic and electronic systems

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    UML, a complex system modeling and description technology, has recently been expanding its uses in the formalization and algorithmic treatment of systems such as multiprocessor photonic, optoelectronic, and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes a new concept of software dedicated to documenting source codes written in VHDL and MatLab. The work starts with an analysis of available documentation generators for both programming languages, with an emphasis on open-source solutions. The authors then present their own solutions, based on the Doxygen program, which is available under a free license with source code. Supporting tools for parser building, such as Bison and Flex, were used. The documentation generator application is used for the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts, which describe the following components of the documentation generator for photonic and electronic systems: concept, MatLab application, and VHDL application. This is part one, which describes the system concept. Part two describes the MatLab application; MatLab is used for description of the measured phenomena. Part three describes the VHDL application; VHDL is used for behavioral description of the optoelectronic system. The proposed approach and applications document large, complex software configurations for big systems.

  19. Detection and reconstruction of error control codes for engineered and biological regulatory systems.

    SciTech Connect

    May, Elebeoba Eni; Rintoul, Mark Daniel; Johnston, Anna Marie; Pryor, Richard J.; Hart, William Eugene; Watson, Jean-Paul

    2003-10-01

    A fundamental challenge for all communication systems, engineered or living, is the problem of achieving efficient, secure, and error-free communication over noisy channels. Information theoretic principles have been used to develop effective coding theory algorithms to successfully transmit information in engineering systems. Living systems also successfully transmit biological information through genetic processes such as replication, transcription, and translation, where the genome of an organism is the contents of the transmission. Decoding of received bit streams is fairly straightforward when the channel encoding algorithms are efficient and known. If the encoding scheme is unknown, or part of the data is missing or intercepted, how would one design a viable decoder for the received transmission? For such systems, blind reconstruction of the encoding/decoding system would be a vital step in recovering the original message. Communication engineers may not frequently encounter this situation, but for computational biologists and biotechnologists this is an immediate challenge. The goal of this work is to develop methods for detecting and reconstructing the encoder/decoder system for engineered and biological data. Building on Sandia's strengths in discrete mathematics, algorithms, and communication theory, we use linear programming and will use evolutionary computing techniques to construct efficient algorithms for modeling the coding system for minimally errored engineered data streams and genomic regulatory DNA and RNA sequences. The objective for the initial phase of this project is to construct solid parallels between biological literature and fundamental elements of communication theory. In this light, the milestones for FY2003 were focused on defining genetic channel characteristics and providing an initial approximation for key parameters, including coding rate, memory length, and minimum distance values. A secondary objective addressed the question of
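
    A textbook instance of such error control codes is the Hamming(7,4) block code, which corrects any single-bit error via a syndrome lookup. The sketch below is purely illustrative of the coding-theory machinery discussed above, not the reconstruction method of this work:

```python
def hamming74_encode(d):
    """Encode 4 data bits; positions 1, 2 and 4 (1-based) carry parity."""
    d1, d2, d3, d4 = d
    return [d1 ^ d2 ^ d4,        # p1 covers positions 1,3,5,7
            d1 ^ d3 ^ d4,        # p2 covers positions 2,3,6,7
            d1,
            d2 ^ d3 ^ d4,        # p3 covers positions 4,5,6,7
            d2, d3, d4]

def hamming74_decode(c):
    """Correct any single flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
received = list(sent)
received[4] ^= 1                      # noisy channel flips one bit
```

Blind reconstruction asks the harder inverse question: given only streams like `received`, infer the code's rate, memory length, and parity structure.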

  20. First experience with particle-in-cell plasma physics code on ARM-based HPC systems

    NASA Astrophysics Data System (ADS)

    Sáez, Xavier; Soba, Alejandro; Sánchez, Edilberto; Mantsinen, Mervi; Mateo, Sergi; Cela, José M.; Castejón, Francisco

    2015-09-01

    In this work, we explore the feasibility of porting a particle-in-cell code (EUTERPE) to an ARM multi-core platform from the Mont-Blanc project. The prototype used is based on a Samsung Exynos 5 system-on-chip with an integrated GPU. It is the first such prototype that could be used for high-performance computing (HPC), since it supports double precision and parallel programming languages.

  1. An Alternative Coding System Defining the Total and Severity of Wear

    DTIC Science & Technology

    1996-04-01

    of fluid which may be related to the severity of wear that has occurred in the sampled machine. A 20 to 30 ml plastic bottle provides an adequate...University of Wales Swansea Abstract: A ferrous debris monitor is described which is capable of measuring the concentration of ferrous wear debris ...suspended in a lubricant and the severity of wear associated with the particle size of this suspended debris. A coding system is proposed: PQ index (total

  2. Evaluation of CFETR as a Fusion Nuclear Science Facility using multiple system codes

    NASA Astrophysics Data System (ADS)

    Chan, V. S.; Costley, A. E.; Wan, B. N.; Garofalo, A. M.; Leuer, J. A.

    2015-02-01

    This paper presents the results of a multi-system-code benchmarking study of the recently published China Fusion Engineering Test Reactor (CFETR) pre-conceptual design (Wan et al 2014 IEEE Trans. Plasma Sci. 42 495). Two system codes, the General Atomics System Code (GASC) and the Tokamak Energy System Code (TESC), using different methodologies to arrive at CFETR performance parameters under the same CFETR constraints, show that the correlation between the physics performance and the fusion performance is consistent, and the computed parameters are in good agreement. Optimization of the first wall surface for tritium breeding and the minimization of the machine size are highly compatible. Variations of the plasma currents and profiles lead to changes in the required normalized physics performance; however, they do not significantly affect the optimized size of the machine. GASC and TESC have also been used to explore a lower aspect ratio, larger volume plasma taking advantage of the engineering flexibility in the CFETR design. Assuming the ITER steady-state scenario physics, the larger plasma together with a moderately higher BT and Ip can result in a high-gain Qfus ~ 12, Pfus ~ 1 GW machine approaching DEMO-like performance. It is concluded that the CFETR baseline mode can meet the minimum goal of the Fusion Nuclear Science Facility (FNSF) mission and that advanced physics will enable it to address comprehensively the outstanding critical technology gaps on the path to a demonstration reactor (DEMO). Before proceeding with CFETR construction, steady-state operation has to be demonstrated, further development is needed to solve the divertor heat load issue, and blankets have to be designed with a tritium breeding ratio (TBR) > 1 as a target.

  3. Mathematical framework for the analysis of dynamic stochastic systems with the RAVEN code

    SciTech Connect

    Rabiti, C.; Mandelli, D.; Alfonsi, A.; Cogliati, J.; Kinoshita, R.

    2013-07-01

    RAVEN (Reactor Analysis and Virtual control Environment) is a software code under development at Idaho National Laboratory aimed at performing probabilistic risk assessment and uncertainty quantification using RELAP-7, for which it acts also as a simulation controller. In this paper we will present the equations characterizing a dynamic stochastic system and we will then discuss the behavior of each stochastic term and how it is accounted for in the RAVEN software design. Moreover we will present preliminary results of the implementation. (authors)

  4. MATHEMATICAL FRAMEWORK FOR THE ANALYSIS OF DYNAMIC STOCHASTIC SYSTEMS WITH THE RAVEN CODE

    SciTech Connect

    C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-05-01

    RAVEN (Reactor Analysis and Virtual control Environment) is a software code under development at Idaho National Laboratory aimed at performing probabilistic risk assessment and uncertainty quantification using RELAP-7, for which it acts also as a simulation controller. In this paper we will present the equations characterizing a dynamic stochastic system and we will then discuss the behavior of each stochastic term and how it is accounted for in the RAVEN software design. Moreover we will present preliminary results of the implementation.

  5. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  6. The SWAN-SCALE code for the optimization of critical systems

    SciTech Connect

    Greenspan, E.; Karni, Y.; Regev, D.; Petrie, L.M.

    1999-07-01

    The SWAN optimization code was recently developed to identify the maximum value of k{sub eff} for a given mass of fissile material in combination with other specified materials. The optimization process is iterative; in each iteration, SWAN varies the zone-dependent concentration of the system constituents. This change is guided by the equal volume replacement effectiveness functions (EVREF) that SWAN generates using first-order perturbation theory. Previously, SWAN did not have provisions to account for the effect of composition changes on neutron cross-section resonance self-shielding; it used the cross sections corresponding to the initial system composition. In support of the US Department of Energy Nuclear Criticality Safety Program, the authors recently removed the limitation on resonance self-shielding by coupling SWAN with the SCALE code package. The purpose of this paper is to briefly describe the resulting SWAN-SCALE code and to illustrate the effect that neutron cross-section self-shielding could have on the maximum k{sub eff} and on the corresponding system composition.
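
    The effectiveness-guided reallocation loop described above can be caricatured with a two-zone toy problem in which finite differences stand in for the perturbation-theory EVREF; the surrogate k-effective function below is entirely hypothetical:

```python
def optimize_composition(k_eff, x, step=0.01, iters=200):
    """Repeatedly move material from the least effective zone to the most
    effective one, keeping the total inventory fixed."""
    h = 1e-6
    for _ in range(iters):
        # First-order effectiveness of adding material to each zone
        base = k_eff(x)
        g = [(k_eff([xj + h if j == i else xj for j, xj in enumerate(x)])
              - base) / h for i in range(len(x))]
        src = g.index(min(g))          # donor zone
        dst = g.index(max(g))          # recipient zone
        move = min(step, x[src])
        x[src] -= move
        x[dst] += move
    return x

# Toy surrogate: k_eff peaks at zone fractions (0.7, 0.3)
k = lambda x: 1.3 - (x[0] - 0.7) ** 2 - (x[1] - 0.3) ** 2
x_opt = optimize_composition(k, [0.5, 0.5])
```

In SWAN the effectiveness functions come from first-order perturbation theory rather than re-evaluating the system, which is what makes each iteration affordable.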

  7. Functional annotation of the vlinc class of non-coding RNAs using systems biology approach

    PubMed Central

    Laurent, Georges St.; Vyatkin, Yuri; Antonets, Denis; Ri, Maxim; Qi, Yao; Saik, Olga; Shtokalo, Dmitry; de Hoon, Michiel J.L.; Kawaji, Hideya; Itoh, Masayoshi; Lassmann, Timo; Arner, Erik; Forrest, Alistair R.R.; Nicolas, Estelle; McCaffrey, Timothy A.; Carninci, Piero; Hayashizaki, Yoshihide; Wahlestedt, Claes; Kapranov, Philipp

    2016-01-01

    Establishing the functionality of the non-coding transcripts encoded by the human genome is a coveted goal of modern genomics research. While this has commonly relied on the classical methods of forward genetics, integration of different genomics datasets in a global Systems Biology fashion presents a more productive avenue for achieving this very complex aim. Here we report the application of a Systems Biology-based approach to dissect the functionality of a newly identified vast class of very long intergenic non-coding (vlinc) RNAs. Using the highly quantitative FANTOM5 CAGE dataset, we show that these RNAs can be grouped into 1542 novel human genes based on analysis of insulators, which we show here indeed function as genomic barrier elements. We show that vlincRNA genes likely function in cis to activate nearby genes. This effect, while most pronounced in closely spaced vlincRNA-gene pairs, can be detected over relatively large genomic distances. Furthermore, we identified 101 vlincRNA genes likely involved in early embryogenesis based on the patterns of their expression and regulation. We also found another 109 such genes potentially involved in cellular functions that also occur at early stages of development, such as proliferation, migration and apoptosis. Overall, we show that Systems Biology-based methods have great promise for the functional annotation of non-coding RNAs. PMID:27001520

  8. A dynamic, dependent type system for nuclear fuel cycle code generation

    SciTech Connect

    Scopatz, A.

    2013-07-01

    The nuclear fuel cycle may be interpreted as a network or graph, allowing methods from formal graph theory to be used. Nodes are often idealized as nuclear fuel cycle facilities (reactors, enrichment cascades, deep geologic repositories). With the advent of modern object-oriented programming languages, and fuel cycle simulators implemented in these languages, it is natural to define a class hierarchy of facility types. Bright is a quasi-static simulator, meaning that the number of material passes through a facility is tracked rather than natural time. Bright is implemented as a C++ library that models many canonical components such as reactors, storage facilities, and more. Cyclus is a discrete-time simulator, meaning that natural time is tracked throughout the simulation. Therefore a robust, dependent type system was developed to enable inter-operability between Bright and Cyclus. This system is capable of representing any fuel cycle facility. Types declared in this system can then be used to automatically generate code which binds a facility implementation to a simulator front end. Facility model wrappers may be used either internally to a fuel cycle simulator or as a mechanism for inter-operating multiple simulators. While such a tool has many potential use cases, it has two main purposes: enabling easy code-to-code comparisons and the verification and validation of user input.
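
    The idea of generating binding code from a declared facility type can be sketched in a few lines; the field names and the `generate_wrapper` helper below are hypothetical illustrations, not part of Bright or Cyclus:

```python
# A facility "type" declared as a mapping of field name -> Python type
FACILITY_SPEC = {"name": str, "capacity": float, "batches_per_year": int}

def generate_wrapper(class_name, spec):
    """Emit and compile a wrapper class that coerces each field to its
    declared type at construction time."""
    lines = [f"class {class_name}:"]
    lines.append(f"    def __init__(self, {', '.join(spec)}):")
    for field, ftype in spec.items():
        lines.append(f"        self.{field} = {ftype.__name__}({field})")
    namespace = {}
    exec("\n".join(lines), namespace)   # compile the generated source
    return namespace[class_name]

Reactor = generate_wrapper("Reactor", FACILITY_SPEC)
r = Reactor(name="LWR-1", capacity="1000.0", batches_per_year=3)
```

A real system would also emit simulator-specific glue (time-stepping hooks, material-pass accounting), but the type declaration drives the generation in the same way.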

  9. A coded aperture compressive imaging array and its visual detection and tracking algorithms for surveillance systems.

    PubMed

    Chen, Jing; Wang, Yongtian; Wu, Hanxiao

    2012-10-29

    In this paper, we propose an application of a compressive imaging system to the problem of wide-area video surveillance. A parallel coded aperture compressive imaging system is proposed to reduce the required resolution of the coded mask and facilitate the storage of the projection matrix. Random Gaussian, Toeplitz and binary phase coded masks are utilized to obtain the compressive sensing images. The corresponding motion target detection and tracking algorithms, operating directly on the compressive sampling images, are developed. A mixture of Gaussian distributions is applied in the compressive image space to model the background image and perform foreground detection. For each motion target in the compressive sampling domain, a compressive feature dictionary spanned by target templates and noise templates is sparsely represented. An l1 optimization algorithm is used to solve for the sparse coefficients of the templates. Experimental results demonstrate that a low-dimensional compressed imaging representation is sufficient to determine spatial motion targets. Compared with the random Gaussian and Toeplitz phase masks, motion detection algorithms using a random binary phase mask yield better detection results; however, the random Gaussian and Toeplitz phase masks achieve higher-resolution reconstructed images. Our tracking algorithm achieves a real-time speed that is up to 10 times faster than that of the l1 tracker without any optimization.
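
    The per-pixel background modeling step can be illustrated with a single-Gaussian simplification of the mixture model described above; the 8-pixel "frames" and parameters below are synthetic:

```python
def update_background(mean, var, frame, alpha=0.05, k=2.5):
    """Flag pixels beyond k sigma of the running Gaussian background as
    foreground; blend background pixels into the model."""
    fg = []
    for i, x in enumerate(frame):
        d = x - mean[i]
        if d * d > k * k * var[i]:
            fg.append(1)                      # foreground: freeze the model
        else:
            fg.append(0)
            mean[i] += alpha * d              # background: adapt mean
            var[i] += alpha * (d * d - var[i])
    return fg

mean = [10.0] * 8                             # quiescent background level
var = [4.0] * 8
quiet = [10.0, 11.0, 9.0, 10.0, 10.0, 9.5, 10.5, 10.0]
moving = [10.0, 11.0, 9.0, 60.0, 61.0, 9.5, 10.5, 10.0]  # target at pixels 3-4
fg_quiet = update_background(mean, var, quiet)
fg_moving = update_background(mean, var, moving)
```

A full mixture-of-Gaussians model keeps several (mean, variance, weight) triples per pixel so that multimodal backgrounds, e.g. flickering illumination, are also absorbed into the model.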

  10. Manchester code telemetry system for well logging using quasi-parallel inductive-capacitive resonance.

    PubMed

    Xu, Lijun; Chen, Jianjun; Cao, Zhang; Liu, Xingbin; Hu, Jinhai

    2014-07-01

    In this paper, a quasi-parallel inductive-capacitive (LC) resonance method is proposed to improve the recovery of MIL-STD-1553 Manchester code, which has several frequency components, from the attenuated, distorted, and drifted signal used for data telemetry in well logging, and a corresponding telemetry system is developed. The required resonant frequency and quality factor are derived, and the quasi-parallel LC resonant circuit is established at the receiving end of the logging cable to suppress the low-pass filtering effect caused by the distributed capacitance of the cable and to provide a balanced pass for all three frequency components of the Manchester code. The performance of the method for various encoding frequencies and cable lengths at different bit-energy-to-noise-density ratios (Eb/No) has been evaluated in simulation. A 5 km single-core cable used in on-site well logging and various encoding frequencies were employed to verify the proposed telemetry system in the experiment. The results demonstrate that the telemetry system is feasible and effective in improving code recovery in terms of anti-attenuation, anti-distortion, and anti-drift performance, decreasing the bit error rate, and greatly increasing the reachable transmission rate and distance.
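
    Manchester (bi-phase) coding itself, whose guaranteed mid-bit transitions produce the several frequency components mentioned above, can be sketched as follows; the high/low convention used here is one common mapping of each bit to a two-chip pattern:

```python
def manchester_encode(bits):
    """Manchester II: a 1 is high-then-low, a 0 is low-then-high, so every
    bit cell contains a mid-bit transition."""
    out = []
    for b in bits:
        out += [1, 0] if b else [0, 1]
    return out

def manchester_decode(chips):
    """Recover bits from chip pairs; a missing transition is a code error."""
    bits = []
    for i in range(0, len(chips), 2):
        pair = (chips[i], chips[i + 1])
        if pair == (1, 0):
            bits.append(1)
        elif pair == (0, 1):
            bits.append(0)
        else:
            raise ValueError("missing mid-bit transition at chip %d" % i)
    return bits

msg = [1, 0, 1, 1, 0]
chips = manchester_encode(msg)
```

Runs of identical bits toggle the line at the bit rate while alternating bits toggle it at half that rate, which is why the received spectrum contains multiple components that the resonant circuit must pass with balanced gain.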

  11. Extending the Capture Volume of an Iris Recognition System Using Wavefront Coding and Super-Resolution.

    PubMed

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao; Chang, Chin-Chen

    2016-12-01

    Iris recognition has gained increasing popularity over the last few decades; however, the stand-off distance in a conventional iris recognition system is too short, which limits its application. In this paper, we propose a novel hardware-software hybrid method to increase the stand-off distance in an iris recognition system. When designing the system hardware, we use an optimized wavefront coding technique to extend the depth of field. To compensate for the blurring of the image caused by wavefront coding, on the software side the proposed system uses a local patch-based super-resolution method to restore the blurred image to its clear version. The collaborative effect of the new hardware design and software post-processing showed great potential in our experiment. The experimental results showed that such improvement cannot be achieved by a hardware- or software-only design. The proposed system can increase the capture volume of a conventional iris recognition system by three times and maintain the system's high recognition rate.

  12. Optimization technique of wavefront coding system based on ZEMAX externally compiled programs

    NASA Astrophysics Data System (ADS)

    Han, Libo; Dong, Liquan; Liu, Ming; Zhao, Yuejin; Liu, Xiaohua

    2016-10-01

    When the wavefront coding technique is applied to an infrared imaging system as a means of athermalization, the design of the phase plate is the key to system performance. This paper applies ZEMAX's externally compiled programs to the optimization of the phase mask within the normal optical design process: the evaluation function of the wavefront coding system is defined based on the consistency of the modulation transfer function (MTF), and the speed of optimization is improved by introducing mathematical software. The user writes an external program that computes the evaluation function, exploiting the computing power of the mathematical software to find the optimal parameters of the phase mask; convergence is accelerated with a genetic algorithm (GA), and a dynamic data exchange (DDE) interface between ZEMAX and the mathematical software provides high-speed data exchange. The optimization of a rotationally symmetric phase mask and a cubic phase mask has been completed by this method: the depth of focus increases nearly 3 times with the rotationally symmetric phase mask, while the system with the cubic phase mask achieves a roughly 10-fold increase; the variation of the MTF decreases markedly; and the optimized systems operate over a temperature range of -40° to 60°. Results show that, thanks to the externally compiled functions and DDE, this optimization method makes it more convenient to define unconventional optimization goals and quicker to optimize optical systems with special properties, which is of particular significance for the optimization of unconventional optical systems.
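
    The GA-driven parameter search can be caricatured with a minimal real-coded genetic algorithm; the one-parameter merit function below is a toy stand-in for the MTF-consistency evaluation function, not anything computed by ZEMAX:

```python
import random

random.seed(42)

def ga_minimize(merit, lo, hi, pop_size=30, gens=60):
    """Tiny real-coded GA: keep the best third, breed the rest by blend
    crossover plus Gaussian mutation, clamp to [lo, hi]."""
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=merit)[: pop_size // 3]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = 0.5 * (a + b) + random.gauss(0.0, 0.1 * (hi - lo))
            children.append(min(hi, max(lo, child)))
        pop = elite + children       # elitism: the best are never lost
    return min(pop, key=merit)

# Toy merit: penalty minimized near a mask strength of 1.95
merit = lambda alpha: (alpha - 2.0) ** 2 + 0.1 * abs(alpha)
best = ga_minimize(merit, lo=0.0, hi=5.0)
```

In the paper's setup the merit evaluation would instead be delegated over DDE to ZEMAX-driven MTF calculations, with the GA supplying candidate phase-mask coefficients.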

  13. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems ...important, this report illustrates the performance of coded LR aided detectors.

  14. The effects of receiver tracking phase error on the performance of the concatenated Reed-Solomon/Viterbi channel coding system

    NASA Technical Reports Server (NTRS)

    Liu, K. Y.

    1981-01-01

    Analytical and experimental results are presented on the effects of receiver tracking phase error, caused by weak signal conditions on either the uplink or the downlink or both, on the performance of the concatenated Reed-Solomon (RS)/Viterbi channel coding system. The test results were obtained under an emulated S-band uplink and X-band downlink, two-way space communication channel in the telecommunication development laboratory of JPL, with data rates ranging from 4 kHz to 20 kHz. It is shown that, with ideal interleaving, the concatenated RS/Viterbi coding system is capable of yielding large coding gains at very low bit error probabilities over the Viterbi-decoded convolutional-only coding system. Results on the effects of receiver tracking phase errors on the performance of the concatenated coding system with antenna array combining are included.

  15. Code Comparison Study Fosters Confidence in the Numerical Simulation of Enhanced Geothermal Systems

    SciTech Connect

    White, Mark D.; Phillips, Benjamin R.

    2015-01-26

    Numerical simulation has become a standard analytical tool for scientists and engineers to evaluate the potential and performance of enhanced geothermal systems. A variety of numerical simulators developed by industry, universities, and national laboratories are currently available and being applied to better understand enhanced geothermal systems at the field scale. To yield credible predictions and be of value to site operators, numerical simulators must be able to accurately represent the complex coupled processes induced by producing geothermal systems, such as fracture aperture changes due to thermal stimulation, fracture shear displacement with fluid injection, rate of thermal depletion of reservoir rocks, and permeability alteration with mineral precipitation or dissolution. A suite of numerical simulators was exercised on a series of test problems that considered coupled thermal, hydraulic, geomechanical, and geochemical (THMC) processes. Problems were selected and designed to isolate selected coupled processes, to be executable on workstation-class computers, and to have simple but illustrative metrics for result comparisons. This paper summarizes the initial suite of seven benchmark problems, describes the code comparison activities, provides example results for the problems, and documents the capabilities of currently available numerical simulation codes to represent coupled processes that occur during the production of geothermal resources. Code comparisons described in this paper use the ISO (International Organization for Standardization) standard ISO-13538 for proficiency testing of numerical simulators. This approach was adopted for a recent code comparison study within the radiation-transfer modeling field of atmospheric sciences, which was focused on canopy reflectance models. This standard specifies statistical methods for analyzing laboratory data from proficiency testing schemes to demonstrate that the measurement results do not exhibit evidence of an

  16. EASY-II Renaissance: n, p, d, α, γ-induced Inventory Code System

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.

    2014-04-01

    The European Activation SYstem has been re-engineered and re-written in modern programming languages so as to answer today's and tomorrow's needs in terms of activation, transmutation, depletion, decay and processing of radioactive materials. The new FISPACT-II inventory code development project has allowed us to embed many more features in terms of energy range: up to GeV; incident particles: alpha, gamma, proton, deuteron and neutron; and neutron physics: self-shielding effects, temperature dependence and covariance, so as to cover all anticipated application needs: nuclear fission and fusion, accelerator physics, isotope production, stockpile and fuel cycle stewardship, materials characterization and life, and storage cycle management. In parallel, the maturity of modern, truly general purpose libraries encompassing thousands of target isotopes such as TENDL-2012, the evolution of the ENDF-6 format and the capabilities of the latest generation of processing codes PREPRO, NJOY and CALENDF have allowed the activation code to be fed with more robust, complete and appropriate data: cross sections with covariance, probability tables in the resonance ranges, kerma, dpa, gas and radionuclide production and 24 decay types. All such data for the five most important incident particles (n, p, d, α, γ), are placed in evaluated data files up to an incident energy of 200 MeV. The resulting code system, EASY-II is designed as a functional replacement for the previous European Activation System, EASY-2010. It includes many new features and enhancements, but also benefits already from the feedback from extensive validation and verification activities performed with its predecessor.

  17. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  18. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    SciTech Connect

    1997-03-01

    This manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969, when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the sections of the manual dealing with three of the functional modules in the code: MORSE-SGC for the SCALE system, HEATING 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  19. What Changed in Article 690-Solar Photovoltaic Systems- of the 1999 National Electrical Code?

    SciTech Connect

    Bower, W.; Wiles, J.

    1999-01-12

    Article 690, Solar Photovoltaic Systems, has been in the National Electrical Code (NEC) since 1984. An NFPA-appointed Task Group for Article 690 proposed changes to Article 690 for both the 1996 and 1999 codes. The Task Group, supported by more than 50 professionals from throughout the photovoltaic (PV) industry, met seven times during the 1999 code cycle to integrate the needs of the industry with the needs of electrical inspectors and end users to ensure the safety of PV systems. The Task Group proposed 57 changes to Article 690, and all the changes were accepted in the review process. The performance and cost of PV installations were always a consideration as these changes were formulated, but safety was the number-one priority. All of the proposals were well substantiated and coordinated throughout the PV industry and with representatives of Underwriters Laboratories, Inc. (UL). The most significant changes made in Article 690 for the 1999 NEC, along with some of the rationale, are discussed in the remainder of this article.

  20. LAHET Code System/CINDER'90 validation calculations and comparison with experimental data

    SciTech Connect

    Brun, T.O.; Beard, C.A.; Daemen, L.L.; Pitcher, E.J.; Russell, G.J.; Wilson, W.B.

    1994-02-01

    The authors use the Los Alamos LAHET Code System (LCS)/CINDER'90 suite of codes in a variety of spallation neutron source applications to predict neutronic performance and as a basis for making engineering decisions. They have broadened their usage of the suite from designing LANSCE and the next generation of spallation neutron sources for materials science and nuclear physics research to designing a target system for Accelerator Production of Tritium and Accelerator Transmutation of Waste. While designing, they continue to validate the LCS/CINDER'90 code suite against experimental data whenever possible. In the following, they discuss comparisons between calculations and measurements for: integral neutron yields from a bare lead target; fertile-to-fissile conversion yields for thorium and depleted uranium targets; dose rates from the LANSCE tungsten target; energy deposition in a variety of light and heavy materials; and neutron spectra from LANSCE water and liquid hydrogen moderators. The accuracy with which the calculations reproduce experimental results is an indication of their confidence in the validity of their design calculations.

  1. FPGA-based LDPC-coded APSK for optical communication systems.

    PubMed

    Zou, Ding; Lin, Changyu; Djordjevic, Ivan B

    2017-02-20

    In this paper, with the aid of mutual information and generalized mutual information (GMI) capacity analyses, it is shown that geometrically shaped APSK that mimics an optimal Gaussian distribution with equiprobable signaling, together with the corresponding Gray-mapping rules, can approach the Shannon limit more closely than conventional quadrature amplitude modulation (QAM) over a certain range of FEC overheads, for both 16-APSK and 64-APSK. The field-programmable gate array (FPGA) based LDPC-coded APSK emulation is conducted on block interleaver-based and bit interleaver-based systems; the results verify a significant improvement in hardware-efficient bit interleaver-based systems. In bit interleaver-based emulation, the LDPC-coded 64-APSK outperforms 64-QAM, in terms of symbol signal-to-noise ratio (SNR), by 0.1 dB, 0.2 dB, and 0.3 dB at spectral efficiencies of 4.8, 4.5, and 4.2 b/s/Hz, respectively. It is found by emulation that LDPC-coded 64-APSK for spectral efficiencies of 4.8, 4.5, and 4.2 b/s/Hz is 1.6 dB, 1.7 dB, and 2.2 dB away from the GMI capacity.

  2. The EGS4 Code System: Solution of Gamma-ray and Electron Transport Problems

    DOE R&D Accomplishments Database

    Nelson, W. R.; Namito, Yoshihito

    1990-03-01

    In this paper we present an overview of the EGS4 Code System -- a general purpose package for the Monte Carlo simulation of the transport of electrons and photons. During the last 10-15 years EGS has been widely used to design accelerators and detectors for high-energy physics. More recently the code has been found to be of tremendous use in medical radiation physics and dosimetry. The problem-solving capabilities of EGS4 will be demonstrated by means of a variety of practical examples. To facilitate this review, we will take advantage of a new add-on package, called SHOWGRAF, to display particle trajectories in complicated geometries. These are shown as 2-D laser pictures in the written paper and as photographic slides of a 3-D high-resolution color monitor during the oral presentation. 11 refs., 15 figs.

  3. Coding tradeoffs for improved performance of FH/MFSK systems in partial band noise

    NASA Technical Reports Server (NTRS)

    Levitt, B. K.; Omura, J. K.

    1981-01-01

    Partial band noise jamming can severely degrade the performance of frequency-hopped, M-ary frequency-shift keyed communication systems. This paper illustrates the tradeoffs between channel coding, diversity, and block orthogonal (MFSK) modulation as a means of overcoming the advantage of worst case, non-adaptive (as opposed to repeat-back) partial band jamming. For ease of computation, the analysis relies on exponentially tight error bounds, and is based on a noncoherent detection metric that requires jamming state information for each hop. A more robust, less complex receiver structure which eliminates the jamming knowledge requirement is shown to degrade performance by less than 2-1/2 dB. The coding tradeoffs discussed in this report are exemplified in the design of a hypothetical 32 kb/s military frequency-hopped communication link.

  4. GAS-PASS/H : a simulation code for gas reactor plant systems.

    SciTech Connect

    Vilim, R. B.; Mertyurek, U.; Cahalan, J. E.; Nuclear Engineering Division; Texas A&M Univ.

    2004-01-01

    A simulation code for gas reactor plant systems has been developed. The code is intended for use in safety analysis and control studies for Generation-IV reactor concepts. We developed it anticipating an immediate application to direct cycle gas reactors. By programming in flexibility as to how components can be configured, we believe the code can be adapted to indirect-cycle gas reactors relatively easily. The use of modular components and a general purpose equation solver allows for this. Several capabilities are included for investigating issues associated with direct cycle gas reactors. Issues include the safety characteristics of single shaft plants during coastdown and transition to shutdown heat removal following unprotected accidents, including depressurization, and the need for safety grade control systems. Basic components provided include turbine, compressor, recuperator, cooler, bypass valve, leak, accumulator, containment, and flow junction. The code permits a more rapid assessment of design concepts than is achievable using RELAP, which requires detail beyond what is necessary at the design scoping stage; this increases the time to assemble an input deck and tends to make the code slower running. The core neutronics and decay heat models of GAS-PASS/H are taken from the liquid-metal version of MINISAS. The ex-reactor component models were developed from first principles. The network-based method for assembling component models into a system uses a general nonlinear solver to find the solution of the steady-state equations. The transient time-differenced equations are solved implicitly using the same solver. A direct cycle gas reactor is modeled and a loss of generator load transient is simulated for this reactor. While normally the reactor is scrammed, the plant safety case will require analysis of this event with failure of various safety systems. Therefore, we simulated the loss of load transient with a combined failure of the
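
    The network-based assembly described above hinges on a general-purpose nonlinear solver applied to the steady-state (and implicitly time-differenced transient) equations. The sketch below shows a plain Newton iteration with a finite-difference Jacobian; the function names, tolerances, and step sizes are illustrative assumptions, not the actual GAS-PASS/H solver:

```python
import numpy as np

def newton(residual, x0, tol=1e-10, max_iter=50):
    """Plain (undamped) Newton iteration with a forward-difference Jacobian,
    in the spirit of a general-purpose network equation solver (sketch only)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        n = len(x)
        J = np.empty((n, n))
        h = 1e-7
        for j in range(n):                      # forward-difference Jacobian column
            xp = x.copy()
            xp[j] += h
            J[:, j] = (residual(xp) - r) / h
        x = x - np.linalg.solve(J, r)           # Newton update
    return x
```

    The same solver can be reused for the transient equations by making `residual` the implicit time-discretized system at each step, which mirrors the reuse described in the abstract.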

  5. Development of a High Fidelity System Analysis Code for Generation IV Reactors

    SciTech Connect

    Hongbin Zhang; Vincent Mousseau; Haihua Zhao

    2008-06-01

    Traditional nuclear reactor system analysis codes such as RELAP and TRAC employ an operator-split methodology. In this approach, each of the physics (fluid flow, heat conduction, and neutron diffusion) is solved separately and the coupling terms are treated explicitly. This approach limits accuracy (first order in time at best) and makes the codes slow to run, since the explicit coupling imposes stability restrictions on the time step size. These codes have been extensively tested and validated for the existing LWRs. However, for GEN IV nuclear reactor designs, which tend to have long-lasting transients resulting from passive safety systems, the performance is questionable and modern high fidelity simulation tools will be required. The requirement for accurate predictability is the motivation for a large scale overhaul of all of the models and assumptions in transient nuclear reactor safety simulation software. At INL we have launched an effort with the long term goal of developing a high fidelity system analysis code that employs modern physical models, numerical methods, and computer science for transient safety analysis of GEN IV nuclear reactors. Modern parallel solution algorithms will be employed by utilizing the nonlinear solution software package PETSc developed by Argonne National Laboratory. The physical models to be developed will have physically realistic length scales and time scales. The solution algorithm will be based on physics-based preconditioned Jacobian-free Newton-Krylov methods. In this approach all of the physical models are solved implicitly and simultaneously in a single nonlinear system. This includes the coolant flow, nonlinear heat conduction, neutron kinetics, thermal radiation, etc. Including modern physical models and accurate space and time discretizations will allow the simulation capability to be second order accurate in space and in time. This paper presents the current status of the development efforts as

  6. Equivalency Evaluation between IAEA Safety Guidelines and Codes and Standards for Computer-Based Systems

    SciTech Connect

    Ji, S.H.; Kim, DAI. I.; Park, H.S.; Kim, B.R.; Kang, Y.D.; Oh, S.H.

    2002-07-01

    Computer-based systems are used in safety-critical as well as safety-related applications, such as reactor protection or actuation of safety features and certain functions of the process control and monitoring system. In this context, the IAEA released the safety standard series NS-G-1.11 (hereafter: IAEA Guideline), 'Software for Computer Based Systems Important to Safety in NPPs', in 2000 as a guideline for evaluating the software of digitalized computer-based systems applied in the instrumentation and control systems of nuclear power plants. This paper discusses the equivalency between the IAEA Guideline and the codes and standards adopted by the Korea Institute of Nuclear Safety (hereafter: KINS Guideline) as a regulatory basis. (authors)

  7. Image sensor system with bio-inspired efficient coding and adaptation.

    PubMed

    Okuno, Hirotsugu; Yagi, Tetsuya

    2012-08-01

    We designed and implemented an image sensor system equipped with three bio-inspired coding and adaptation strategies: logarithmic transform, local average subtraction, and feedback gain control. The system comprises a field-programmable gate array (FPGA), a resistive network, and active pixel sensors (APS), whose light intensity-voltage characteristics are controllable. The system employs multiple time-varying reset voltage signals for APS in order to realize multiple logarithmic intensity-voltage characteristics, which are controlled so that the entropy of the output image is maximized. The system also employs local average subtraction and gain control in order to obtain images with an appropriate contrast. The local average is calculated by the resistive network instantaneously. The designed system was successfully used to obtain appropriate images of objects that were subjected to large changes in illumination.
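
    The three coding strategies named in the abstract can be mimicked in software. The sketch below is a hypothetical software analogue (the `encode` helper and the 3x3 averaging window are invented for illustration); the real system implements these stages in APS reset-voltage control, a resistive network, and FPGA feedback:

```python
import numpy as np

def encode(image, gain=1.0):
    """Bio-inspired front-end sketch: logarithmic transform compresses dynamic
    range, local average subtraction enhances local contrast, and a gain
    factor scales the output (a software stand-in for the hardware stages)."""
    log_img = np.log1p(image)                       # logarithmic compression
    pad = np.pad(log_img, 1, mode="edge")
    # 3x3 local average via shifted sums (stand-in for the resistive network)
    h, w = log_img.shape
    local = sum(pad[i:i + h, j:j + w]
                for i in range(3) for j in range(3)) / 9.0
    return gain * (log_img - local)                 # local average subtraction
```

    On a uniformly illuminated scene the local average equals the log-transformed signal, so the output is zero everywhere; only local contrast survives, which is the point of the subtraction stage.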

  8. Implementation of a tree-code for numerical simulations of stellar systems

    NASA Astrophysics Data System (ADS)

    Marinho, Eraldo Pereira

    1991-10-01

    An implementation of a tree code for the force calculation in gravitational N-body system simulations is presented. The technique consists of virtualizing the entire system in a tree data structure, which reduces the computational effort to Θ(N log N) instead of the Θ(N²) typical of direct summation. The adopted time integrator is the simple leap-frog with second-order accuracy. A brief discussion about the truncation-error effects on the morphology of the system shows them to be essentially negligible. However, these errors do propagate in a Markovian way if a potential-adaptive time step is used in order to maintain the expected truncation error approximately constant over the entire system. The tests show that, even with totally arbitrary distributions, the total computation time obeys Θ(N log N). As an application of the code, we evolved an initially cold and homogeneous sphere of point masses to simulate a primordial process of galaxy formation. The evolution of the global entropy of the system suggests that a quasi-equilibrium configuration is achieved after approximately 2 × 10⁹ years. It is shown that the final configuration displays a close resemblance to well-observed giant elliptical galaxies, in both kinematical and luminosity distribution properties. A discussion is given on the evolution of the important dynamic quantities characterizing the model. During all the computations, the energy is conserved to better than 0.1 percent.
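
    The Θ(N²) direct-summation force loop that a tree code replaces, together with the second-order leap-frog integrator mentioned above, can be sketched as follows (a minimal illustration in units with G = 1 and a softening parameter; this is not the paper's implementation):

```python
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct-summation gravitational accelerations (G = 1): Theta(N^2) pair
    interactions. A tree code replaces this loop with a Theta(N log N)
    traversal of multipole nodes in the tree."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                       # vectors to every other body
        r2 = (d * d).sum(axis=1) + eps ** 2    # softened squared distances
        r2[i] = np.inf                         # exclude self-interaction
        acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leap-frog step (second-order accurate in dt)."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel
```

    Because the softened pair forces are antisymmetric, total momentum is conserved to roundoff, one of the invariants used to check such integrators alongside the energy conservation quoted in the abstract.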

  9. The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics

    NASA Astrophysics Data System (ADS)

    Ganander, Hans

    2003-10-01

    For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between the aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite-element codes, normally allow such modifications and improvements of existing wind turbine models relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. The use of an automatic code-generating system is an alternative that addresses both key issues: the code itself and design optimization. This technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and the interest in design optimization is growing.

  10. Development of a framework and coding system for modifications and adaptations of evidence-based interventions

    PubMed Central

    2013-01-01

    Background Evidence-based interventions are frequently modified or adapted during the implementation process. Changes may be made to protocols to meet the needs of the target population or address differences between the context in which the intervention was originally designed and the one into which it is implemented [Addict Behav 2011, 36(6):630–635]. However, whether modification compromises or enhances the desired benefits of the intervention is not well understood. A challenge to understanding the impact of specific types of modifications is a lack of attention to characterizing the different types of changes that may occur. A system for classifying the types of modifications that are made when interventions and programs are implemented can facilitate efforts to understand the nature of modifications that are made in particular contexts as well as the impact of these modifications on outcomes of interest. Methods We developed a system for classifying modifications made to interventions and programs across a variety of fields and settings. We then coded 258 modifications identified in 32 published articles that described interventions implemented in routine care or community settings. Results We identified modifications made to the content of interventions, as well as to the context in which interventions are delivered. We identified 12 different types of content modifications, and our coding scheme also included ratings for the level at which these modifications were made (ranging from the individual patient level up to a hospital network or community). We identified five types of contextual modifications (changes to the format, setting, or patient population that do not in and of themselves alter the actual content of the intervention). We also developed codes to indicate who made the modifications and identified a smaller subset of modifications made to the ways that training or evaluations occur when evidence-based interventions are implemented. Rater

  11. A Pictorial History of the Code 717 Unmanned Systems Group: Air, Land, and Sea. Volume 1: 1970-1999

    DTIC Science & Technology

    2016-04-28

    TECHNICAL DOCUMENT 3289, April 2016. A Pictorial History of the Code 717 Unmanned Systems Group: Air, Land, and Sea, Volume 1: 1970–1999. H. R. Everett. Approved for public release.

  12. A Bioenergetics Approach to Understanding the Population Consequences of Disturbance: Elephant Seals as a Model System.

    PubMed

    Costa, Daniel P; Schwarz, Lisa; Robinson, Patrick; Schick, Robert S; Morris, Patricia A; Condit, Richard; Crocker, Daniel E; Kilpatrick, A Marm

    2016-01-01

    Using long-term empirical data, we developed a complete population consequences of acoustic disturbance (PCAD) model and application for northern elephant seals. We assumed that the animals would not successfully forage while in a 100-km-diameter disturbance region within their foraging and transit paths. The decrease in lipid gain due to exposure was then translated to changes in birth rate and pup survival. Given their large foraging range, elephant seals were resilient to such a disturbance, showing no population-level effects. However, similar track analysis showed that given their more coastal nature, California sea lions were within a 25-km-diameter region of disturbance more often.

  13. Biallelic insertion of a transcriptional terminator via the CRISPR/Cas9 system efficiently silences expression of protein-coding and non-coding RNA genes.

    PubMed

    Liu, Yangyang; Han, Xiao; Yuan, Junting; Geng, Tuoyu; Chen, Shihao; Hu, Xuming; Cui, Isabelle H; Cui, Hengmi

    2017-04-07

    The type II bacterial CRISPR/Cas9 system is a simple, convenient, and powerful tool for targeted gene editing. Here, we describe a CRISPR/Cas9-based approach for inserting a poly(A) transcriptional terminator into both alleles of a targeted gene to silence protein-coding and non-protein-coding genes, which often play key roles in gene regulation but are difficult to silence via insertion or deletion of short DNA fragments. The integration of 225 bp of bovine growth hormone poly(A) signals into either the first intron or the first exon or behind the promoter of target genes caused efficient termination of expression of PPP1R12C, NSUN2 (protein-coding genes), and MALAT1 (non-protein-coding gene). Both NeoR and PuroR were used as markers in the selection of clonal cell lines with biallelic integration of a poly(A) signal. Genotyping analysis indicated that the cell lines displayed the desired biallelic silencing after a brief selection period. These combined results indicate that this CRISPR/Cas9-based approach offers an easy, convenient, and efficient novel technique for gene silencing in cell lines, especially for those in which gene integration is difficult because of a low efficiency of homology-directed repair.

  14. Depth map coding using residual segmentation for 3D video system

    NASA Astrophysics Data System (ADS)

    Lee, Cheon; Ho, Yo-Sung

    2013-06-01

    Advanced 3D video systems employ multi-view video-plus-depth data to support free-viewpoint navigation and comfortable 3D viewing; thus, efficient depth map coding becomes an important issue. Unlike the color image, the depth map has the property that depth values in the inner part of an object are monotonic, but those at object boundaries change abruptly. Therefore, residual data generated by prediction errors around object boundaries consume many bits in depth map coding. Representing them with segment data can be better than the use of the conventional transformation around the boundary regions. In this paper, we propose an efficient depth map coding method using residual segmentation instead of transformation. The proposed residual segmentation divides residual data into two regions with a segment map and two mean values. If the encoder selects the proposed method in terms of rates, two quantized mean values and an index of the segment map are transmitted. Simulation results show significant gains of up to 10 dB compared to state-of-the-art coders such as JPEG2000 and H.264/AVC.
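
    The core idea, approximating a residual block by a binary segment map plus two mean values rather than transform coefficients, can be sketched as follows. Thresholding at the block mean is a simplified stand-in; the paper's actual segmentation rule and its rate-based mode selection are not reproduced here:

```python
import numpy as np

def segment_residual(block):
    """Approximate a residual block by a binary segment map and two means.
    The encoder would transmit the map index and the two quantized means;
    the decoder reconstructs each pixel from its region's mean."""
    seg = block >= block.mean()                    # binary segment map
    m1 = block[seg].mean() if seg.any() else 0.0   # mean of region 1
    m0 = block[~seg].mean() if (~seg).any() else 0.0  # mean of region 0
    recon = np.where(seg, m1, m0)                  # decoder-side reconstruction
    return seg, (m0, m1), recon
```

    For a residual with a sharp two-level discontinuity, as occurs at depth edges, this representation is lossless, which is exactly the case where transform coding performs worst.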

  15. Chirp-Coded Ultraharmonic Imaging with a Modified Clinical Intravascular Ultrasound System.

    PubMed

    Shekhar, Himanshu; Huntzicker, Steven; Awuor, Ivy; Doyley, Marvin M

    2016-11-01

    Imaging plaque microvasculature with contrast-enhanced intravascular ultrasound (IVUS) could help clinicians evaluate atherosclerosis and guide therapeutic interventions. In this study, we evaluated the performance of chirp-coded ultraharmonic imaging using a modified IVUS system (iLab™, Boston Scientific/Scimed) equipped with clinically available peripheral and coronary imaging catheters. Flow phantoms perfused with a phospholipid-encapsulated contrast agent were visualized using ultraharmonic imaging at 12 MHz and 30 MHz transmit frequencies. Flow channels with diameters as small as 0.8 mm and 0.5 mm were visualized using the peripheral and coronary imaging catheters. Radio-frequency signals were acquired at standard IVUS rotation speed, which resulted in a frame rate of 30 frames/s. Contrast-to-tissue ratios up to 17.9 ± 1.11 dB and 10.7 ± 2.85 dB were attained by chirp-coded ultraharmonic imaging at 12 MHz and 30 MHz transmit frequencies, respectively. These results demonstrate the feasibility of performing ultraharmonic imaging at standard frame rates with clinically available IVUS catheters using chirp-coded excitation.

  16. A user's manual for MASH 1.0: A Monte Carlo Adjoint Shielding Code System

    SciTech Connect

    Johnson, J.O.

    1992-03-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. MASH is the successor to the Vehicle Code System (VCS) initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem (input data and selected output edits) for each code.
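
    Numerically, the folding step that combines the coupling-surface fluence with the adjoint dose importance is a weighted sum over surface elements and energy groups. The sketch below uses a hypothetical discretization (array shapes and weights are invented for illustration, not MASH's data layout):

```python
import numpy as np

def fold_dose(fluence, importance, weights):
    """Fold the forward fluence with the adjoint dose importance over the
    coupling surface: R = sum_s sum_g w_s * phi[s, g] * I[s, g].
    Arrays are indexed by (surface element s, energy group g); 'weights'
    holds the surface-element areas/quadrature weights."""
    fluence = np.asarray(fluence)
    importance = np.asarray(importance)
    weights = np.asarray(weights)
    return float(np.sum(weights[:, None] * fluence * importance))
```

    Repeating this sum for fluences computed at different source distances or vehicle orientations gives the dose response as a function of those variables, as the manual describes.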

  17. The SWAN/NPSOL code system for multivariable multiconstraint shield optimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E.

    1995-12-31

    SWAN is a useful code for the optimization of source-driven systems, i.e., systems for which the neutron and photon distribution is the solution of the inhomogeneous transport equation. Over the years, SWAN has been applied to the optimization of a variety of nuclear systems, such as minimizing the thickness of fusion reactor blankets and shields, the weight of space reactor shields, the cost of an ICF target chamber shield, and the background radiation for explosive detection systems, and maximizing the beam quality for boron neutron capture therapy applications. However, SWAN's optimization module could handle only a single constraint and was inefficient in handling problems with many variables. The purpose of this work is to upgrade SWAN's optimization capability.

  18. A real-time chirp-coded imaging system with tissue attenuation compensation.

    PubMed

    Ramalli, A; Guidi, F; Boni, E; Tortoli, P

    2015-07-01

    In ultrasound imaging, pulse compression methods based on the transmission (TX) of long coded pulses and matched receive filtering can be used to improve the penetration depth while preserving the axial resolution (coded imaging). The performance of most of these methods is affected by the frequency-dependent attenuation of tissue, which causes mismatch of the receive filter. This, together with the additional computational load involved, has probably so far limited the implementation of pulse compression methods in real-time imaging systems. In this paper, a real-time low-computational-cost coded-imaging system operating on the beamformed and demodulated data received by a linear array probe is presented. The system has been implemented by extending the firmware and the software of the ULA-OP research platform. In particular, pulse compression is performed by exploiting the computational resources of a single digital signal processor. Each image line is produced in less than 20 μs, so that, e.g., 192-line frames can be generated at up to 200 fps. Although the system may work with a large class of codes, this paper has been focused on the test of linear frequency-modulated chirps. The new system has been used to experimentally investigate the effects of tissue attenuation so that the design of the receive compression filter can be guided accordingly. Tests made with different chirp signals confirm that, although the attainable compression gain in attenuating media is lower than the theoretical value expected for a given TX time-bandwidth product (BT), good SNR gains can be obtained. For example, by using a chirp signal having BT=19, a 13 dB compression gain has been measured. By adapting the frequency band of the receiver to the band of the received echo, the signal-to-noise ratio and the penetration depth have been further increased, as shown by real-time tests conducted on phantoms and in vivo. In particular, a 2.7 dB SNR increase has been measured through a
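
    The matched-filter pulse compression underlying such coded imaging can be sketched as below. The sampling rate, band, and duration are toy values, and no attenuation-dependent filter mismatch is modeled; the real system operates on beamformed, demodulated data on a DSP:

```python
import numpy as np

def lfm_chirp(f0, f1, T, fs):
    """Linear frequency-modulated (LFM) chirp sweeping f0 -> f1 over duration T,
    sampled at rate fs."""
    t = np.arange(int(T * fs)) / fs
    return np.cos(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / T * t ** 2))

def compress(echo, tx):
    """Matched-filter pulse compression: cross-correlate the received echo with
    the transmitted chirp, concentrating the long pulse into a sharp peak."""
    return np.correlate(echo, tx, mode="same")
```

    For a time-bandwidth product BT, the ideal compression gain is about 10·log10(BT) dB; tissue attenuation shifts and narrows the echo spectrum, which is why the measured gain falls below this value and why adapting the receive band helps, as the abstract reports.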

  19. Long-term consequences of non-intentional flows of substances: Modelling non-intentional flows of lead in the Dutch economic system and evaluating their environmental consequences

    SciTech Connect

    Elshkaki, Ayman; Voet, Ester van der; Holderbeke, Mirja van; Timmermans, Veerle

    2009-06-15

    Substances may enter the economy and the environment through both intentional and non-intentional flows. These non-intentional flows, which include the occurrence of substances as pollutants in mixed primary resources (metal ores, phosphate ores, and fossil fuels) and their presence in re-used waste streams from intentional use, may have environmental and economic consequences in terms of pollution and resource availability. On the one hand, these non-intentional flows may cause pollution problems; on the other hand, they have the potential to be a secondary source of substances. This article aims to quantify and model the non-intentional flows of lead, to evaluate their long-term environmental consequences, and to compare these consequences to those of the intentional flows of lead. To meet this goal, the model combines all the sources of non-intentional flows of lead within one model, which also includes the intentional flows. Application of the model shows that the non-intentional flows of lead related to waste streams associated with intentional use are decreasing over time, due to the increased attention given to waste management. However, lead flows associated with its occurrence as a contaminant in mixed primary resources are increasing, as demand for these resources is increasing.

  20. Implementation of the probability table method in a continuous-energy Monte Carlo code system

    SciTech Connect

    Sutton, T.M.; Brown, F.B.

    1998-10-01

    RACER is a particle-transport Monte Carlo code that utilizes a continuous-energy treatment for neutrons and neutron cross section data. Until recently, neutron cross sections in the unresolved resonance range (URR) have been treated in RACER using smooth, dilute-average representations. This paper describes how RACER has been modified to use probability tables to treat cross sections in the URR, and the computer codes that have been developed to compute the tables from the unresolved resonance parameters contained in ENDF/B data files. A companion paper presents results of Monte Carlo calculations that demonstrate the effect of the use of probability tables versus the use of dilute-average cross sections for the URR. The next section provides a brief review of the probability table method as implemented in the RACER system. The production of the probability tables for use by RACER takes place in two steps. The first step is the generation of probability tables from the nuclear parameters contained in the ENDF/B data files. This step, and the code written to perform it, are described in Section 3. The tables produced are at energy points determined by the ENDF/B parameters and/or accuracy considerations. The tables actually used in the RACER calculations are obtained in the second step from those produced in the first. These tables are generated at energy points specific to the RACER calculation. Section 4 describes this step and the code written to implement it, as well as modifications made to RACER to enable it to use the tables. Finally, some results and conclusions are presented in Section 5.
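
The probability-table idea sketched below is a hypothetical illustration of the method described above, not RACER code: in the unresolved resonance range the code stores, at each energy, a set of (probability, cross-section) bands and samples a band each time a cross section is needed, instead of always using a single smooth dilute-average value. The band values here are invented for the example.

```python
import random

random.seed(0)  # for a reproducible illustration

# Assumed probability table at one energy point: (band probability, sigma [barns]).
table = [(0.2, 5.0), (0.5, 12.0), (0.3, 40.0)]

def sample_sigma(table, rng=random):
    """Sample a cross section from a probability table by CDF inversion."""
    xi, cum = rng.random(), 0.0
    for prob, sigma in table:
        cum += prob
        if xi < cum:
            return sigma
    return table[-1][1]  # guard against floating-point roundoff

# The smooth dilute-average representation collapses the table to its mean.
dilute_avg = sum(p * s for p, s in table)

# Sampling reproduces the mean but, unlike the average, preserves the spread
# of cross-section values that drives self-shielding effects.
samples = [sample_sigma(table) for _ in range(100_000)]
print(dilute_avg, sum(samples) / len(samples))
```

The sampled mean converges to the dilute average; the difference in transport results comes from the fluctuations, which a single averaged value cannot represent.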

  1. A PARALLEL MONTE CARLO CODE FOR SIMULATING COLLISIONAL N-BODY SYSTEMS

    SciTech Connect

    Pattabiraman, Bharath; Umbreit, Stefan; Liao, Wei-keng; Choudhary, Alok; Kalogera, Vassiliki; Memik, Gokhan; Rasio, Frederic A.

    2013-02-15

    We present a new parallel code for computing the dynamical evolution of collisional N-body systems with up to N ~ 10^7 particles. Our code is based on the Hénon Monte Carlo method for solving the Fokker-Planck equation, and makes assumptions of spherical symmetry and dynamical equilibrium. The principal algorithmic developments involve optimizing data structures and the introduction of a parallel random number generation scheme as well as a parallel sorting algorithm required to find nearest neighbors for interactions and to compute the gravitational potential. The new algorithms we introduce along with our choice of decomposition scheme minimize communication costs and ensure optimal distribution of data and workload among the processing units. Our implementation uses the Message Passing Interface library for communication, which makes it portable to many different supercomputing architectures. We validate the code by calculating the evolution of clusters with initial Plummer distribution functions up to core collapse with the number of stars, N, spanning three orders of magnitude from 10^5 to 10^7. We find that our results are in good agreement with self-similar core-collapse solutions, and the core-collapse times generally agree with expectations from the literature. Also, we observe good total energy conservation, within ≲ 0.04% throughout all simulations. We analyze the performance of the code, and demonstrate near-linear scaling of the runtime with the number of processors up to 64 processors for N = 10^5, 128 for N = 10^6 and 256 for N = 10^7. The runtime reaches saturation with the addition of processors beyond these limits, which is a characteristic of the parallel sorting algorithm. The resulting maximum speedups we achieve are approximately 60×, 100×, and 220×, respectively.

  2. New Site Coefficients and Site Classification System Used in Recent Building Seismic Code Provisions

    USGS Publications Warehouse

    Dobry, R.; Borcherdt, R.D.; Crouse, C.B.; Idriss, I.M.; Joyner, W.B.; Martin, G.R.; Power, M.S.; Rinne, E.E.; Seed, R.B.

    2000-01-01

    Recent code provisions for buildings and other structures (1994 and 1997 NEHRP Provisions, 1997 UBC) have adopted new site amplification factors and a new procedure for site classification. Two amplitude-dependent site amplification factors are specified: Fa for short periods and Fv for longer periods. Previous codes included only a long-period factor S and did not provide for a short-period amplification factor. The new site classification system is based on definitions of five site classes in terms of a representative average shear wave velocity to a depth of 30 m (v̄s). This definition permits sites to be classified unambiguously. When the shear wave velocity is not available, other soil properties such as standard penetration resistance or undrained shear strength can be used. The new site classes, denoted by letters A-E, replace the site classes in previous codes denoted by S1-S4. Site Classes A and B correspond to hard rock and rock, Site Class C corresponds to soft rock and very stiff/very dense soil, and Site Classes D and E correspond to stiff soil and soft soil. A sixth site class, F, is defined for soils requiring site-specific evaluations. Both Fa and Fv are functions of the site class, and also of the level of seismic hazard on rock, defined by parameters such as Aa and Av (1994 NEHRP Provisions), Ss and S1 (1997 NEHRP Provisions) or Z (1997 UBC). The values of Fa and Fv decrease as the seismic hazard on rock increases due to soil nonlinearity. The greatest impact of the new factors Fa and Fv as compared with the old S factors occurs in areas of low-to-medium seismic hazard. This paper summarizes the new site provisions, explains the basis for them, and discusses ongoing studies of site amplification in recent earthquakes that may influence future code developments.
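
The classification described above can be sketched as follows. The representative velocity is the time-averaged (harmonic-mean) shear wave velocity over the top 30 m, v̄s = 30 / Σ(d_i / v_i); the class boundaries below follow the commonly cited NEHRP values and should be treated as illustrative, not as a normative restatement of the provisions.

```python
# Sketch of v̄s-based site classification. The soil profile is hypothetical.

def vs30(layers):
    """Time-averaged shear wave velocity; layers: (thickness_m, vs_m_per_s)
    pairs whose thicknesses span the top 30 m of the profile."""
    total_d = sum(d for d, _ in layers)
    return total_d / sum(d / v for d, v in layers)

def site_class(v):
    """Map v̄s [m/s] to a site class (boundaries per commonly cited NEHRP values)."""
    if v > 1500: return "A"   # hard rock
    if v > 760:  return "B"   # rock
    if v > 360:  return "C"   # very stiff/very dense soil and soft rock
    if v > 180:  return "D"   # stiff soil
    return "E"                # soft soil

layers = [(10, 200.0), (20, 500.0)]   # hypothetical two-layer profile, 30 m total
v = vs30(layers)
print(f"v̄s = {v:.0f} m/s -> Site Class {site_class(v)}")
```

Note how the harmonic mean weights the slow upper layer heavily: the profile above classifies as D even though its lower 20 m alone would be Class C.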

  3. A Parallel Monte Carlo Code for Simulating Collisional N-body Systems

    NASA Astrophysics Data System (ADS)

    Pattabiraman, Bharath; Umbreit, Stefan; Liao, Wei-keng; Choudhary, Alok; Kalogera, Vassiliki; Memik, Gokhan; Rasio, Frederic A.

    2013-02-01

    We present a new parallel code for computing the dynamical evolution of collisional N-body systems with up to N ~ 10^7 particles. Our code is based on the Hénon Monte Carlo method for solving the Fokker-Planck equation, and makes assumptions of spherical symmetry and dynamical equilibrium. The principal algorithmic developments involve optimizing data structures and the introduction of a parallel random number generation scheme as well as a parallel sorting algorithm required to find nearest neighbors for interactions and to compute the gravitational potential. The new algorithms we introduce along with our choice of decomposition scheme minimize communication costs and ensure optimal distribution of data and workload among the processing units. Our implementation uses the Message Passing Interface library for communication, which makes it portable to many different supercomputing architectures. We validate the code by calculating the evolution of clusters with initial Plummer distribution functions up to core collapse with the number of stars, N, spanning three orders of magnitude from 10^5 to 10^7. We find that our results are in good agreement with self-similar core-collapse solutions, and the core-collapse times generally agree with expectations from the literature. Also, we observe good total energy conservation, within ≲ 0.04% throughout all simulations. We analyze the performance of the code, and demonstrate near-linear scaling of the runtime with the number of processors up to 64 processors for N = 10^5, 128 for N = 10^6 and 256 for N = 10^7. The runtime reaches saturation with the addition of processors beyond these limits, which is a characteristic of the parallel sorting algorithm. The resulting maximum speedups we achieve are approximately 60×, 100×, and 220×, respectively.
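
A quick check of the scaling figures reported above (all numbers taken from the abstract): dividing each maximum speedup by its saturation processor count gives the parallel efficiency at that point.

```python
# Reported (processor count, max speedup) pairs per problem size N.
runs = {1e5: (64, 60.0), 1e6: (128, 100.0), 1e7: (256, 220.0)}

for n, (procs, speedup) in runs.items():
    eff = speedup / procs  # parallel efficiency = speedup / processors
    print(f"N = {n:.0e}: {speedup:.0f}x on {procs} procs -> efficiency {eff:.0%}")
```

The efficiencies remain high (roughly 78% to 94%) up to the saturation points, after which the parallel sort dominates and adding processors no longer helps.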

  4. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  5. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates-as reported by a cache simulation tool, and confirmed by hardware counters-only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  6. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  7. Phase transfer function based method to alleviate image artifacts in wavefront coding imaging system

    NASA Astrophysics Data System (ADS)

    Mo, Xutao; Wang, Jinjiang

    2013-09-01

    Wavefront coding can extend the depth of field (DOF) of an incoherent imaging system. Several rectangular separable phase masks (such as the cubic, exponential, logarithmic, sinusoidal, and rational types) have been proposed and discussed, because they can extend the DOF up to ten times that of an ordinary imaging system. However, research on these masks has pointed out that the images are damaged by artifacts, which usually come from differences between the non-linear phase transfer function (PTF) used in the image restoration filter and the PTF of the real imaging condition. In order to alleviate the image artifacts in imaging systems with wavefront coding, an optimization model based on the PTF was proposed to make the PTF invariant with defocus. Thereafter, an image restoration filter based on the average PTF over the designed depth of field was introduced along with the PTF-based optimization. The combination of the proposed optimization and image restoration can alleviate the artifacts, as confirmed by imaging simulations of a spoke target. The cubic phase mask (CPM) and exponential phase mask (EPM) were discussed as examples.

  8. Hybrid optical-digital encryption system based on wavefront coding paradigm

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.

    2012-04-01

    Wavefront coding is widely used in optical systems to compensate for aberrations and increase the depth of field. This paper presents experimental results on the application of the wavefront coding paradigm to data encryption. We use a synthesised diffractive optical element (DOE) to deliberately introduce a phase distortion during the image registration process to encode the acquired image. In this case, an optical convolution of the input image with the point spread function (PSF) of the DOE is registered. The encryption is performed optically, and is therefore fast and secure. Since the introduced distortion is the same across the image, the decryption is performed digitally using deconvolution methods. However, due to noise and the finite accuracy of a photosensor, the reconstructed image is degraded but still readable. The experimental results presented in this paper indicate that the proposed hybrid optical-digital system can be implemented as a portable device using inexpensive off-the-shelf components. We present the results of optical encryption and digital restoration with quantitative estimates of image quality. Details of the hardware optical implementation of the hybrid optical-digital encryption system are discussed.

  9. Flexible Coding of Task Rules in Frontoparietal Cortex: An Adaptive System for Flexible Cognitive Control.

    PubMed

    Woolgar, Alexandra; Afshar, Soheil; Williams, Mark A; Rich, Anina N

    2015-10-01

    How do our brains achieve the cognitive control that is required for flexible behavior? Several models of cognitive control propose a role for frontoparietal cortex in the structure and representation of task sets or rules. For behavior to be flexible, however, the system must also rapidly reorganize as mental focus changes. Here we used multivoxel pattern analysis of fMRI data to demonstrate adaptive reorganization of frontoparietal activity patterns following a change in the complexity of the task rules. When task rules were relatively simple, frontoparietal cortex did not hold detectable information about these rules. In contrast, when the rules were more complex, frontoparietal cortex showed clear and decodable rule discrimination. Our data demonstrate that frontoparietal activity adjusts to task complexity, with better discrimination of rules that are behaviorally more confusable. The change in coding was specific to the rule element of the task and was not mirrored in more specialized cortex (early visual cortex) where coding was independent of difficulty. In line with an adaptive view of frontoparietal function, the data suggest a system that rapidly reconfigures in accordance with the difficulty of a behavioral task. This system may provide a neural basis for the flexible control of human behavior.

  10. Analytical computation of the derivative of PSF for the optimization of phase mask in wavefront coding system.

    PubMed

    Chen, Xinhua; Zhou, Jiankang; Shen, Weimin

    2016-09-05

    A wavefront coding system can realize defocus invariance of the PSF/OTF with a phase mask inserted in the pupil plane. Ideally, the derivative of the PSF/OTF with respect to defocus error should be as close to zero as possible over the extended depth of field/focus of the wavefront coding system. In this paper, we propose an analytical expression for the computation of the derivative of the PSF. With this expression, a merit function based on the derivative of the PSF can be used in the optimization of a wavefront coding system with any type of phase mask and aberrations. Computation of the derivative of the PSF using the proposed expression and using the FFT, respectively, is compared and discussed. We also demonstrate the optimization of a generic polynomial phase mask in a wavefront coding system as an example.
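
The quantity being optimized can be sketched numerically via the FFT route that the paper compares its analytical expression against. The pupil model, the cubic mask, and all parameter values below are assumptions for illustration: the PSF is |FFT of the pupil field|², with defocus entering as a quadratic phase ψ·ρ², and dPSF/dψ is estimated here by a central finite difference rather than the paper's analytical formula.

```python
import numpy as np

# Illustrative pupil-plane model (assumed, not the paper's): circular pupil
# with a cubic phase mask, defocus modeled as a quadratic phase term.
N = 128
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 1.0).astype(float)
alpha = 30.0                       # cubic-mask strength (assumed)
mask = alpha * (X**3 + Y**3)       # rectangular-separable cubic phase mask

def psf(psi):
    """Incoherent PSF at defocus parameter psi: |FFT{pupil field}|^2."""
    field = pupil * np.exp(1j * (mask + psi * (X**2 + Y**2)))
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

# Finite-difference stand-in for dPSF/dpsi at psi = 0.5; a good wavefront
# coding design drives this derivative toward zero across the extended DOF.
dpsi = 1e-3
dpsf = (psf(0.5 + dpsi) - psf(0.5 - dpsi)) / (2 * dpsi)
print(float(np.abs(dpsf).max()))
```

The finite-difference route needs two full FFTs per defocus sample per merit-function evaluation, which is the computational cost an analytical derivative expression avoids.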

  11. Rhetorical Consequences of the Computer Society: Expert Systems and Human Communication.

    ERIC Educational Resources Information Center

    Skopec, Eric Wm.

    Expert systems are computer programs that solve selected problems by modelling domain-specific behaviors of human experts. These computer programs typically consist of an input/output system that feeds data into the computer and retrieves advice, an inference system using the reasoning and heuristic processes of human experts, and a knowledge…

  12. Short pulse acquisition by low sampling rate with phase-coded sequence in lidar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Xu, Jiajia; Lv, Wentao; Yang, Xiaocheng

    2016-11-01

    The requirement of high range resolution makes it impractical to collect every returned laser pulse, due to the limited response speed of imaging detectors. This paper proposes a phase-coded sequence acquisition method for signal preprocessing. The system employs an N-bit m-sequence for demonstration, with the detector controlled to accumulate N+1 bits of the echo signals to deduce a single returned laser pulse. An indoor experiment achieved 2 μs resolution with a sampling period of 28 μs by employing a 15-bit m-sequence. This method shows the potential to improve the detection of narrow laser pulses with detectors at a low frame rate, especially for imaging lidar systems. Meanwhile, the lidar system is able to improve the range resolution with available detectors of restricted performance.
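
The correlation property that makes such phase-coded acquisition work can be sketched as follows (the LFSR taps and the decoding arithmetic here are a generic illustration, not the paper's hardware): a 15-bit m-sequence has a two-valued circular autocorrelation, 15 at zero lag and -1 everywhere else, so the echo position can be recovered from slowly accumulated samples by correlating them with the known code.

```python
# Generate a length-15 m-sequence from a 4-stage Fibonacci LFSR
# (taps for x^4 + x^3 + 1; an assumed, standard maximal-length choice).
def mseq15():
    state, out = [1, 0, 0, 1], []
    for _ in range(15):
        out.append(state[-1])           # output the last stage
        fb = state[3] ^ state[2]        # feedback from stages 4 and 3
        state = [fb] + state[:-1]       # shift register
    return out

code = [2 * b - 1 for b in mseq15()]    # map {0, 1} -> {-1, +1}

def circ_corr(a, b):
    """Circular cross-correlation of two equal-length sequences."""
    n = len(a)
    return [sum(a[i] * b[(i + k) % n] for i in range(n)) for k in range(n)]

acf = circ_corr(code, code)
print(acf)   # 15 at lag 0, -1 at all other lags
```

The sharp single peak against a flat -1 floor is what lets a slow detector, integrating many coded returns, still localize a single narrow pulse in time.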

  13. Strengthening health systems in poor countries: a code of conduct for nongovernmental organizations.

    PubMed

    Pfeiffer, James; Johnson, Wendy; Fort, Meredith; Shakow, Aaron; Hagopian, Amy; Gloyd, Steve; Gimbel-Sherr, Kenneth

    2008-12-01

    The challenges facing efforts in Africa to increase access to antiretroviral HIV treatment underscore the urgent need to strengthen national health systems across the continent. However, donor aid to developing countries continues to be disproportionately channeled to international nongovernmental organizations (NGOs) rather than to ministries of health. The rapid proliferation of NGOs has provoked "brain drain" from the public sector by luring workers away with higher salaries, fragmentation of services, and increased management burdens for local authorities in many countries. Projects by NGOs sometimes can undermine the strengthening of public primary health care systems. We argue for a return to a public focus for donor aid, and for NGOs to adopt a code of conduct that establishes standards and best practices for NGO relationships with public sector health systems.

  14. The physics of compensating calorimetry and the new CALOR89 code system

    SciTech Connect

    Gabriel, T.A.; Brau, J.E.; Bishop, B.L.

    1989-03-01

    Much of the understanding of the physics of calorimetry has come from the use of excellent radiation transport codes. A new understanding of compensating calorimetry was introduced four years ago following detailed studies with a new CALOR system. Now, the CALOR system has again been revised to reflect a better comprehension of high energy nuclear collisions by incorporating a modified high energy fragmentation model from FLUKA87. This revision will allow for the accurate analysis of calorimeters at energies of hundreds of GeV. Presented in this paper is a discussion of compensating calorimetry, the new CALOR system, the revisions to HETC, and recently generated calorimeter-related data on modes of energy deposition and secondary neutron production (E < 50 MeV) in infinite iron and uranium blocks. 38 refs., 5 figs., 5 tabs.

  15. Speech input system for meat inspection and pathological coding used thereby

    NASA Astrophysics Data System (ADS)

    Abe, Shozo

    Meat inspection is one of the exclusive and important jobs of veterinarians, though it is not well known to the general public. As the inspection must be conducted skillfully during a series of continuous operations in a slaughterhouse, the development of automatic inspection systems has long been required. We employed a hands-free speech input system to record the inspection data, because inspectors have to use both hands to handle the internal organs of cattle and check their health conditions by naked eye. The data collected by the inspectors are transferred to a speech recognizer and then stored as controllable data for each animal inspected. Control of the terms to be input, such as pathological conditions, and their coding are also important in this speech input system, and practical examples are shown.

  16. Linear-scaling first-principles molecular dynamics of complex biological systems with the Conquest code

    NASA Astrophysics Data System (ADS)

    Otsuka, Takao; Taiji, Makoto; Bowler, David R.; Miyazaki, Tsuyoshi

    2016-11-01

    The recent progress of linear-scaling or O(N) methods in density functional theory (DFT) is remarkable. In this paper, we show that all-atom molecular dynamics simulations of complex biological systems based on DFT are now possible using our linear-scaling DFT code Conquest. We first overview the calculation methods used in Conquest and explain the method introduced recently to realise efficient and robust first-principles molecular dynamics (FPMD) with O(N) DFT. Then, we show that we can perform reliable all-atom FPMD simulations of a hydrated DNA model containing about 3400 atoms. We also report that the velocity scaling method is both reliable and useful for controlling the temperature of the FPMD simulation of this system. From these results, we conclude that reliable FPMD simulations of complex biological systems are now possible with Conquest.

  17. CFC (Comment-First-Coding)--A Simple yet Effective Method for Teaching Programming to Information Systems Students

    ERIC Educational Resources Information Center

    Sengupta, Arijit

    2009-01-01

    Programming courses have always been a difficult part of an Information Systems curriculum. While we do not train Information Systems students to be developers, understanding how to build a system always gives students an added perspective to improve their system design and analysis skills. This teaching tip presents CFC (Comment-First-Coding)--a…

  18. Simulation of Enhanced Geothermal Systems: A Benchmarking and Code Intercomparison Study

    SciTech Connect

    Scheibe, Timothy D.; White, Mark D.; White, Signe K.; Sivaramakrishnan, Chandrika; Purohit, Sumit; Black, Gary D.; Podgorney, Robert; Boyd, Lauren W.; Phillips, Benjamin R.

    2013-06-30

    Numerical simulation codes have become critical tools for understanding complex geologic processes, as applied to technology assessment, system design, monitoring, and operational guidance. Recently the need for quantitatively evaluating coupled Thermodynamic, Hydrologic, geoMechanical, and geoChemical (THMC) processes has grown, driven by new applications such as geologic sequestration of greenhouse gases and development of unconventional energy sources. Here we focus on Enhanced Geothermal Systems (EGS), which are man-made geothermal reservoirs created where hot rock exists but there is insufficient natural permeability and/or pore fluids to allow efficient energy extraction. In an EGS, carefully controlled subsurface fluid injection is performed to enhance the permeability of pre-existing fractures, which facilitates fluid circulation and heat transport. EGS technologies are relatively new, and pose significant simulation challenges. To become a trusted analytical tool for EGS, numerical simulation codes must be tested to demonstrate that they adequately represent the coupled THMC processes of concern. This presentation describes the approach and status of a benchmarking and code intercomparison effort currently underway, supported by the U. S. Department of Energy’s Geothermal Technologies Program. This study is being closely coordinated with a parallel international effort sponsored by the International Partnership for Geothermal Technology (IPGT). We have defined an extensive suite of benchmark problems, test cases, and challenge problems, ranging in complexity and difficulty, and a number of modeling teams are applying various simulation tools to these problems. The descriptions of the problems and modeling results are being compiled using the Velo framework, a scientific workflow and data management environment accessible through a simple web-based interface.

  19. The physics and technology basis entering European system code studies for DEMO

    NASA Astrophysics Data System (ADS)

    Wenninger, R.; Kembleton, R.; Bachmann, C.; Biel, W.; Bolzonella, T.; Ciattaglia, S.; Cismondi, F.; Coleman, M.; Donné, A. J. H.; Eich, T.; Fable, E.; Federici, G.; Franke, T.; Lux, H.; Maviglia, F.; Meszaros, B.; Pütterich, T.; Saarelma, S.; Snickers, A.; Villone, F.; Vincenzi, P.; Wolff, D.; Zohm, H.

    2017-01-01

    A large scale program to develop a conceptual design for a demonstration fusion power plant (DEMO) has been initiated in Europe. Central elements are the baseline design points, which are developed by system codes. The assessment of the credibility of these design points is often hampered by missing information. The main physics and technology content of the central European system codes has been published (Kovari et al 2014 Fusion Eng. Des. 89 3054-69, 2016 Fusion Eng. Des. 104 9-20, Reux et al 2015 Nucl. Fusion 55 073011). In addition, this publication discusses key input parameters for the pulsed and conservative design option EU DEMO1 2015 and provides justifications for the parameter choices. In this context several DEMO physics gaps are identified, which need to be addressed in the future to reduce the uncertainty in predicting the performance of the device. Also the sensitivities of net electric power and pulse duration to variations of the input parameters are investigated. The most extreme sensitivity is found for the elongation (Δκ95 = 10% corresponds to ΔP_el,net = 125%).

  20. Ultrasonic irradiation for ultrafiltration membrane cleaning in MBR systems: operational conditions and consequences.

    PubMed

    Ruiz, L M; Perez, J I; Gómez, A; Letona, A; Gómez, M A

    2017-02-01

    Ultrasonic irradiation is one of the most promising membrane cleaning techniques for membrane bioreactors (MBRs) because of several advantages such as high flux-recovery capacity and in situ application without interrupting the filtration process. However, significant contradictions may be found and, consequently, this method has not yet been widely developed. In this paper, four MBRs equipped with hollow-fibre polyvinylidene fluoride ultrafiltration membranes were operated continuously. The cleaning method applied consisted of sonication at low power (15 W) with different frequencies (20, 25, 30, and 40 kHz) for each module and aerated backwashing. The different MBRs were analysed comparatively between them and with a conventional MBR in order to check the effects of the irradiated waves on membrane integrity, effluent quality and process performance. Effluent turbidity and chemical oxygen demand, total and volatile suspended solid concentration and activated sludge viscosity were affected by biomass fragmentation or membrane cake removal, mainly at lower frequencies. The best transmembrane pressure control was achieved at the frequency of 20 kHz without a significant effect on membrane integrity. The results showed that under these operational conditions, no negative effects on effluent quality or membrane integrity were found, suggesting that this method was suitable for this type of membrane.

  1. The Consequences of Using One Assessment System to Pursue Two Objectives

    ERIC Educational Resources Information Center

    Neal, Derek

    2013-01-01

    Education officials often use one assessment system both to create measures of student achievement and to create performance metrics for educators. However, modern standardized testing systems are not designed to produce performance metrics for teachers or principals. They are designed to produce reliable measures of individual student achievement…

  2. Utilization of recently developed codes for high power Brayton and Rankine cycle power systems

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.

    1993-01-01

    Two recently developed FORTRAN computer codes for high power Brayton and Rankine thermodynamic cycle analysis for space power applications are presented. The codes were written in support of an effort to develop a series of subsystem models for multimegawatt Nuclear Electric Propulsion, but their use is not limited to nuclear heat sources or to electric propulsion. Code development background, a description of the codes, some sample input/output from one of the codes, and future plans and implications for the use of these codes by NASA's Lewis Research Center are provided.

  3. Performance improvement of hybrid subcarrier multiplexing optical spectrum code division multiplexing system using spectral direct decoding detection technique

    NASA Astrophysics Data System (ADS)

    Sahbudin, R. K. Z.; Abdullah, M. K.; Mokhtar, M.

    2009-06-01

    This paper proposes a hybrid subcarrier multiplexing/optical spectrum code division multiplexing (SCM/OSCDM) system that combines the advantages of both techniques. Optical spectrum code division multiple-access (OSCDMA) is a multiplexing technique that is becoming popular because of its flexibility in channel allocation, ability to operate asynchronously, enhanced privacy and increased capacity in networks with bursty traffic. The subcarrier multiplexing (SCM) technique, on the other hand, is able to enhance the channel data rate of OSCDMA systems. In this paper, a newly developed detection technique for OSCDM called spectral direct decoding (SDD) is compared mathematically with the AND subtraction detection technique. The system utilizes a new unified code construction named the KS (Khazani-Syed) code. The bit-error-rate (BER) results show that SDD offers significantly improved performance at a BER of 10⁻⁹.

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    SciTech Connect

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. These are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  5. Optimal performance of networked control systems with bandwidth and coding constraints.

    PubMed

    Zhan, Xi-Sheng; Sun, Xin-xiang; Li, Tao; Wu, Jie; Jiang, Xiao-Wei

    2015-11-01

    The optimal tracking performance of multiple-input multiple-output (MIMO) discrete-time networked control systems with bandwidth and coding constraints is studied in this paper. The optimal tracking performance is obtained by using the spectral factorization technique and partial fraction expansion. The obtained results demonstrate that the optimal performance is influenced by the directions and locations of the nonminimum-phase zeros and unstable poles of the given plant. In addition, the optimal tracking performance is closely influenced by the characteristics of the reference signal, the encoding, and the bandwidth and additive white Gaussian noise (AWGN) of the communication channel. Some typical examples are given to illustrate the theoretical results.

  6. High-Fidelity Lattice Physics Capabilities of the SCALE Code System Using TRITON

    SciTech Connect

    DeHart, Mark D

    2007-01-01

    Increasing complexity in reactor designs suggests a need to reexamine the methods applied in spent-fuel characterization. The ability to accurately predict the nuclide composition of depleted reactor fuel is important in a wide variety of applications. These applications include, but are not limited to, the design, licensing, and operation of commercial/research reactors and spent-fuel transport/storage systems. New complex design projects such as space reactors and Generation IV power reactors also require calculational methods that provide accurate prediction of the isotopic inventory. New high-fidelity physics methods will be required to better understand the physics associated with both evolutionary and revolutionary reactor concepts as they depart from traditional and well-understood light-water reactor designs. The TRITON sequence of the SCALE code system provides a powerful, robust, and rigorous approach for reactor physics analysis. This paper provides a detailed description of TRITON in terms of its key components used in reactor calculations.

  7. Biconjugate gradient stabilized method in image deconvolution of a wavefront coding system

    NASA Astrophysics Data System (ADS)

    Liu, Peng; Liu, Qin-xiao; Zhao, Ting-yu; Chen, Yan-ping; Yu, Fei-hong

    2013-04-01

    The point spread function (PSF) is not rotationally symmetric in a wavefront coding (WFC) system with a cubic phase mask (CPM). Antireflective boundary conditions (BCs) are used to eliminate the ringing effect on the border and vibration at the edge of the image. The Kronecker product approximation is used to reduce the computational cost. The image-formation process of the WFC system is transformed into a matrix equation. To save storage space, the biconjugate gradient (Bi-CG) and biconjugate gradient stabilized (Bi-CGSTAB) methods, typical Krylov-subspace iterative algorithms based on the two-sided Lanczos process, are used to solve the nonsymmetric matrix equation. Simulation and experimental results illustrate the efficiency of the proposed algorithm for image deconvolution. The result based on the Bi-CGSTAB method is smoother than that of the classic Wiener filter, while preserving more detail than the truncated singular value decomposition (TSVD) method.
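    The core numerical step described above, solving a nonsymmetric image-formation system iteratively rather than inverting the blur directly, can be sketched with SciPy's Bi-CGSTAB solver. The small tridiagonal 1-D blur matrix below is an illustrative stand-in, not the paper's cubic-phase-mask model:

```python
# Sketch: solve the nonsymmetric system A x = b with Bi-CGSTAB.
# The blur matrix is illustrative; CPM PSFs are likewise asymmetric.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab

n = 64
# Asymmetric 1-D blur operator (unequal off-diagonals).
A = diags([0.2, 0.5, 0.3], offsets=[-1, 0, 1], shape=(n, n)).tocsr()

rng = np.random.default_rng(0)
x_true = rng.random(n)          # "sharp" signal
b = A @ x_true                  # simulated blurred observation

x_est, info = bicgstab(A, b, atol=1e-10)   # info == 0 on convergence
```

Because `A` is stored sparsely and only matrix-vector products are needed, this scales to the large systems produced by the Kronecker-product approximation.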

  8. Method and system for pattern analysis using a coarse-coded neural network

    NASA Technical Reports Server (NTRS)

    Spirkovska, Liljana (Inventor); Reid, Max B. (Inventor)

    1994-01-01

    A method and system for performing pattern analysis with a neural network by coarse-coding a pattern to be analyzed into a plurality of sub-patterns collectively defined by data. Each of the sub-patterns comprises sets of pattern data. The neural network includes a plurality of fields, each field being associated with one of the sub-patterns so as to receive the sub-pattern data therefrom. Training and testing by the neural network then proceed in the usual way, with one modification: the transfer function thresholds the value obtained from summing the weighted products of each field over all sub-patterns associated with each pattern being analyzed by the system.

  9. A multi-layer VLC imaging system based on space-time trace-orthogonal coding

    NASA Astrophysics Data System (ADS)

    Li, Peng-Xu; Yang, Yu-Hong; Zhu, Yi-Jun; Zhang, Yan-Yu

    2017-02-01

    In visible light communication (VLC) imaging systems, different classes of data usually must be transmitted with different priorities in terms of reliability and/or validity. To address this, a novel transmission scheme called space-time trace-orthogonal coding (STTOC) for VLC is proposed in this paper, taking full advantage of the characteristics of time-domain transmission and space-domain orthogonality. Several constellation designs for different priority strategies subject to a total power constraint are then presented. One significant advantage of this scheme is that the inter-layer interference (ILI) can be eliminated completely and the computational complexity of maximum likelihood (ML) detection is linear. Computer simulations verify the correctness of the theoretical analysis and demonstrate that both the transmission rate and error performance of the proposed scheme greatly outperform those of a conventional multi-layer transmission system.

  10. Model of U3Si2 Fuel System using BISON Fuel Code

    SciTech Connect

    K. E. Metzger; T. W. Knight; R. L. Williamson

    2014-04-01

    This research considers a proposed advanced fuel system: U3Si2 combined with an advanced cladding. U3Si2 has a number of advantageous thermophysical properties, which motivate its use as an accident-tolerant fuel. This preliminary model evaluates the behavior of U3Si2 using available thermophysical data to predict the cladding and fuel pellet temperature and stress with the fuel performance code BISON. The preliminary results obtained from the U3Si2 fuel model describe the mechanism of pellet-clad mechanical interaction for this system, while more extensive testing, including creep testing of U3Si2, is planned to improve understanding of the thermophysical properties needed to predict fuel performance.

  11. Space applications of the MITS electron-photon Monte Carlo transport code system

    SciTech Connect

    Kensek, R.P.; Lorence, L.J.; Halbleib, J.A.; Morel, J.E.

    1996-07-01

    The MITS multigroup/continuous-energy electron-photon Monte Carlo transport code system has matured to the point that it is capable of addressing more realistic three-dimensional adjoint applications. It is first employed to efficiently predict point doses as a function of source energy for simple three-dimensional experimental geometries exposed to simulated uniform isotropic planar sources of monoenergetic electrons up to 4.0 MeV. Results are in very good agreement with experimental data. It is then used to efficiently simulate dose to a detector in a subsystem of a GPS satellite due to its natural electron environment, employing a relatively complex model of the satellite. The capability for survivability analysis of space systems is demonstrated, and results are obtained with and without variance reduction.

  12. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    NASA Astrophysics Data System (ADS)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

    The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the source code to RIES showed a significant lack of security-awareness among the programmers which - among other things - appears to have left RIES vulnerable to near-trivial attacks. If it had not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and that these aspects need to be examined much sooner than right before an election.

  13. [Pain assessment using the Facial Action Coding System. A systematic review].

    PubMed

    Rojo, Rosa; Prados-Frutos, Juan Carlos; López-Valverde, Antonio

    2015-10-21

    Self-reporting is the most widely used pain measurement tool, although it may not be useful in patients with a loss of or deficit in communication skills. The aim of this paper was to undertake a systematic review of the literature on pain assessment through the Facial Action Coding System (FACS). The initial search found 4,335 references and, within the restriction «FACS», these were reduced to 40 (after exclusion of duplicates). Finally, only 26 articles meeting the inclusion criteria were included. Methodological quality was assessed using the GRADE system. Most patients were adults or elderly people with health conditions, cognitive deficits and/or chronic pain. Our conclusion is that FACS is a reliable and objective tool for the detection and quantification of pain in all patients.

  14. Polycistronic mRNAs code for polypeptides of the Vibrio harveyi luminescence system

    SciTech Connect

    Miyamoto, C.M.; Graham, A.D.; Boylan, M.; Evans, J.F.; Hasel, K.W.; Meighen, E.A.; Graham, A.F.

    1985-03-01

    DNA coding for the α and β subunits of Vibrio harveyi luciferase, the luxA and luxB genes, and the adjoining chromosomal regions on both sides of these genes (total of 18 kilobase pairs) was cloned into Escherichia coli. Using labeled DNA coding for the α subunit as a hybridization probe, the authors identified a set of polycistronic mRNAs (2.6, 4, 7, and 8 kilobases) by Northern blotting; the most prominent of these was the one 4 kilobases long. This set of mRNAs was induced during the development of bioluminescence in V. harveyi. Furthermore, the same set of mRNAs was synthesized in E. coli by a recombinant plasmid that contained a 12-kilobase pair length of V. harveyi DNA and expressed the genes for the luciferase subunits. A cloned DNA segment corresponding to the major 4-kilobase mRNA coded for the α and β subunits of luciferase, as well as a 32,000-dalton protein upstream from these genes that could be specifically modified by acyl-coenzyme A and is a component of the bioluminescence system. V. harveyi mRNA that was hybridized to and released from cloned DNA encompassing the luxA and luxB genes was translated in vitro. Luciferase α and β subunits and the 32,000-dalton polypeptide were detected among the products, along with 42,000- and 55,000-dalton polypeptides, which are encoded downstream from the lux genes and are thought to be involved in luminescence.

  15. Enhancing transfusion safety with an innovative bar-code-based tracking system.

    PubMed

    Askeland, Ryan W; McGrane, Steve P; Reifert, Dan R; Kemp, John D

    2009-01-01

    In an effort to reduce transfusion errors, a novel, comprehensive, computerized wireless bar-code-based tracking system for matching patients, blood samples and blood products was created and deployed at a major academic medical centre. With a grant from the Agency for Healthcare Research and Quality, software was developed to track scans at the times of sample collection, sample arrival in the blood bank, blood product dispensation from the blood bank and blood product administration. The system was deployed in February 2005. The system was well accepted from the outset, and the sample rejection rate due to clerical errors fell from 1.82 to 0.17%; incident reports fell by 83%. At the final blood administration step, the accumulated data as of November 2008 indicated that identification errors were being detected and prevented every 42.4 days and that the scan completion rate was stable at about 99%. Process analysis suggested that these were independent events and, thus, would be expected to coincide (and potentially produce a mis-transfusion) every 4,240 days (11.6 years) on average. We estimate that the system is 10 times safer than the manual system previously employed at our institution and may be 15-20 times safer than most systems employed in the United States.
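    The abstract's risk estimate is simple independence arithmetic; the sketch below reproduces it using only the figures quoted in the abstract:

```python
# Back-of-envelope check of the abstract's "independent events" estimate:
# an identification error is caught every 42.4 days, and ~1% of
# administrations lack a completed scan (99% scan completion rate).
error_interval_days = 42.4
scan_miss_rate = 1.0 - 0.99

# If the two events are independent, an error should coincide with a
# missed scan (a potential mis-transfusion) about once per:
coincidence_days = error_interval_days / scan_miss_rate
print(coincidence_days, coincidence_days / 365.25)
```

This reproduces the abstract's figure of roughly 4,240 days (about 11.6 years) between expected coincidences.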

  16. A System for Fault Management and Fault Consequences Analysis for NASA's Deep Space Habitat

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano; Spirkovska, Liljana; Baskaran, Vijaykumar; Aaseng, Gordon; McCann, Robert S.; Ossenfort, John; Smith, Irene; Iverson, David L.; Schwabacher, Mark

    2013-01-01

    NASA's exploration program envisions the utilization of a Deep Space Habitat (DSH) for human exploration of the space environment in the vicinity of Mars and/or asteroids. Communication latencies with ground control of as long as 20+ minutes make it imperative that DSH operations be highly autonomous, as any telemetry-based detection of a systems problem on Earth could well occur too late to assist the crew with the problem. A DSH-based development program has been initiated to develop and test the automation technologies necessary to support highly autonomous DSH operations. One such technology is a fault management tool to support performance monitoring of vehicle systems operations and to assist with real-time decision making in connection with operational anomalies and failures. Toward that end, we are developing Advanced Caution and Warning System (ACAWS), a tool that combines dynamic and interactive graphical representations of spacecraft systems, systems modeling, automated diagnostic analysis and root cause identification, system and mission impact assessment, and mitigation procedure identification to help spacecraft operators (both flight controllers and crew) understand and respond to anomalies more effectively. In this paper, we describe four major architecture elements of ACAWS: Anomaly Detection, Fault Isolation, System Effects Analysis, and Graphical User Interface (GUI), and how these elements work in concert with each other and with other tools to provide fault management support to both the controllers and crew. We then describe recent evaluations and tests of ACAWS on the DSH testbed. The results of these tests support the feasibility and strength of our approach to failure management automation and enhanced operational autonomy.

  17. Study on the Tritium Behaviors in the VHTR System. Part 1: Development of Tritium Analysis Code for VHTR and Verification

    SciTech Connect

    Eung Soo Kim; Chang Ho Oh; Mike Patterson

    2010-07-01

    A tritium permeation analysis code (TPAC) has been developed at Idaho National Laboratory (INL) using the MATLAB SIMULINK package for analysis of tritium behaviors in VHTRs integrated with hydrogen production and process heat application systems. The modeling is based on the mass balance of tritium-containing species and hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. The code includes (1) tritium sources from ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He, (2) a tritium purification system, (3) leakage of tritium with coolant, (4) permeation through pipes, vessels, and heat exchangers, (5) an electrolyzer for high-temperature steam electrolysis (HTSE), and (6) isotope exchange for the SI process. Verification of the code has been performed by comparisons with analytical solutions, experimental data, and benchmark code results based on the Peach Bottom reactor design. The results showed that all the governing equations are well implemented into the code and correctly solved. This paper summarizes the background, theory, code structure, and some verification results related to the TPAC code development at INL.
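    The kind of species mass balance such a code solves can be sketched for a single lumped tritium inventory with birth, decay, purification, leakage, and permeation terms. All rate constants below except the tritium decay constant are invented for illustration and are not TPAC values:

```python
# Minimal single-species tritium balance: dN/dt = S - k_tot * N,
# integrated with forward Euler and checked against the analytic
# steady state S / k_tot. All rates except `lam` are assumed values.
S = 1.0e-3        # tritium birth rate (arbitrary units/s)
lam = 1.78e-9     # decay constant, 1/s (half-life ~12.3 y)
k_purif = 1.0e-5  # purification removal rate, 1/s (assumed)
k_leak = 1.0e-6   # coolant leakage rate, 1/s (assumed)
k_perm = 2.0e-6   # permeation rate, 1/s (assumed)

k_tot = lam + k_purif + k_leak + k_perm
N, dt = 0.0, 1000.0
for _ in range(20000):            # ~2e7 s of forward-Euler integration
    N += dt * (S - k_tot * N)

steady = S / k_tot                # analytic steady-state inventory
print(N, steady)
```

A production code tracks several coupled species (HT, HTO, etc.) and loops, but each obeys a balance of this form, which is why verification against analytical solutions is feasible.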

  18. Managing Errors to Reduce Accidents in High Consequence Networked Information Systems

    SciTech Connect

    Ganter, J.H.

    1999-02-01

    Computers have always helped to amplify and propagate errors made by people. The emergence of Networked Information Systems (NISs), which allow people and systems to quickly interact worldwide, has made understanding and minimizing human error more critical. This paper applies concepts from system safety to analyze how hazards (from hackers to power disruptions) penetrate NIS defenses (e.g., firewalls and operating systems) to cause accidents. Such events usually result from both active, easily identified failures and more subtle latent conditions that have resided in the system for long periods. Both active failures and latent conditions result from human errors. We classify these into several types (slips, lapses, mistakes, etc.) and provide NIS examples of how they occur. Next we examine error minimization throughout the NIS lifecycle, from design through operation to reengineering. At each stage, steps can be taken to minimize the occurrence and effects of human errors. These include defensive design philosophies, architectural patterns to guide developers, and collaborative design that incorporates operational experiences and surprises into design efforts. We conclude by looking at three aspects of NISs that will cause continuing challenges in error and accident management: immaturity of the industry, limited risk perception, and resource tradeoffs.

  19. Ecological and evolutionary consequences of explicit spatial structure in exploiter-victim systems

    NASA Astrophysics Data System (ADS)

    Klopfer, Eric David

    One class of spatial model which has been widely used in ecology has been termed "pseudo-spatial models" and classically employs various types of aggregation in studying the coexistence of competing parasitoids. Yet, little is known about the relative effects of each of these aggregation behaviors. Thus, in Chapter 1 I chose to examine three types of aggregation and explore their relative strengths in promoting coexistence of two competing parasitoids. A striking shortcoming of spatial models in ecology to date is that there is a relative lack of use of spatial models to investigate problems on the evolutionary as opposed to ecological time scale. Consequently, in Chapter 2 I chose to start with a classic problem of evolutionary time scale--the evolution of virulence and predation rates. Debate about this problem has continued through several decades, yet many instances are not adequately explained by current models. In this study I explored the effect of explicit spatial structure on exploitation rates by comparing a cellular automata (CA) exploiter-victim model which incorporates local dynamics to a metapopulation model which does not include such dynamics. One advantage of CA models is that they are defined by simple rules rather than the often complex equations of other types of spatial models. This is an extremely useful attribute when one wants to convey results of models to an audience with an applied bent that is often uncomfortable with hard-to-understand equations. Thus, in Chapter 3, through the use of CA models I show that there are spatial phenomena which alter the impact of introduced predators and that these phenomena are potentially important in the implementation of biocontrol programs. The relatively recent incorporation of spatial models into the ecological literature has left most ecologists and evolutionary biologists without the ability to understand, let alone employ, spatial models in evolutionary problems. In order to give the next

  20. Consequences of different suckling systems for reproductive activity and productivity of cattle in tropical conditions.

    PubMed

    Galina, C S.; Rubio, I; Basurto, H; Orihuela, A

    2001-05-02

    The late onset of ovarian activity in mature cattle raised under tropical conditions is the major setback impeding the sound reproductive performance needed for the increasing demand for livestock products in the area. The effect of suckling has been singled out as one of the most important factors impeding ovarian activity. Farmers in this region have used a diverse set of management tools to overcome the suckling effect without compromising reproduction, the health of the calf, growth until weaning, milk production and correct function of the mammary gland. Farmer interventions can be divided into: (1) early weaning (about 1 week of age); (2) weaning at 1, 3 or 5 months; (3) restricted suckling; (4) partial weaning. These systems can be affected by the breed of the animal, the location of the enterprise, infrastructure on the farm, time of the year and system of separation. The advantages and disadvantages of these systems are discussed in this review.

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    SciTech Connect

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  2. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    SciTech Connect

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  3. Burst Firing is a Neural Code in an Insect Auditory System

    PubMed Central

    Eyherabide, Hugo G.; Rokem, Ariel; Herz, Andreas V. M.; Samengo, Inés

    2008-01-01

    Various classes of neurons alternate between high-frequency discharges and silent intervals. This phenomenon is called burst firing. To analyze burst activity in an insect system, grasshopper auditory receptor neurons were recorded in vivo for several distinct stimulus types. The experimental data show that both burst probability and burst characteristics are strongly influenced by temporal modulations of the acoustic stimulus. The tendency to burst, hence, is not only determined by cell-intrinsic processes, but also by their interaction with the stimulus time course. We study this interaction quantitatively and observe that bursts containing a certain number of spikes occur shortly after stimulus deflections of specific intensity and duration. Our findings suggest a sparse neural code where information about the stimulus is represented by the number of spikes per burst, irrespective of the detailed interspike-interval structure within a burst. This compact representation cannot be interpreted as a firing-rate code. An information-theoretical analysis reveals that the number of spikes per burst reliably conveys information about the amplitude and duration of sound transients, whereas their time of occurrence is reflected by the burst onset time. The investigated neurons encode almost half of the total transmitted information in burst activity. PMID:18946533

  4. Two Novel Space-Time Coding Techniques Designed for UWB MISO Systems Based on Wavelet Transform

    PubMed Central

    Zaki, Amira Ibrahim; El-Khamy, Said E.

    2016-01-01

    In this paper two novel space-time coding multi-input single-output (STC MISO) schemes, designed especially for Ultra-Wideband (UWB) systems, are introduced. The proposed schemes are referred to as wavelet space-time coding (WSTC) schemes. The WSTC schemes are based on two types of multiplexing: spatial and wavelet-domain multiplexing. In WSTC schemes, four symbols are transmitted on the same UWB transmission pulse with the same bandwidth, symbol duration, and number of transmitting antennas as the conventional STC MISO scheme. The mother wavelet (MW) is selected to be highly correlated with the transmitted pulse shape and such that the multiplexed signal has almost the same spectral characteristics as the original UWB pulse. The two WSTC techniques increase the data rate to four times that of the conventional STC. The first WSTC scheme increases the data rate with a simple combination process. The second scheme achieves the increase in data rate with a less complex receiver and better performance than the first scheme, owing to the spatial diversity introduced by the structure of its transmitter and receiver. Both schemes use Rake receivers to collect the energy in the dense multipath channel components. The simulation results show that the proposed WSTC schemes outperform the conventional scheme while increasing the data rate to four times that of the conventional STC scheme. PMID:27959939

  5. Embedded Systems Hardware Integration and Code Development for Maraia Capsule and E-MIST

    NASA Technical Reports Server (NTRS)

    Carretero, Emmanuel S.

    2015-01-01

    The cost of sending large spacecraft to orbit makes them undesirable for carrying out smaller scientific missions. Small spacecraft are more economical and can be tailored for missions where specific tasks need to be carried out; the Maraia capsule is such a spacecraft. Maraia will allow samples of experiments conducted on the International Space Station to be returned to Earth. The use of balloons to conduct experiments at the edge of space is a practical approach to reducing the large expense of using rockets. E-MIST is a payload designed to fly on a high-altitude balloon. It can maintain science experiments in a controlled manner at the edge of space. The work covered here entails the integration of hardware onto each of the mentioned systems and the code associated with such work. In particular, the resistance temperature detector, pressure transducers, cameras, and thrusters for Maraia are discussed, as is the integration of the resistance temperature detectors and motor controllers into E-MIST. Several issues associated with sensor accuracy, code lock-up, and in-flight resets are mentioned, along with their implemented and proposed solutions.

  6. A FORTRAN code for the calculation of probe volume geometry changes in a laser anemometry system caused by window refraction

    NASA Technical Reports Server (NTRS)

    Owen, Albert K.

    1987-01-01

    A computer code was written which utilizes ray tracing techniques to predict the changes in position and geometry of a laser Doppler velocimeter probe volume resulting from refraction effects. The code predicts the position change, changes in beam crossing angle, and the amount of uncrossing that occur when the beams traverse a region with a changed index of refraction, such as a glass window. The code calculates the changes for flat plate, cylinder, general axisymmetric and general surface windows and is currently operational on a VAX 8600 computer system.
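
    The core refraction effect the code accounts for can be sketched with Snell's law: a flat window leaves each beam's direction unchanged but displaces it laterally, which shifts the probe volume. A minimal sketch (function name and example values are illustrative, not from the report):

```python
import math

def beam_shift_through_window(theta_deg, thickness, n_glass, n_air=1.0):
    """Lateral displacement of a ray crossing a flat window (Snell's law).

    theta_deg : incidence angle in air (degrees)
    thickness : window thickness (same length unit as the result)
    n_glass   : refractive index of the window
    """
    t1 = math.radians(theta_deg)
    t2 = math.asin(n_air * math.sin(t1) / n_glass)  # refraction at entry face
    # The exit face restores the original direction; only a parallel
    # offset of the beam remains.
    return thickness * math.sin(t1 - t2) / math.cos(t2)

# Two beams crossing at ±5°: each shifts by the same amount, so the
# half-angle is preserved but the crossing point moves along the axis.
d = beam_shift_through_window(theta_deg=5.0, thickness=10.0, n_glass=1.5)
print(f"lateral shift: {d:.3f} mm")
```

    For a curved (cylindrical or axisymmetric) window the two beams refract by different amounts, which is what produces the beam uncrossing the report describes; that case requires full ray tracing rather than this closed form.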

  7. Performance Analysis of MIMO-STBC Systems with Higher Coding Rate Using Adaptive Semiblind Channel Estimation Scheme

    PubMed Central

    Kumar, Ravi

    2014-01-01

    The semiblind channel estimation method provides the best trade-off among bandwidth overhead, computational complexity, and latency. Multiple-input multiple-output (MIMO) systems offer higher data rates and longer transmit range without requiring additional bandwidth or transmit power. This paper presents a detailed analysis of diversity coding techniques using MIMO antenna systems. Different space-time block code (STBC) schemes have been explored and analyzed with the proposed higher code rate. STBCs with higher code rates have been simulated for different modulation schemes in the MATLAB environment, and the results have been compared in the semiblind setting, which shows improvement even with highly correlated antenna arrays and performance very close to the case in which channel state information (CSI) is known at the receiver. PMID:24688379
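
    The baseline that higher-rate STBCs are measured against is the rate-one Alamouti code for two transmit antennas. A minimal sketch of its encoder and linear ML combiner, assuming perfect CSI and a noiseless flat-fading channel (all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def alamouti_encode(s1, s2):
    """Two symbols over two slots on two antennas (rate-one Alamouti STBC)."""
    # rows: time slots, columns: transmit antennas
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

def alamouti_combine(r1, r2, h1, h2):
    """Linear ML combining for a 2x1 channel known at the receiver."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    return s1_hat, s2_hat

# QPSK symbols and a random flat-fading 2x1 channel
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
h = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)
X = alamouti_encode(s1, s2)
r = X @ h                                   # noiseless received samples
e1, e2 = alamouti_combine(r[0], r[1], h[0], h[1])
gain = abs(h[0])**2 + abs(h[1])**2          # diversity gain factor
print(np.allclose(e1 / gain, s1), np.allclose(e2 / gain, s2))  # → True True
```

    The combiner decouples the two symbols exactly, scaled by |h1|² + |h2|², which is the full transmit-diversity gain; semiblind estimation replaces the assumed-perfect h with an estimate.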

  8. Performance analysis of MIMO-STBC systems with higher coding rate using adaptive semiblind channel estimation scheme.

    PubMed

    Kumar, Ravi; Saxena, Rajiv

    2014-01-01

    The semiblind channel estimation method provides the best trade-off among bandwidth overhead, computational complexity, and latency. Multiple-input multiple-output (MIMO) systems offer higher data rates and longer transmit range without requiring additional bandwidth or transmit power. This paper presents a detailed analysis of diversity coding techniques using MIMO antenna systems. Different space-time block code (STBC) schemes have been explored and analyzed with the proposed higher code rate. STBCs with higher code rates have been simulated for different modulation schemes in the MATLAB environment, and the results have been compared in the semiblind setting, which shows improvement even with highly correlated antenna arrays and performance very close to the case in which channel state information (CSI) is known at the receiver.

  9. Hybrid information privacy system: integration of chaotic neural network and RSA coding

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Willey, Jeff; Lee, Ting N.; Szu, Harold H.

    2005-03-01

    Electronic mail is used worldwide, yet most messages are easily intercepted by hackers. In this paper, we propose a free, fast, and convenient hybrid privacy system to protect email communication. The privacy system is implemented by combining the RSA public-key algorithm with a specific chaotic neural network encryption process. The receiver can decrypt a received email as long as it can reproduce the specified chaotic neural network series, the so-called spatial-temporal keys. The chaotic typing and the initial seed value of the chaotic neural network series, encrypted by the RSA algorithm, allow these spatial-temporal keys to be reproduced. The encrypted chaotic typing and initial seed value are hidden in a watermark mixed nonlinearly with the message media and wrapped with convolutional error-correction codes for wireless third-generation cellular phones. The message media can be an arbitrary image. Pattern noise must be considered during transmission, as it could affect or change the spatial-temporal keys. Since any change or modification of the chaotic typing or the initial seed value of the chaotic neural network series is unacceptable, the RSA codec system must be robust and fault-tolerant over the wireless channel. The robustness and fault tolerance of chaotic neural networks (CNN) were proved by a field theory of associative memory by Szu in 1997. The 1-D chaos-generating nodes from the logistic map having arbitrarily negative slope a = p/q, generating the N-shaped sigmoid, were first given by Szu in 1992. In this paper, we simulate the robustness and fault-tolerance properties of CNN under additive noise and pattern noise. We also implement a private version of RSA coding and the chaos encryption process on messages.
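
    The chaotic-key idea can be illustrated with the plain logistic map: two parties who share the map parameters and seed regenerate the same keystream, so the same operation encrypts and decrypts. This is a toy sketch only (parameters arbitrary, no RSA wrapping, not the paper's chaotic neural network):

```python
def logistic_keystream(seed, r, n):
    """Byte keystream from the logistic map x <- r*x*(1-x).

    seed in (0, 1) and r play the role of the shared chaotic key.
    """
    x, out = seed, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)   # quantize the state to one byte
    return bytes(out)

def chaos_xor(data, seed=0.6180339887, r=3.99):
    """XOR stream cipher: the identical call encrypts and decrypts."""
    ks = logistic_keystream(seed, r, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"hybrid privacy"
ct = chaos_xor(msg)
assert chaos_xor(ct) == msg        # the same keys reproduce the keystream
```

    This also makes the paper's fault-tolerance requirement concrete: any perturbation of the seed or of r yields a diverging trajectory and an unrelated keystream, so the key material must arrive intact, hence the RSA wrapping and error-correction coding.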

  10. [Modern principles of integrated diagnostics and rehabilitation of perinatal lesions of the nervous system and their consequences].

    PubMed

    Nemkova, S A

    2017-01-01

    The article is devoted to the comprehensive diagnosis and treatment of perinatal lesions of the nervous system and their consequences in children. It reflects modern approaches to the classification of these conditions, taking into account ideas on the etiology and pathogenesis of the disease and the clinical manifestations of the main syndromes (excitation and depression, hypertensive, convulsive, and movement disorders), both in the neonatal period and in the formation of long-term effects (delayed motor, mental, and speech development, hyperkinetic syndrome, cerebral palsy, and others). Considerable attention is paid to modern principles of diagnosis (clinical, psychometric, instrumental) and comprehensive rehabilitation (medical, social, and psycho-pedagogical) of the effects of perinatal lesions of the nervous system. The article reviews research on the use of the polypeptide nootropic and neurometabolic stimulator cortexin in the complex rehabilitation of perinatal lesions of the nervous system and their consequences in children. It is shown that the use of cortexin in the treatment of critical conditions in newborns reduced the duration of intensive care, the length of stay in the intensive care unit, the average period of hospital treatment, and the duration of the primary stage of neurological rehabilitation by a factor of 2.5-3, and also reduced the frequency of detection of movement-disorder syndromes by 2 times, hypertensive-hydrocephalic disorders by 3 times, and vegetative-visceral dysfunctions by 5 times. Use of cortexin in the rehabilitation of children in the first years of life with the consequences of perinatal CNS lesions indicates a significant improvement in their motor and cognitive functions, as well as pre-speech and speech development. Use of cortexin significantly improved the prognosis for recovery of motor, cognitive, and overall neurological status, with full compensation by the end of the first year of life in 90% of patients, and was accompanied by a decline in

  11. Consequences of the Use of Private Coaching System to Enter Universities: A Study in Sri Lanka

    ERIC Educational Resources Information Center

    Siyambalapitiya, Sarath B.

    2005-01-01

    University admission to professional degree programmes has become very competitive in many countries. While students have to score a high aggregate of marks at the university entrance examinations, other criteria may also have to be met. As a result, additional private coaching outside the normal high school system is being sought by…

  12. Plasticity of the Melanocortin System: Determinants and Possible Consequences on Food Intake.

    PubMed

    Nuzzaci, Danaé; Laderrière, Amélie; Lemoine, Aleth; Nédélec, Emmanuelle; Pénicaud, Luc; Rigault, Caroline; Benani, Alexandre

    2015-01-01

    The melanocortin system is one of the most important neuronal pathways involved in the regulation of food intake and is probably the best characterized. Agouti-related peptide (AgRP) and proopiomelanocortin (POMC) expressing neurons located in the arcuate nucleus of the hypothalamus are the key elements of this system. These two neuronal populations are sensitive to circulating molecules and receive many excitatory and inhibitory inputs from various brain areas. According to the sensory and metabolic information they integrate, these neurons control different aspects of feeding behavior and orchestrate autonomic responses aimed at maintaining energy homeostasis. Interestingly, the composition and abundance of pre-synaptic inputs onto arcuate AgRP and POMC neurons vary in the adult hypothalamus in response to changes in the metabolic state, a phenomenon that can be recapitulated by treatment with hormones such as leptin or ghrelin. As described in other neuroendocrine systems, glia may be instrumental in shifting the synaptic configuration of AgRP and POMC neurons. Here, we discuss the physiological outcome of the synaptic plasticity of the melanocortin system, and more particularly its contribution to the control of energy balance. The discovery of this attribute has changed how we view obesity and related disorders, and opens new perspectives for their management.

  13. Plasticity of the Melanocortin System: Determinants and Possible Consequences on Food Intake

    PubMed Central

    Nuzzaci, Danaé; Laderrière, Amélie; Lemoine, Aleth; Nédélec, Emmanuelle; Pénicaud, Luc; Rigault, Caroline; Benani, Alexandre

    2015-01-01

    The melanocortin system is one of the most important neuronal pathways involved in the regulation of food intake and is probably the best characterized. Agouti-related peptide (AgRP) and proopiomelanocortin (POMC) expressing neurons located in the arcuate nucleus of the hypothalamus are the key elements of this system. These two neuronal populations are sensitive to circulating molecules and receive many excitatory and inhibitory inputs from various brain areas. According to the sensory and metabolic information they integrate, these neurons control different aspects of feeding behavior and orchestrate autonomic responses aimed at maintaining energy homeostasis. Interestingly, the composition and abundance of pre-synaptic inputs onto arcuate AgRP and POMC neurons vary in the adult hypothalamus in response to changes in the metabolic state, a phenomenon that can be recapitulated by treatment with hormones such as leptin or ghrelin. As described in other neuroendocrine systems, glia may be instrumental in shifting the synaptic configuration of AgRP and POMC neurons. Here, we discuss the physiological outcome of the synaptic plasticity of the melanocortin system, and more particularly its contribution to the control of energy balance. The discovery of this attribute has changed how we view obesity and related disorders, and opens new perspectives for their management. PMID:26441833

  14. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services. Monograph

    ERIC Educational Resources Information Center

    Stecher, Brian M.; Camm, Frank; Damberg, Cheryl L.; Hamilton, Laura S.; Mullen, Kathleen J.; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L.

    2010-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little…

  15. Functional consequences of structural differences in stingray sensory systems. Part I: mechanosensory lateral line canals.

    PubMed

    Jordan, Laura K; Kajiura, Stephen M; Gordon, Malcolm S

    2009-10-01

    Short range hydrodynamic and electrosensory signals are important during final stages of prey capture in elasmobranchs (sharks, skates and rays), and may be particularly useful for dorso-ventrally flattened batoids with mouths hidden from their eyes. In stingrays, both the lateral line canal and electrosensory systems are highly modified and complex with significant differences on ventral surfaces that relate to feeding ecology. This study tests functional hypotheses based on quantified differences in sensory system morphology of three stingray species, Urobatis halleri, Myliobatis californica and Pteroplatytrygon violacea. Part I investigates the mechanosensory lateral line canal system whereas part II focuses on the electrosensory system. Stingray lateral line canals include both pored and non-pored sections and differ in branching complexity and distribution. A greater proportion of pored canals and high pore numbers were predicted to correspond to increased response to water flow. Behavioral experiments were performed to compare responses of stingrays to weak water jets mimicking signals produced by potential prey at velocities of 10-20 cm s⁻¹. Bat rays, M. californica, have the most complex and broadly distributed pored canal network and demonstrated both the highest response rate and greater response intensity to water jet signals. Results suggest that U. halleri and P. violacea may rely on additional sensory input, including tactile and visual cues, respectively, to initiate stronger feeding responses. These results suggest that stingray lateral line canal morphology can indicate detection capabilities through responsiveness to weak water jets.

  16. Role of serpentinization in the thermal and connected mineral evolution of planetesimals - evaluating possible consequences for exoplanetary systems

    NASA Astrophysics Data System (ADS)

    Góbi, Sándor; Kereszturi, Ákos

    2017-04-01

    This work gives an overview of the general consequences of serpentinization occurring in the planetesimals of any planetary system. These processes were studied by numerical simulations, and the model used - based on earlier works - was developed by implementing the effect of interfacial water. As liquid water is fundamentally required for serpentinization, previous simulations considered only cases in which the initial temperature inside the planetesimal was above the melting point of ice, thus neglecting the effect of the microscopic water layer completely. However, our results show that it must be taken into account: since it allows the reaction to occur at temperatures even as low as 200 K, at which bulk liquid water is completely absent, it substantially broadens the range of objects in which this alteration can begin. Investigating the effect of changing the initial parameters helps examine serpentinization in more general terms. Consequently, the findings described here are ubiquitous and can be applied to any exoplanetary system, even if the initial conditions differ considerably from those characteristic of our early Solar System. As a first step towards the generalization of such heating processes, we evaluate the role of composition, starting temperature, porosity, and planetesimal size on this heating effect. Besides heat generated by the decay of radioactive nuclei, serpentinization should be considered a 'universal process' in the thermal evolution of planetesimals, and variations of the parameters considered in this model might provide insight into differences between objects in various protoplanetary discs.

  17. Differential Subsidence in Mexico City and its Consequences to the Collective Transport System (Metro)

    NASA Astrophysics Data System (ADS)

    Solano Rojas, D. E.; Wdowinski, S.; Cabral, E.; Zhang, Y.; Torres, Y.

    2015-12-01

    Mexico City is one of the most populous metropolitan areas in the world, with more than 20 million inhabitants. It is located above a sequence of deformable, unconsolidated lacustrine sediments interlayered with strong volcanic rocks. These natural conditions, combined with massive groundwater extraction, have caused the city to subside unevenly, at rates from 0 to ~370 mm/yr, which we term differential subsidence. Our study focuses on the Collective Transport System (Metro), the massive, widely used transportation system of the city, in operation since 1969. The Metro system carries an average of more than four million passengers per day along its 218 km of railways. The system has occasionally been damaged by ground deformation, in particular Line 12, on which 50% of the stations were shut down just 2.5 years after the beginning of operation due to faults, "waves", and "bumps" along the line. In this study we used Interferometric Synthetic Aperture Radar (InSAR) observations to monitor land subsidence throughout the city and infer differential subsidence along the main Metro lines. Our analysis is based on 34 TerraSAR-X and 36 COSMO-SkyMed high-resolution scenes acquired from mid 2011 to mid 2013. The data were processed using the StaMPS InSAR time series technique, which calculates ground displacement time series for more than 2.5 million selected measurement points, typically separated by 3-15 meters. The differential subsidence along the Metro lines was calculated by averaging the subsidence rate within 30 m radius circles every 60 m along the lines. We found that the segments with the most differential deformation are in lines 4, 5, 9, A, B, and 12. Our easy-to-implement method can be applied to permanently monitor deformation along the railways, and can serve as a guide for the development of the new Metro lines planned by the Mexico City government.
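
    The along-line averaging step described above (mean rate within a 30 m radius circle, sampled every 60 m) is simple to sketch. The function below is an illustrative reconstruction with synthetic points, not the authors' processing code:

```python
import numpy as np

def rates_along_line(points_xy, rates, line_xy, spacing=60.0, radius=30.0):
    """Mean subsidence rate within a circle of `radius` around samples
    taken every `spacing` metres along a line (one straight segment here)."""
    p0, p1 = np.asarray(line_xy[0], float), np.asarray(line_xy[1], float)
    length = np.linalg.norm(p1 - p0)
    out = []
    for d in np.arange(0.0, length + 1e-9, spacing):
        c = p0 + (p1 - p0) * d / length            # sample point on the line
        mask = np.linalg.norm(points_xy - c, axis=1) <= radius
        out.append(np.nan if not mask.any() else rates[mask].mean())
    return np.array(out)

# Synthetic InSAR measurement points: subsidence rate grows west to east.
pts = np.column_stack([np.linspace(0, 120, 200), np.zeros(200)])
r = pts[:, 0] * -1.0                               # mm/yr
profile = rates_along_line(pts, r, [(0, 0), (120, 0)])
print(profile)                                     # one value per 60 m sample
```

    Differences between consecutive profile values then give the differential subsidence gradient along the railway, which is the quantity that stresses the track.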

  18. Genomic conflict in scale insects: the causes and consequences of bizarre genetic systems.

    PubMed

    Ross, Laura; Pen, Ido; Shuker, David M

    2010-11-01

    It is now clear that mechanisms of sex determination are extraordinarily labile, with considerable variation across all taxonomic levels. This variation is often expressed through differences in the genetic system (XX-XY, XX-XO, haplodiploidy, and so on). Why there is so much variation in such a seemingly fundamental process has attracted much attention, with recent ideas concentrating on the possible role of genomic conflicts of interest. Here we consider the role of inter- and intra-genomic conflicts in one large insect taxon: the scale insects. Scale insects exhibit a dizzying array of genetic systems, and their biology promotes conflicts of interest over transmission and sex ratio between male- and female-expressed genes, parental- and offspring-expressed genes (both examples of intra-genomic conflict) and between scale insects and their endosymbionts (inter-genomic conflict). We first review the wide range of genetic systems found in scale insects and the possible evolutionary transitions between them. We then outline the theoretical opportunities for genomic conflicts in this group and how these might influence sex determination and sex ratio. We then consider the evidence for these conflicts in the evolution of sex determination in scale insects. Importantly, the evolution of novel genetic systems in scale insects has itself helped create new conflicts of interest, for instance over sex ratio. As a result, a major obstacle to our understanding of the role of conflict in the evolution of sex-determination and genetic systems will be the difficulty in identifying the direction of causal relationships. We conclude by outlining possible experimental and comparative approaches to test more effectively how important genomic conflicts have been.

  19. Space-Frequency Block Code with Matched Rotation for MIMO-OFDM System with Limited Feedback

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Abhayapala, Thushara D.; Jayalath, Dhammika; Smith, David; Athaudage, Chandra

    2009-12-01

    This paper presents a novel matched rotation precoding (MRP) scheme to design a rate-one space-frequency block code (SFBC) and a multirate SFBC for MIMO-OFDM systems with limited feedback. The proposed rate-one MRP and multirate MRP can always achieve full transmit diversity and optimal system performance for an arbitrary number of antennas, subcarrier intervals, and subcarrier groupings, with only limited channel knowledge required at the transmitter. The optimization process of the rate-one MRP is simple and easily visualized, so that the optimal rotation angle can be derived explicitly, or even intuitively in some cases. The multirate MRP has a more complex optimization process, but it has better spectral efficiency and provides a relatively smooth balance between system performance and transmission rate. Simulations show that the proposed SFBCs with MRP can overcome the diversity loss in specific propagation scenarios, consistently improve system performance, and achieve large performance gains. The proposed SFBCs with MRP thus demonstrate the flexibility and feasibility needed for practical MIMO-OFDM systems with dynamic parameters.

  20. Application of wavelets to image coding in an rf-link communication system

    NASA Astrophysics Data System (ADS)

    Liou, C. S. J.; Conners, Gary H.; Muczynski, Joe

    1995-04-01

    The joint University of Rochester/Rochester Institute of Technology `Center for Electronic Imaging Systems' (CEIS) is designed to focus on research problems of interest to industrial sponsors, especially the Rochester Imaging Consortium. Compression of tactical images for transmission over an rf link is an example of this type of research project, carried out in collaboration with one of the CEIS sponsors, Harris Corporation/RF Communications. The Harris digital video imagery transmission system (DVITS) is designed to fulfill the need to transmit secure imagery between unwired locations at real-time rates. DVITS specializes in transmission systems for users who rely on hf equipment operating at the low end of the frequency spectrum. However, the inherently low bandwidth of hf combined with transmission characteristics such as fading and dropout severely restricts the effective throughput. The problem of designing a system such as DVITS is particularly challenging because of bandwidth and signal/noise limitations, and because of the dynamic nature of the operational environment. In this paper, a novel application of wavelets to tactical image coding is proposed to replace the current DCT compression algorithm in the DVITS system. The effects of channel noise on the received image are determined, and various design strategies combining image segmentation, compression, and error correction are described.

  1. Deviations in the endocrine system and brain of patients with fibromyalgia: cause or consequence of pain and associated features?

    PubMed

    Geenen, Rinie; Bijlsma, Johannes W J

    2010-04-01

    The brain and endocrine system are crucial interfaces responding to pathological and psychological processes. This review discusses whether endocrine deviations and structural and functional changes in the brain are a cause or consequence of fibromyalgia. Studies in patients with fibromyalgia virtually uniformly observed subtle alterations in hypothalamic-pituitary-adrenal functioning, hyporeactive autonomic nervous system responsiveness to stressors, and structural and functional changes in the brain. Our model proposes that predisposing factors, such as genetic vulnerability and trauma, have led to an alteration of the nociceptive system including several neuroendocrine changes. The resulting pain and associated symptoms, such as sleep disturbance, low fitness, fatigue, stress, and distress, are a cause of new neuroendocrine changes. The model predicts that favorable neuroendocrine changes are to be expected after successful pharmacological or non-pharmacological interventions that target pain and associated symptoms.

  2. Inventory of Safety-related Codes and Standards for Energy Storage Systems with some Experiences related to Approval and Acceptance

    SciTech Connect

    Conover, David R.

    2014-09-11

    The purpose of this document is to identify laws, rules, model codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), and to relate experiences to date in securing approval of ESS under these CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, revisions to existing CSR, and the supporting research and documentation needed to foster the deployment of safe ESS.

  3. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting and error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. Single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods for constructing double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes that are more efficient than the best previously known codes, and we also present a decoding procedure for them.
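
    The byte-oriented correction such codes perform can be illustrated one level down: with two check bytes over GF(2^8), a single corrupted data byte can be located and repaired from two syndromes. This is a minimal Reed-Solomon-style sketch, not the paper's SbEC-DbED or DBEC construction:

```python
# GF(256) log/antilog tables (primitive polynomial 0x11d, as in AES/RS codes)
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gmul(a, b):
    """Multiplication in GF(256) via the log tables."""
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def encode(data):
    """Append two check bytes: c0 = sum(d_i), c1 = sum(alpha^i * d_i)."""
    c0 = c1 = 0
    for i, d in enumerate(data):
        c0 ^= d
        c1 ^= gmul(EXP[i], d)
    return bytes(data) + bytes([c0, c1])

def decode(word):
    """Correct any single corrupted data byte using the two syndromes."""
    data, (c0, c1) = bytearray(word[:-2]), word[-2:]
    s0, s1 = c0, c1
    for i, d in enumerate(data):
        s0 ^= d
        s1 ^= gmul(EXP[i], d)
    if s0:                                  # nonzero syndrome: locate and fix
        pos = (LOG[s1] - LOG[s0]) % 255     # alpha^pos = s1 / s0
        data[pos] ^= s0                     # the error magnitude is s0
    return bytes(data)

msg = b"memory systems"
cw = bytearray(encode(msg))
cw[3] ^= 0x5A                               # corrupt one data byte
assert decode(bytes(cw)) == msg
```

    A DBEC code extends this idea with more check symbols so that two byte errors can be located and corrected simultaneously; the paper's contribution is doing that with less redundancy than earlier constructions.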

  4. Impact-driven ice loss in outer Solar System satellites: Consequences for the Late Heavy Bombardment

    NASA Astrophysics Data System (ADS)

    Nimmo, F.; Korycansky, D. G.

    2012-05-01

    We use recent hydrodynamical results (Kraus, R.G., Senft, L.G., Stewart, S.S. [2011]. Icarus, 214, 724-738) for the production of water vapor by hypervelocity impacts on ice targets to assess which present-day major satellites of Jupiter, Saturn, and Uranus would have lost mass due to impact vaporization during an era of massive bombardment similar to the Late Heavy Bombardment in the inner Solar System. Using impactor populations suggested by recent work (Charnoz, S., Morbidelli, A., Dones, L., Salmon, J. [2009]. Icarus, 199, 413-428; Barr, A.C., Canup, R.M. [2010]. Nat. Geosci., 3, 164-167), we find that several satellites would have lost all their H2O; we suggest that the most likely resolution of this paradox is that either the LHB delivered ≈10 times less mass to the outer Solar System than predicted by the standard Nice Model, or that the inner satellites formed after the LHB.

  5. Alternative rearing systems in pigs: consequences on stress indicators at slaughter and meat quality.

    PubMed

    Foury, A; Lebret, B; Chevillon, P; Vautier, A; Terlouw, C; Mormède, P

    2011-08-01

    The objective of this study was to evaluate the effects of three alternative (ALT) rearing systems for growing pigs (outdoor: 150 m²/pig; straw bedding: 1.30 m²/pig; and hut with access to a courtyard: 1.30 m²/pig) compared with a conventional system (fully slatted floor: 0.65 m²/pig, considered as control) on pre-slaughter stress indicators in relation to meat quality. To that end, the number of skin lesions on whole carcasses, as well as blood creatine kinase (CK) activity and urinary levels of cortisol and catecholamines (adrenaline and noradrenaline), were determined at slaughter. The glycolytic potential (GP) and ultimate pH of the semimembranosus muscle were also measured. The global correlation network calculated between all these parameters shows that the indicators of pre-slaughter muscle activity (plasma CK) and/or stress (e.g. adrenaline) are negatively (r=-0.26, P<0.01; r=-0.29, P<0.05, respectively) correlated with muscle GP and positively (r=0.17, P<0.05; r=0.44, P<0.001, respectively) with meat ultimate pH. Some of the measured traits were sensitive to the degree of pre-slaughter mixing, and they also differed across rearing systems. The differences were most pronounced for the comparison of outdoor v. slatted floor. The lower levels of plasma CK and urinary catecholamines, and the lower number of carcass skin lesions, of pigs reared outdoors were related to a lower meat ultimate pH. Thus, ALT rearing systems influence animal welfare and meat quality by providing enriched environmental conditions to the animals.

  6. SEARCH AND CHOICE IN TRANSPORT SYSTEMS PLANNING. VOLUME XVIII. PREDICTIVE MODELS FOR VEHICLE OPERATING CONSEQUENCES.

    DTIC Science & Technology

    variables are transformed into useful output - travel time, fuel consumption, and maintenance cost. This output may then be used in evaluating cost and service performance measures, eventually relating technology properly to the broader questions raised in transportation systems analysis. Travel time ... travel time model. Actual power expended is the independent predictive variable, and thus a true causal relationship is used for estimation. Maintenance costs are predicted by a third model. (Author)

  7. Attack-Potential-Based Survivability Modeling for High-Consequence Systems

    DTIC Science & Technology

    2005-03-24

    intrusion-tolerance has been seen as a potential approach to increasing the survivability of an information system. An important part of this approach has ... survivability as simply a small collection of statistics, such as mean time to security breach or mean effort to security breach. More recent approaches ... This approach and the fundamental concept of a purely stochastic model of security or survivability can be problematic for

  8. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of a digital link is essentially independent of the length and operating frequency band of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
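
    The waveform branch of speech coding can be illustrated with μ-law companding, the logarithmic quantization used in G.711 telephony: quiet samples, where speech energy concentrates, get finer quantization steps than loud ones. A minimal sketch (the formula is the standard μ-law curve; the helper names are ours):

```python
import math

MU = 255  # mu-law parameter used in North American G.711 telephony

def mulaw_compress(x, mu=MU):
    """Map a sample in [-1, 1] to [-1, 1] with logarithmic companding."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def mulaw_expand(y, mu=MU):
    """Inverse of mulaw_compress."""
    return math.copysign((math.pow(1 + mu, abs(y)) - 1) / mu, y)

def quantize(y, bits=8):
    """Uniform quantizer applied after companding (waveform coding)."""
    levels = 2 ** (bits - 1)
    return round(y * levels) / levels

x = 0.01                                   # a quiet speech sample
decoded = mulaw_expand(quantize(mulaw_compress(x)))
print(f"error with companding: {abs(decoded - x):.5f}")
```

    Parametric (vocoder) techniques, the other branch named in the abstract, instead transmit model parameters such as pitch and spectral envelope and resynthesize speech at the far end.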

  9. Design and implementation of a Cooke triplet based wave-front coded super-resolution imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wei, Jingxuan

    2015-09-01

    Wave-front coding is a powerful technique that can be used to extend the depth of focus (DOF) of an incoherent imaging system. It is the suitably designed phase mask that makes the system defocus-invariant, and it is the de-convolution algorithm that generates a clear image with large DOF. Compared with a traditional imaging system, the point spread function (PSF) in a wave-front coded imaging system has quite a large support size, and this characteristic makes wave-front coding capable of realizing super-resolution imaging without replacing the current sensor with one of smaller pitch size. An amplification-based single-image super-resolution reconstruction procedure has been specifically designed for the wave-front coded imaging system, and its effectiveness has been demonstrated experimentally. A Cooke triplet based wave-front coded imaging system is established. For a focal length of 50 mm and f-number 4.5, objects within the range [5 m, ∞] can be clearly imaged, which indicates a DOF extension ratio of approximately 20. At the same time, the proposed processing procedure produces at least 3× resolution improvement, with the quality of the reconstructed super-resolution image approaching the diffraction limit.
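
    The defocus-invariance that the phase mask buys can be checked numerically: with a cubic phase term in the pupil, the system's MTF changes far less under defocus than that of a plain aperture. This is a generic cubic-mask sketch with arbitrary parameters, not the paper's Cooke-triplet design:

```python
import numpy as np

def psf(defocus, alpha=0.0, n=128):
    """Incoherent PSF of a circular pupil with a defocus aberration and an
    optional cubic phase mask, phase = alpha*(u^3 + v^3)."""
    u = np.linspace(-1.0, 1.0, n)
    U, V = np.meshgrid(u, u)
    pupil = ((U**2 + V**2) <= 1.0) * np.exp(1j * (defocus * (U**2 + V**2)
                                                  + alpha * (U**3 + V**3)))
    h = np.abs(np.fft.fft2(pupil, s=(4 * n, 4 * n)))**2
    return h / h.sum()

def mtf(defocus, alpha=0.0):
    """Modulation transfer function (a shift-invariant view of the PSF)."""
    return np.abs(np.fft.fft2(psf(defocus, alpha)))

def similarity(a, b):
    """Normalized correlation between two MTFs."""
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

plain = similarity(mtf(0.0), mtf(10.0))              # conventional pupil
coded = similarity(mtf(0.0, 60.0), mtf(10.0, 60.0))  # with cubic phase mask
print(f"MTF similarity under 10 rad defocus: plain={plain:.2f} coded={coded:.2f}")
```

    With these (arbitrary) parameters the coded similarity should exceed the plain one, which is exactly the property that lets a single de-convolution filter restore sharp images over the extended DOF.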

  10. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services.

    PubMed

    Stecher, Brian M; Camm, Frank; Damberg, Cheryl L; Hamilton, Laura S; Mullen, Kathleen J; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L; Leuschner, Kristin J; Camm, Frank; Stecher, Brian M

    2012-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little is known about PBAS effectiveness at achieving performance goals or about government and agency experiences. This study examines nine PBASs that are drawn from five sectors: child care, education, health care, public health emergency preparedness, and transportation. In the right circumstances, a PBAS can be an effective strategy for improving service delivery. Optimum circumstances include having a widely shared goal, unambiguous observable measures, meaningful incentives for those with control over the relevant inputs and processes, few competing interests, and adequate resources to design, implement, and operate the PBAS. However, these conditions are rarely fully realized, so it is difficult to design and implement PBASs that are uniformly effective. PBASs represent a promising policy option for improving the quality of service-delivery activities in many contexts. The evidence supports continued experimentation with and adoption of this approach in appropriate circumstances. Even so, PBAS design and its prospects for success depend on the context in which it will operate. Also, ongoing system evaluation and monitoring are integral components of a PBAS; they inform refinements that improve system functioning over time. Empirical evidence of the effects of performance-based public management is scarce. This article also describes a framework used to evaluate a PBAS. Such a system identifies individuals or organizations that must change their behavior for the performance of an activity to improve, chooses an implicit or explicit incentive structure to motivate these organizations or individuals to change, and then

  11. Cellular and circuit properties supporting different sensory coding strategies in electric fish and other systems.

    PubMed

    Marsat, Gary; Longtin, André; Maler, Leonard

    2012-08-01

    Neural codes often seem tailored to the type of information they must carry. Here we contrast the encoding strategies for two different communication signals in electric fish and describe the underlying cellular and network properties that implement them. We compare an aggressive signal that needs to be quickly detected to a courtship signal whose quality needs to be evaluated. The aggressive signal is encoded by synchronized bursts, and a predictive feedback input is crucial in separating background noise from the communication signal. The courtship signal is accurately encoded through a heterogeneous population response allowing the discrimination of signal differences. Most importantly, we show that the same strategies are used in other systems, arguing that they evolved similar solutions because they faced similar tasks.

  12. Alterations in hypothalamic KiSS-1 system in experimental diabetes: early changes and functional consequences.

    PubMed

    Castellano, J M; Navarro, V M; Roa, J; Pineda, R; Sánchez-Garrido, M A; García-Galiano, D; Vigo, E; Dieguez, C; Aguilar, E; Pinilla, L; Tena-Sempere, M

    2009-02-01

    Using long-term streptozotocin (STZ)-treated male rats, we recently proposed that defective function of hypothalamic KiSS-1 system is mechanistically relevant for central hypogonadotropism of uncontrolled diabetes. However, the temporal pattern of such defects and its potential contribution to disturbed gonadotropin secretion in the diabetic female remain so far unexplored. To cover these issues, expression analyses and hormonal tests were conducted in diabetic male (1 wk after STZ; short term) and female (4 wk after STZ; long term) rats. Short-term diabetic males had lower basal testosterone levels and decreased gonadotropin responses to orchidectomy (ORX), which associated with significantly attenuated post-ORX rises of hypothalamic KiSS-1 mRNA. Yet kisspeptin administration to diabetic males was able to acutely elicit supramaximal LH and testosterone responses and normalize post-ORX gonadotropin secretion. Long-term diabetic females showed persistent anestrus and significantly decreased basal gonadotropin levels as well as blunted LH responses to ovariectomy; changes that were linked to lowering of basal and postovariectomy expression of hypothalamic KiSS-1 mRNA. Moreover, despite prevailing gonadotropin suppression, LH responses to acute kisspeptin administration were fully preserved, and even enhanced after its repeated injection, in diabetic females. In sum, our present findings further define the temporal course and mechanistic relevance of altered hypothalamic KiSS-1 system in the hypogonadotropic state of uncontrolled diabetes. Furthermore, our data provide the basis for the potential therapeutic intervention of the KiSS-1 system as adjuvant in the management of disturbed gonadotropin secretion of type 1 diabetes in the female.

  13. Modeling activities in air traffic control systems: antecedents and consequences of a mid-air collision.

    PubMed

    de Carvalho, Paulo Victor R; Ferreira, Bemildo

    2012-01-01

    In this article, we present a model of some functions and activities of the Brazilian Air Traffic Control System (ATS) in the period in which a mid-air collision occurred between flight GLO1907, a commercial Boeing 737-800 aircraft, and flight N600XL, an executive EMBRAER E-145 jet, in order to investigate key resilience characteristics of the ATM. Modeling activities during the collision in some detail, and relating them to overall behavior and to the antecedents that stressed the organization, uncovers drift-into-failure mechanisms that eroded the safety defenses provided by the Air Navigation Service Provider (ANSP), enabling a mid-air collision to happen.

  14. Long-term consequences of drugs on the paediatric cardiovascular system.

    PubMed

    Hausner, Elizabeth; Fiszman, Monica L; Hanig, Joseph; Harlow, Patricia; Zornberg, Gwen; Sobel, Solomon

    2008-01-01

    Many pharmacological and toxicological actions of drugs in children cannot be fully predicted from adult clinical experience or from standard non-clinical toxicology studies. Numerous drugs have direct or indirect pharmacological effects on the heart and are prescribed for children of all ages. Toxicity or secondary effects may be immediate or delayed for years after drug exposure has ceased. Originally, the aim of this review was to compile information on the effect of specific drugs on the post-natal development of the cardiovascular system and to examine long-term follow-up of the use of cardio-active drugs in children. The limited database of published information caused the original question to evolve into an examination of the medical literature for three areas of information: (i) whether vulnerable developmental windows have been identified that reflect the substantial functional development that the cardiovascular system undergoes after birth; (ii) what is known about pharmacological perturbation of development; and (iii) what the likelihood is of drug exposure during childhood. We examined different scenarios for exposure including random, isolated exposure, conditions historically associated with adults, primary or secondary cardiac disease, psychiatric and neurological conditions, asthma, cancer and HIV. Except for random, isolated drug exposures, each category of possible exposure contained numerous drugs known to have either primary or secondary effects on the cardiovascular system or to influence factors associated with atherosclerosis. It is likely that a significant number of children will be prescribed drugs having either direct or indirect effects upon the immature cardiovascular system. A confounding factor is the simultaneous use of over-the-counter medications and herbal or nutraceutical preparations that a patient, parent or guardian does not mention to a prescribing physician. Metabolism is also important in assessing drug effects in children.

  15. Misuse of Child Restraint Systems in Crash Situations - Danger and Possible Consequences

    PubMed Central

    Lesire, Philippe; Cuny, Sophie; Alonzo, François; Cataldi, Manuela

    2007-01-01

    Based on real-world crash data and recent field studies, an ad-hoc group was set up in order to have a better comprehension of the effects of misuse of Child Restraint Systems (CRS) on child protection. A testing programme of 60 single misuse situations was conducted. Test results confirmed that, in frontal impact, children have higher risk of being injured on a number of different body regions when CRS’s are misused. This work provides material for educational and training purposes to help parents understand that child restraints need to be correctly fitted in order to provide the level of protection they are designed for. PMID:18184494

  16. Climate-induced tree mortality: earth system consequences for carbon, energy, and water exchanges

    NASA Astrophysics Data System (ADS)

    Adams, H. D.; Macalady, A.; Breshears, D. D.; Allen, C. D.; Luce, C.; Royer, P. D.; Huxman, T. E.

    2010-12-01

    One of the greatest uncertainties in global environmental change is predicting changes in feedbacks between the biosphere and atmosphere that could present hazards to current earth system function. Terrestrial ecosystems, and in particular forests, exert strong controls on the global carbon cycle and influence regional hydrology and climatology directly through water and surface energy budgets. Widespread, rapid, drought- and infestation-triggered tree mortality is now emerging as a phenomenon affecting forests globally and may be linked to increasing temperatures and drought frequency and severity. We demonstrate the link between climate-sensitive tree mortality and risks of altered earth system function through carbon, water, and energy exchange. Tree mortality causes a loss of carbon stocks from an ecosystem and a reduction in its sequestration capacity. Recent research has shown that the 2000s pinyon pine die-off in the southwest US caused the loss of 4.6 Tg of aboveground carbon stocks from the region in 5 years, far exceeding carbon loss from other disturbances. Widespread tree mortality in British Columbia resulted in the loss of 270 Tg of carbon, shifting affected forestland from a carbon sink to a source, and influenced Canadian forest policy on carbon stocks. Tree mortality, as an immediate loss of live tree cover, directly alters albedo, near-ground solar radiation, and the relative contributions of evaporation and transpiration to total evapotranspiration. Near-ground solar radiation, an important ecosystem trait affecting soil heating and water availability, increased regionally following the pinyon pine die-off. Conversely, forest canopy loss with tree mortality is expected to increase regional albedo, especially for forests that experience winter snow cover, potentially offsetting the climate forcing of terrestrial carbon releases to the atmosphere. The initial hydrological response to die-off is likely a reduction in evapotranspiration, which can increase

  17. PC-based PCM (Pulse Code Modulation) telemetry data reduction system hardware

    SciTech Connect

    Simms, D.A.; Butterfield, C.P.

    1990-02-01

    The Solar Energy Research Institute's (SERI) Wind Research Program is using pulse code modulation (PCM) telemetry systems to study horizontal-axis wind turbines. SERI has developed a low-cost PC-based PCM data acquisition system to facilitate quick PCM data analysis in the field. The SERI PC-PCM system consists of AT-compatible hardware boards for decoding and combining PCM data streams and DOS software for control and management of data acquisition. Up to four boards can be installed in a single PC, providing the capability to combine data from four PCM streams directly to disk or memory. This paper describes the SERI PC-PCM system hardware, focusing on the practicality of PC-based PCM data reduction. A related paper highlights our comprehensive PCM data management software program, which can be used in conjunction with this hardware to provide full "quick-look" data processing and display. The PC-PCM hardware boards support a subset of the Inter-Range Instrumentation Group (IRIG) PCM standard, designed to synchronize and decommutate NRZ or Bi-Phase L PCM streams in the range of 1 to 800 kbits/sec at 8 to 12 bits per word and 2 to 64 words per frame. Multiple PCM streams (at various rates) can be combined and interleaved into a contiguous digital time series. Maximum data throughput depends on characteristics of the PC hardware, such as CPU rate and disk access speed. 7 refs., 6 figs., 4 tabs.
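    Decommutation of a word-synchronized PCM stream can be sketched as follows; the single-word sync and the frame layout here are illustrative assumptions, since real IRIG formats use multi-word sync patterns to avoid false locks on data words:

```python
def decommutate(stream, sync, words_per_frame):
    # Walk the word stream; when the frame-sync word is seen, emit the
    # following data words as one frame, otherwise slide forward one word
    # to re-acquire sync after a dropout.
    frames = []
    i = 0
    while i + words_per_frame <= len(stream):
        if stream[i] == sync:
            frames.append(stream[i + 1:i + words_per_frame])
            i += words_per_frame
        else:
            i += 1
    return frames
```

    Interleaving several streams, as the boards do, then amounts to merging the per-stream frame lists by timestamp into one contiguous time series.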

  18. Annual Live Code Tsunami Warning System tests improve EAS services in Alaska

    NASA Astrophysics Data System (ADS)

    Preller, C. C.; Albanese, S.; Grueber, M.; Osiensky, J. M.; Curtis, J. C.

    2014-12-01

    The National Weather Service, in partnership with the State of Alaska Division of Homeland Security and Emergency Management (DHSEM) and the Alaska Broadcasters Association (ABA), has made tremendous improvements to Alaska's Emergency Alert System (EAS) with the use of an annual live code tsunami system test. The annual test has been conducted since 2007 during the third week of March, commemorating the Great Alaska Earthquake of 1964 and promoting Tsunami Preparedness Week. Due to the antiquity of the hardware, this test had always been conducted state-wide, which resulted in over-warning large areas of the largest state that face no tsunami risk. The philosophy was that through over-warning, the most rural high-risk areas would be warned. In 2012, the State of Alaska upgraded its dissemination hardware, and the NWS was able to limit the test to a regional area, eliminating most of the unthreatened areas from the test. While this brought several great successes, it also exposed a myriad of unknown problems and challenges. In addition, the NWS and the State of Alaska, with support from the National Tsunami Hazard Mitigation Program (NTHMP), have engaged in an aggressive education, outreach, and mitigation campaign with the Emergency Managers of Alaska's high-risk coastal communities. The result has been a tight team of local Emergency Managers, State Emergency Managers and the Emergency Operations Center, the NWS' National Tsunami Warning Center, NWS' Weather Forecast Offices and Regional Managers, and Alaska's broadcasters coming together as a dynamic and creative problem-solving force. This poster will address the leaps of progress as well as the upcoming hurdles. Ultimately, live code testing is improving how we warn and save lives and property during the shortest-fuse disaster this planet offers: the tsunami.

  19. Unintended Pregnancy and Its Adverse Social and Economic Consequences on Health System: A Narrative Review Article

    PubMed Central

    YAZDKHASTI, Mansureh; POURREZA, Abolghasem; PIRAK, Arezoo; ABDI, Fatemeh

    2015-01-01

    Abstract Unintended pregnancy is among the most troubling public health problems and a major reproductive health issue worldwide, imposing an appreciable socioeconomic burden on individuals and society. Governments generally plan to control the growth of births, especially unwanted births as well as orphans and illegitimate births, which impose an extra burden on public funding and inevitably affect economic efficiency, leading to economic slowdown. The present narrative review focuses on the socioeconomic impacts of unintended pregnancy from the health system perspective. Through computerized searches of various databases, including PubMed, EMBASE, ISI, Iranian databases, IPPE and UNFPA (1985-2013), 53 scientific journal articles were found. Original articles, review articles and published books related to the purpose of the paper were used. During this search, 20 studies were found which met the inclusion criteria. Unintended pregnancy is one of the most critical challenges facing the public health system, imposing substantial financial and social costs on society. On the other hand, by affecting fertility indicators, it causes reduced quality of life and workforce efficiency. Therefore, lowering the incidence of unintended pregnancies correlates with elevating economic growth, socio-economic development and public health. Recent policy changes in Iran on family planning programs, adopting a new approach to increasing the population, may place the country at a higher risk of an increasing rate of unintended pregnancy. Hence, all governmental plans and initiatives of public policy must be regulated intelligently and logically, aiming to make savings in public spending and reduce healthcare cost inflation. PMID:26060771

  20. Sympathetic‐mediated activation versus suppression of the immune system: consequences for hypertension

    PubMed Central

    Case, Adam J.

    2016-01-01

    Abstract It is generally well‐accepted that the immune system is a significant contributor in the pathogenesis of hypertension. Specifically, activated and pro‐inflammatory T‐lymphocytes located primarily in the vasculature and kidneys appear to have a causal role in exacerbating elevated blood pressure. It has been proposed that increased sympathetic nerve activity and noradrenaline outflow associated with hypertension may be primary contributors to the initial activation of the immune system early in the disease progression. However, it has been repeatedly demonstrated in many different human and experimental diseases that sympathoexcitation is immunosuppressive in nature. Moreover, human hypertensive patients have demonstrated increased susceptibility to secondary immune insults like infections. Thus, it is plausible, and perhaps even likely, that in diseases like hypertension, specific immune cells are activated by increased noradrenaline, while others are in fact suppressed. We propose a model in which this differential regulation is based upon activation status of the immune cell as well as the resident organ. With this, the concept of global immunosuppression is obfuscated as a viable target for hypertension treatment, and we put forth the concept of focused organ‐specific immunotherapy as an alternative option. PMID:26830047

  1. Unintended greenhouse gas consequences of lowering level of service in urban transit systems

    NASA Astrophysics Data System (ADS)

    Griswold, Julia B.; Cheng, Han; Madanat, Samer; Horvath, Arpad

    2014-12-01

    Public transit is often touted as a ‘green’ transportation option and a way for users to reduce their environmental footprint by avoiding automobile emissions, but that may not be the case when systems run well below passenger capacity. In previous work, we explored an approach to optimizing the design and operations of transit systems for both costs and emissions, using continuum approximation models and assuming fixed demand. In this letter, we expand upon our previous work to explore how the level of service for users impacts emissions. We incorporate travel time elasticities into the optimization to account for demand shifts from transit to cars resulting from increases in transit travel time. We find that emissions reductions are moderated, but not eliminated, for relatively inelastic users. We consider two scenarios: the first is where only the agency faces an emissions budget; the second is where the entire city faces an emissions budget. In the latter scenario, the emissions reductions resulting from reductions in transit level of service are mitigated as users switch to automobiles.
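    The letter's continuum approximation models are not reproduced here; the role of a travel time elasticity can be illustrated with a constant-elasticity demand function, an assumed functional form rather than necessarily the authors' specification:

```python
def transit_demand(q0, t0, t_new, elasticity):
    # Constant-elasticity demand response: ridership after transit travel
    # time changes from t0 to t_new. The elasticity is negative, so longer
    # travel times shift riders away from transit (toward automobiles).
    return q0 * (t_new / t0) ** elasticity
```

    For example, with an elasticity of -0.5, a 10% increase in travel time (30 to 33 minutes) reduces ridership by roughly 4.7%; the riders lost to cars offset part of the emissions saved by running less service.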

  2. Performance of an adaptive coding scheme in a fixed wireless cellular system working in millimeter-wave bands

    NASA Astrophysics Data System (ADS)

    Farahvash, Shayan; Akhavan, Koorosh; Kavehrad, Mohsen

    1999-12-01

    This paper presents a solution to the problem of providing bit-error-rate performance guarantees in a fixed millimeter-wave wireless system, such as a local multi-point distribution system in line-of-sight or nearly line-of-sight applications. The basic concept is to take advantage of the slow-fading behavior of the fixed wireless channel by changing the transmission code rate. Rate compatible punctured convolutional (RCPC) codes are used to implement adaptive coding. Cochannel interference analysis is carried out for the downlink direction, from base station to subscriber premises. Cochannel interference is treated as a noise-like random process with a power equal to the sum of the power from a finite number of interfering base stations. Two different cellular architectures, based on using single or dual polarizations, are investigated. The average spectral efficiency of the proposed adaptive rate system is found to be at least 3 times larger than that of a fixed rate system with similar outage requirements.
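    The paper's actual rate set and BER thresholds are not given in the abstract; a hedged sketch of the rate-adaptation logic, with hypothetical SNR thresholds per punctured rate, might look like:

```python
# Hypothetical SNR thresholds (dB) needed to hold a target BER at each
# punctured rate; real values would come from BER curves of the specific
# RCPC code family used.
RATE_TABLE = [
    (8 / 9, 12.0),  # least redundancy, highest throughput
    (4 / 5, 10.0),
    (2 / 3, 8.0),
    (1 / 2, 6.0),   # mother code, most robust
]

def select_rate(snr_db):
    # Pick the highest-throughput rate whose SNR requirement the slow-fading
    # channel currently satisfies; fall back to the mother code otherwise.
    for rate, required_snr in RATE_TABLE:
        if snr_db >= required_snr:
            return rate
    return RATE_TABLE[-1][0]
```

    Rate compatibility matters here: each higher rate is obtained by puncturing the same mother code, so the receiver can switch rates without changing its decoder.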

  3. Medical Consequences of Chernobyl with Focus on the Endocrine System - Part 2.

    PubMed

    Foley, Thomas P; Límanová, Zdeňka; Potluková, Eliška

    2015-01-01

    In the last 70 years, atomic disasters have occurred several times. The nuclear power plant accident at Chernobyl in 1986 in North-Central Ukraine was a unique experience in population exposures to radiation by all ages, and ongoing studies have brought a large amount of information on effects of radiation on human organism. Concerning the deteriorating global security situation and the strong rhetoric of some of the world leaders, the knowledge on the biological effects of ionizing radiation and the preventive measures designed to decrease the detrimental effects of radiation gains a new dimension, and involves all of us. This review focuses on the long-term effects of Chernobyl catastrophe especially on the endocrine system in children and in adults, and includes a summary of preventive measures in case of an atomic disaster.

  4. Medical consequences of Chernobyl with focus on the endocrine system: Part 1.

    PubMed

    Foley, Thomas P; Límanová, Zdeňka; Potluková, Eliška

    2015-01-01

    In the last 70 years, atomic disasters have occurred several times. The nuclear power plant accident at Chernobyl in 1986 in North-Central Ukraine was a unique experience in population exposures to radiation by all ages, and ongoing studies have brought a large amount of information on effects of radiation on human organism. Concerning the deteriorating global security situation and the strong rhetoric of some of the world leaders, the knowledge on the biological effects of ionizing radiation and the preventive measures designed to decrease the detrimental effects of radiation gains a new dimension, and involves all of us. This review focuses on the long-term effects of Chernobyl catastrophe especially on the endocrine system in children and in adults, and includes a summary of preventive measures in case of an atomic disaster.

  5. Detonator cable initiation system safety investigation: Consequences of energizing the detonator and actuator cables

    SciTech Connect

    Osher, J.; Chau, H.; Von Holle, W.

    1994-03-01

    This study was performed to explore and assess the worst-case response of a W89-type weapons system, damaged so as to expose detonator and/or detonator safing strong link (DSSL) cables to the most extreme, credible lightning-discharge, environment. The test program used extremely high-current-level, fast-rise-time (1- to 2-{mu}s) discharges to simulate lightning strikes to either the exposed detonator or DSSL cables. Discharges with peak currents above 700 kA were required to explode test sections of detonator cable and launch a flyer fast enough potentially to detonate weapon high explosive (HE). Detonator-safing-strong-link (DSSL) cables were exploded in direct contact with hot LX-17 and Ultrafine TATB (UFTATB). At maximum charging voltage, the discharge system associated with the HE firing chamber exploded the cables at more than 600-kA peak current; however, neither LX-17 nor UFTATB detonated at 250{degree}C. Tests showed that intense surface arc discharges of more than 700 kA/cm in width across the surface of hot UFTATB [generally the more sensitive of the two insensitive high explosives (IHE)] could not initiate this hot IHE. As an extension to this study, we applied the same technique to test sections of the much-narrower but thicker-cover-layer W87 detonator cable. These tests were performed at the same initial stored electrical energy as that used for the W89 study. Because of the narrower cable conductor in the W87 cables, discharges greater than 550-kA peak current were sufficient to explode the cable and launch a fast flyer. In summary, we found that lightning strikes to exposed DSSL cables cannot directly detonate LX-17 or UFTATB even at high temperatures, and they pose no HE safety threat.

  6. Performance of the OVERFLOW-MLP and LAURA-MLP CFD Codes on the NASA Ames 512 CPU Origin System

    NASA Technical Reports Server (NTRS)

    Taft, James R.

    2000-01-01

    The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256 CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16 CPU C90 system. All of this was achieved without any major modification to the original vector-based code. The OVERFLOW-MLP code is now in production on the in-house Origin systems as well as being used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512 CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16 CPU C90. At this rate, expected workloads would require over 100 C90 CPU years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community. Dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles and large simulations of highly resolved full

  7. Biofilm streamers cause catastrophic disruption of flow with consequences for environmental and medical systems

    PubMed Central

    Drescher, Knut; Shen, Yi; Bassler, Bonnie L.; Stone, Howard A.

    2013-01-01

    Biofilms are antibiotic-resistant, sessile bacterial communities that occupy most moist surfaces on Earth and cause chronic and medical device-associated infections. Despite their importance, basic information about biofilm dynamics in common ecological environments is lacking. Here, we demonstrate that flow through soil-like porous materials, industrial filters, and medical stents dramatically modifies the morphology of Pseudomonas aeruginosa biofilms to form 3D streamers, which, over time, bridge the spaces between obstacles and corners in nonuniform environments. We discovered that accumulation of surface-attached biofilm has little effect on flow through such environments, whereas biofilm streamers cause sudden and rapid clogging. We demonstrate that flow-induced shedding of extracellular matrix from surface-attached biofilms generates a sieve-like network that captures cells and other biomass, which add to the existing network, causing exponentially fast clogging independent of growth. These results suggest that biofilm streamers are ubiquitous in nature and strongly affect flow through porous materials in environmental, industrial, and medical systems. PMID:23401501

  8. Korean Survivors of the Japanese "Comfort Women" System: Understanding the Lifelong Consequences of Early Life Trauma.

    PubMed

    Park, Jee Hoon; Lee, KyongWeon; Hand, Michelle D; Anderson, Keith A; Schleitwiler, Tess E

    2016-01-01

    Prior to and during World War II, thousands of girls and young women were abducted from Korea and forced into sexual slavery by the Japanese government. Termed comfort women, these girls and young women suffered extreme sexual, physical, and emotional abuse and trauma. Research on this group is not well-developed and people know little of the impact of this early life trauma on the lives of these women who are now in later life. Using snowball sampling, 16 older adult survivors of the comfort women system participated in semistructured qualitative interviews. Thematic analysis was conducted to gain an understanding of the trauma that these women suffered and how it impacted their lives. Results revealed the depths of the abuse these women suffered, including repeated rapes, physical beatings, humiliation, forced surgery and sterilization, and social exclusion. These early traumatic experiences appeared to reverberate throughout their lives in their family relations, their inability to marry and to conceive children, and their emotional and physical well-being throughout the life course and into later life. The experiences of these survivors illustrate the lasting impact of early-life trauma and can guide interventions with current survivors of sexual abuse or trafficking.

  9. Exploring a QoS driven scheduling approach for peer-to-peer live streaming systems with network coding.

    PubMed

    Cui, Laizhong; Lu, Nan; Chen, Fu

    2014-01-01

    Most large-scale peer-to-peer (P2P) live streaming systems use a mesh to organize peers and leverage pull scheduling to transmit packets, providing robustness in dynamic environments. Pull scheduling, however, brings large packet delay. Network coding makes push scheduling feasible in mesh P2P live streaming and improves its efficiency, but it may also introduce some extra delay and coding computational overhead. To improve packet delay, streaming quality, and coding overhead, in this paper we propose a QoS driven push scheduling approach. The main contributions of this paper are as follows: (i) we introduce a new network coding method to increase the content diversity and reduce the complexity of scheduling; (ii) we formulate the push scheduling as an optimization problem and transform it to a min-cost flow problem, solving it in polynomial time; (iii) we propose a push scheduling algorithm to reduce the coding overhead and conduct extensive experiments to validate the effectiveness of our approach. Compared with previous approaches, the simulation results demonstrate that the packet delay, continuity index, and coding ratio of our system can be significantly improved, especially in dynamic environments.
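    The paper's specific coding method is not described in the abstract. As a generic illustration of network coding in streaming, random linear network coding over GF(2) encodes packets as XOR combinations and decodes by Gaussian elimination; the sketch below is such an illustration, with coefficient vectors in the test chosen to be linearly independent:

```python
def combine(coeffs, packets):
    # GF(2) linear combination: XOR together the packets whose coefficient is 1.
    out = bytearray(len(packets[0]))
    for c, p in zip(coeffs, packets):
        if c:
            for i, b in enumerate(p):
                out[i] ^= b
    return bytes(out)

def decode(coded, n):
    # Gaussian elimination over GF(2). `coded` is a list of (coeffs, payload)
    # pairs and must contain n linearly independent combinations, otherwise
    # the pivot search below fails.
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(n):
        pivot = next(r for r in range(col, len(rows)) if rows[r][0][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                for j in range(n):
                    rows[r][0][j] ^= rows[col][0][j]
                for j in range(len(rows[r][1])):
                    rows[r][1][j] ^= rows[col][1][j]
    return [bytes(rows[i][1]) for i in range(n)]
```

    The appeal for push scheduling is that any n independent coded packets suffice to recover the generation, so senders can push without coordinating which exact packet each neighbor is missing.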

  10. Exploring a QoS Driven Scheduling Approach for Peer-to-Peer Live Streaming Systems with Network Coding

    PubMed Central

    Cui, Laizhong; Lu, Nan; Chen, Fu

    2014-01-01

    Most large-scale peer-to-peer (P2P) live streaming systems use a mesh to organize peers and leverage pull scheduling to transmit packets, providing robustness in dynamic environments. Pull scheduling, however, introduces large packet delays. Network coding makes push scheduling feasible in mesh P2P live streaming and improves its efficiency, but it may also introduce extra delays and coding computational overhead. To improve packet delay, streaming quality, and coding overhead, we propose a QoS-driven push scheduling approach. The main contributions of this paper are: (i) we introduce a new network coding method to increase content diversity and reduce the complexity of scheduling; (ii) we formulate push scheduling as an optimization problem and transform it into a min-cost flow problem so that it can be solved in polynomial time; (iii) we propose a push scheduling algorithm to reduce the coding overhead and conduct extensive experiments to validate the effectiveness of our approach. Compared with previous approaches, the simulation results demonstrate that the packet delay, continuity index, and coding ratio of our system can be significantly improved, especially in dynamic environments. PMID:25114968
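    The transformation to a min-cost flow problem is what makes the push schedule computable in polynomial time. As a hedged toy sketch of the underlying idea (illustrative delay costs, and brute-force enumeration standing in for the paper's flow solver), the core decision is assigning coded segments to neighbors' push slots at minimum total expected delay:

    ```python
    import itertools

    # Toy sketch: assign each coded segment to one neighbor's push slot so that
    # total expected delivery delay is minimized.  At this scale we simply
    # enumerate all assignments; the paper instead solves the equivalent
    # min-cost flow problem in polynomial time.  All costs are invented.

    def best_push_assignment(delay):
        """delay[s][n] = expected delay if segment s is pushed to neighbor n."""
        n_segments = len(delay)
        neighbors = range(len(delay[0]))
        best_cost, best_plan = float("inf"), None
        for perm in itertools.permutations(neighbors, n_segments):
            cost = sum(delay[s][perm[s]] for s in range(n_segments))
            if cost < best_cost:
                best_cost, best_plan = cost, perm
        return best_cost, best_plan

    # 3 segments, 3 neighbors (illustrative delay matrix)
    delay = [[4, 2, 8],
             [6, 4, 3],
             [5, 8, 1]]
    cost, plan = best_push_assignment(delay)
    print(cost, plan)  # minimum total delay and the chosen segment->neighbor plan
    ```

    Brute force is exponential in the number of segments, which is exactly why a polynomial-time min-cost flow reformulation matters at P2P scale.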

  11. Design and fabrication of passive wireless sensor array system using composite coding resonant SAW transducer

    NASA Astrophysics Data System (ADS)

    Li, Ping; Wen, Yumei

    2006-02-01

    This paper presents a novel composite SAW (surface acoustic wave) passive wireless sensor system involving a resonator and a delay line. While the interrogation signal is a sinusoidal burst, the response is a delayed and damped oscillation. The frequency and the delay time of the response are related to the measurand and the coding of the sensor element, respectively. The composite sensor consists of a SAW resonator and a delay line; it combines the advantages of these two devices and can be used as an element of multiple sensors for longer-distance passive wireless measurements. Because the wireless sensing response is weak and transient, the interrogation frequency is designed to be adjustable according to the result of frequency estimation, so that the response with the maximum signal-to-noise ratio, and thus an optimal sensing result, is obtained. In the transceiver setup, a software-controlled DDS (direct digital synthesis) source with high resolution is implemented to track the passive wireless sensor. An isolation switch in the transmitter suppresses the correlation leakage noise after the wireless RF (radio frequency) interrogation signal is switched off. In this paper, the characteristics of the response, the working procedure of the signal processing, sensor temperature test results and the system error analyses are elaborated. A prototype instrument is built. Experimental results show the effectiveness of the instrumentation and the advantages of the composite sensor system.
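    The interrogation frequency is retuned according to an estimate of the sensor's ring-down frequency. A hedged, minimal sketch of such an estimate (zero-crossing counting on a synthetic damped oscillation; the sample rate, resonance frequency, and decay constant are all made up for illustration):

    ```python
    import math

    # Synthesize a decaying SAW-resonator-like response (illustrative values).
    FS = 1_000_000   # sample rate, Hz
    F0 = 43_000      # true ring-down frequency, Hz
    TAU = 0.0005     # amplitude decay time constant, s

    samples = [math.exp(-n / (FS * TAU)) * math.sin(2 * math.pi * F0 * n / FS)
               for n in range(2000)]

    def zero_crossing_frequency(x, fs):
        """Estimate frequency from the number of sign changes in the record."""
        crossings = sum(1 for a, b in zip(x, x[1:]) if a < 0 <= b or b < 0 <= a)
        duration = (len(x) - 1) / fs
        return crossings / (2 * duration)   # two crossings per cycle

    est = zero_crossing_frequency(samples, FS)
    print(round(est))  # close to the 43 kHz resonance
    ```

    A real instrument would use a higher-resolution estimator (e.g., interpolated FFT peak) before commanding the DDS source, but the feedback principle is the same: estimate, then retune toward resonance.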

  12. Advanced modulation technology development for earth station demodulator applications. Coded modulation system development

    NASA Astrophysics Data System (ADS)

    Miller, Susan P.; Kappes, J. Mark; Layer, David H.; Johnson, Peter N.

    1990-04-01

    A jointly optimized coded modulation system is described which was designed, built, and tested by COMSAT Laboratories for NASA LeRC which provides a bandwidth efficiency of 2 bits/s/Hz at an information rate of 160 Mbit/s. A high speed rate 8/9 encoder with a Viterbi decoder and an Octal PSK modem are used to achieve this. The BER performance is approximately 1 dB from the theoretically calculated value for this system at a BER of 5 E-7 under nominal conditions. The system operates in burst mode for downlink applications and tests have demonstrated very little degradation in performance with frequency and level offset. Unique word miss rate measurements were conducted which demonstrate reliable acquisition at low values of Eb/No. Codec self tests have verified the performance of this subsystem in a stand alone mode. The codec is capable of operation at a 200 Mbit/s information rate as demonstrated using a codec test set which introduces noise digitally. The measured performance is within 0.2 dB of the computer simulated predictions. A gate array implementation of the most time critical element of the high speed Viterbi decoder was completed. This gate array add-compare-select chip significantly reduces the power consumption and improves the manufacturability of the decoder. This chip has general application in the implementation of high speed Viterbi decoders.
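    The add-compare-select (ACS) operation implemented in the gate array is the time-critical inner step of any Viterbi decoder. A hedged sketch of one ACS trellis step (the tiny 4-state trellis and metrics below are illustrative, not the rate-8/9 code from the report):

    ```python
    # One add-compare-select step of a Viterbi decoder: for each state,
    # ADD branch metrics to predecessor path metrics, COMPARE the two
    # candidates, and SELECT the survivor.  Trellis and metrics are toy values.

    def acs(path_metrics, predecessors, branch_metrics):
        """predecessors[s]   = the two states that can transition into state s
        branch_metrics[s] = metrics of those two incoming branches
        Returns (new path metrics, surviving predecessor per state)."""
        new_metrics, survivors = [], []
        for preds, branches in zip(predecessors, branch_metrics):
            candidates = [path_metrics[p] + b for p, b in zip(preds, branches)]  # add
            best = min(range(2), key=lambda i: candidates[i])                    # compare
            new_metrics.append(candidates[best])                                 # select
            survivors.append(preds[best])
        return new_metrics, survivors

    pm = [0, 3, 2, 5]                                  # current path metrics
    preds = [(0, 2), (0, 2), (1, 3), (1, 3)]           # 4-state butterfly trellis
    bm = [(1, 4), (3, 0), (2, 2), (0, 1)]              # branch metrics
    new_pm, surv = acs(pm, preds, bm)
    print(new_pm, surv)
    ```

    In hardware the per-state loop runs fully in parallel, which is why a dedicated ACS chip dominates both the throughput and the power budget of a high-speed decoder.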

  13. DOSEXPRT: A bioassay dosimetry code for Martin Marietta Energy Systems, Inc

    SciTech Connect

    Ward, R.C.; Eckerman, K.F.

    1992-04-01

    The bioassay code DOSEXPRT was developed for Martin Marietta Energy Systems, Inc., to provide compliance with Department of Energy (DOE) Order 5480, Chapter 11. DOSEXPRT computes the intake of a radionuclide in any year (considering both acute and chronic intakes) from in vivo measurements of the retained activity and/or measurements of the activity in excreta. The committed effective and organ doses for the intake are computed as well as the effective and organ doses expected to be received in each calendar year out to 50 years beyond the year of intake. The bioassay records used as input for DOSEXPRT are extracted from the Martin Marietta Energy Systems Occupational Health Information System (OHIS). DOSEXPRT implements a set of algorithms with parameters governing the translocation, retention, and excretion of the nuclide contained in data files specific to the nuclide. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent for the intakes in the year. Annual organ and effective doses are computed using additional dose-rate files that contain data on the dose rate at various times following a unit intake. If measurements are presented for more than one assay for a given nuclide, DOSEXPRT estimates the intake by applying weights assigned in the nuclide file for each assay. DOSEXPRT is accessed off the OHIS MENU No. 4 and designed to be run as a batch processor, but can also be run interactively for testing purposes.

  14. DOSEXPRT: A bioassay dosimetry code for Martin Marietta Energy Systems, Inc.

    SciTech Connect

    Ward, R.C.; Eckerman, K.F.

    1992-04-01

    The bioassay code DOSEXPRT was developed for Martin Marietta Energy Systems, Inc., to provide compliance with Department of Energy (DOE) Order 5480, Chapter 11. DOSEXPRT computes the intake of a radionuclide in any year (considering both acute and chronic intakes) from in vivo measurements of the retained activity and/or measurements of the activity in excreta. The committed effective and organ doses for the intake are computed as well as the effective and organ doses expected to be received in each calendar year out to 50 years beyond the year of intake. The bioassay records used as input for DOSEXPRT are extracted from the Martin Marietta Energy Systems Occupational Health Information System (OHIS). DOSEXPRT implements a set of algorithms with parameters governing the translocation, retention, and excretion of the nuclide contained in data files specific to the nuclide. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent for the intakes in the year. Annual organ and effective doses are computed using additional dose-rate files that contain data on the dose rate at various times following a unit intake. If measurements are presented for more than one assay for a given nuclide, DOSEXPRT estimates the intake by applying weights assigned in the nuclide file for each assay. DOSEXPRT is accessed off the OHIS MENU No. 4 and designed to be run as a batch processor, but can also be run interactively for testing purposes.
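    When several assays are available, DOSEXPRT combines them using per-assay weights from the nuclide file. A hedged sketch of that combination step (the retention fractions and weights below are invented for illustration, not values from DOSEXPRT's data files):

    ```python
    # Combine multiple bioassay measurements into one intake estimate.
    # Each assay alone implies intake = measured activity / retention fraction
    # (the predicted activity per unit intake at the measurement time);
    # the combined estimate is the weighted average.  All numbers are invented.

    def weighted_intake(measurements):
        """measurements: list of (activity, retention_fraction, weight)."""
        total_w = sum(w for _, _, w in measurements)
        return sum(w * (a / r) for a, r, w in measurements) / total_w

    assays = [
        (12.0, 0.030, 0.7),    # in vivo count: alone implies ~400 Bq intake
        (0.50, 0.00125, 0.3),  # excreta sample: alone implies ~400 Bq intake
    ]
    print(weighted_intake(assays))  # combined estimate, ~400 Bq
    ```

    The weights let the assessor favor the assay type that is more reliable for a given nuclide (e.g., in vivo counting for photon emitters, urinalysis for alpha emitters).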

  15. Advanced modulation technology development for earth station demodulator applications. Coded modulation system development

    NASA Technical Reports Server (NTRS)

    Miller, Susan P.; Kappes, J. Mark; Layer, David H.; Johnson, Peter N.

    1990-01-01

    A jointly optimized coded modulation system is described which was designed, built, and tested by COMSAT Laboratories for NASA LeRC which provides a bandwidth efficiency of 2 bits/s/Hz at an information rate of 160 Mbit/s. A high speed rate 8/9 encoder with a Viterbi decoder and an Octal PSK modem are used to achieve this. The BER performance is approximately 1 dB from the theoretically calculated value for this system at a BER of 5 E-7 under nominal conditions. The system operates in burst mode for downlink applications and tests have demonstrated very little degradation in performance with frequency and level offset. Unique word miss rate measurements were conducted which demonstrate reliable acquisition at low values of Eb/No. Codec self tests have verified the performance of this subsystem in a stand alone mode. The codec is capable of operation at a 200 Mbit/s information rate as demonstrated using a codec test set which introduces noise digitally. The measured performance is within 0.2 dB of the computer simulated predictions. A gate array implementation of the most time critical element of the high speed Viterbi decoder was completed. This gate array add-compare-select chip significantly reduces the power consumption and improves the manufacturability of the decoder. This chip has general application in the implementation of high speed Viterbi decoders.

  16. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
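    As a hedged illustration of the fundamentals the report surveys, here is a textbook rate-1/2, constraint-length-3 convolutional encoder with the classic (7, 5) octal generators (this is a standard example, not a code taken from the report):

    ```python
    # Rate-1/2 convolutional encoder, constraint length 3, generators
    # g1 = 111 (7 octal) and g2 = 101 (5 octal).  Each input bit is shifted
    # into a 3-bit register; each output bit is the parity of the tapped bits.

    def conv_encode(bits, g1=0b111, g2=0b101):
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & 0b111           # 3-bit shift register
            out.append(bin(state & g1).count("1") % 2)   # parity over g1 taps
            out.append(bin(state & g2).count("1") % 2)   # parity over g2 taps
        return out

    print(conv_encode([1, 0, 1, 1]))  # two output bits per input bit
    ```

    The redundancy (two channel bits per data bit) is what a Viterbi decoder later exploits to correct channel errors.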

  17. Metabolic and hematological consequences of dietary deoxynivalenol interacting with systemic Escherichia coli lipopolysaccharide.

    PubMed

    Bannert, Erik; Tesch, Tanja; Kluess, Jeannette; Frahm, Jana; Kersten, Susanne; Kahlert, Stefan; Renner, Lydia; Rothkötter, Hermann-Josef; Dänicke, Sven

    2015-11-16

    Previous studies have shown that chronic oral deoxynivalenol (DON) exposure modulated Escherichia coli lipopolysaccharide (LPS)-induced systemic inflammation, whereby the liver was suspected to play an important role. Thus, a total of 41 barrows was fed one of two maize-based diets, either a DON-diet (4.59 mg DON/kg feed, n = 19) or a control diet (CON, n = 22). Pigs were equipped with indwelling catheters for pre- or post-hepatic (portal vs. jugular catheter) infusion of either control (0.9% NaCl) or LPS (7.5 µg/kg BW) for 1 h and frequent blood sampling. This design yielded six groups: CON_CONjugular‑CONportal, CON_CONjugular‑LPSportal, CON_LPSjugular‑CONportal, DON_CONjugular‑CONportal, DON_CONjugular‑LPSportal and DON_LPSjugular‑CONportal. Blood samples were analyzed for blood gases, electrolytes, glucose, pH, lactate and red hemogram. The red hemogram and electrolytes were not affected by DON and LPS. DON-feeding solely decreased portal glucose uptake (p < 0.05). LPS decreased partial oxygen pressure (pO₂) overall (p < 0.05) but reduced pCO₂ only in arterial blood; DON had no effect on either. Irrespective of catheter localization, LPS decreased pH and base-excess (p < 0.01), but increased lactate and anion-gap (p < 0.01), indicating an emerging lactic acidosis. Lactic acidosis was more pronounced in the group DON_LPSjugular-CONportal than in CON-fed counterparts (p < 0.05). DON-feeding aggravated the porcine acid-base balance in response to a subsequent immunostimulus depending on its exposure site (pre- or post-hepatic).

  18. Metabolic and Hematological Consequences of Dietary Deoxynivalenol Interacting with Systemic Escherichia coli Lipopolysaccharide

    PubMed Central

    Bannert, Erik; Tesch, Tanja; Kluess, Jeannette; Frahm, Jana; Kersten, Susanne; Kahlert, Stefan; Renner, Lydia; Rothkötter, Hermann-Josef; Dänicke, Sven

    2015-01-01

    Previous studies have shown that chronic oral deoxynivalenol (DON) exposure modulated Escherichia coli lipopolysaccharide (LPS)-induced systemic inflammation, whereby the liver was suspected to play an important role. Thus, a total of 41 barrows was fed one of two maize-based diets, either a DON-diet (4.59 mg DON/kg feed, n = 19) or a control diet (CON, n = 22). Pigs were equipped with indwelling catheters for pre- or post-hepatic (portal vs. jugular catheter) infusion of either control (0.9% NaCl) or LPS (7.5 µg/kg BW) for 1 h and frequent blood sampling. This design yielded six groups: CON_CONjugular-CONportal, CON_CONjugular-LPSportal, CON_LPSjugular-CONportal, DON_CONjugular-CONportal, DON_CONjugular-LPSportal and DON_LPSjugular-CONportal. Blood samples were analyzed for blood gases, electrolytes, glucose, pH, lactate and red hemogram. The red hemogram and electrolytes were not affected by DON and LPS. DON-feeding solely decreased portal glucose uptake (p < 0.05). LPS decreased partial oxygen pressure (pO2) overall (p < 0.05) but reduced pCO2 only in arterial blood; DON had no effect on either. Irrespective of catheter localization, LPS decreased pH and base-excess (p < 0.01), but increased lactate and anion-gap (p < 0.01), indicating an emerging lactic acidosis. Lactic acidosis was more pronounced in the group DON_LPSjugular-CONportal than in CON-fed counterparts (p < 0.05). DON-feeding aggravated the porcine acid-base balance in response to a subsequent immunostimulus depending on its exposure site (pre- or post-hepatic). PMID:26580654
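    The acid-base findings rest on the standard anion-gap calculation: a widened gap together with low pH and rising lactate points to lactic acidosis. A hedged sketch with illustrative concentrations (not the measured pig data):

    ```python
    # Anion gap = measured cations - measured anions, all in mmol/L.
    # A typical reference gap is roughly 8-16 mmol/L (with potassium included);
    # unmeasured anions such as lactate widen it.  Values below are invented.

    def anion_gap(na, k, cl, hco3):
        return (na + k) - (cl + hco3)

    baseline = anion_gap(na=140, k=4.0, cl=105, hco3=24)  # within reference range
    post_lps = anion_gap(na=140, k=4.5, cl=103, hco3=14)  # widened: lactate accumulating
    print(baseline, post_lps)
    ```

    The drop in bicarbonate (base excess) and the rise in the gap move together, which is exactly the pattern the study reports after LPS infusion.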

  19. Systemic mistakes in hand hygiene practice in Ukraine: detection, consequences and ways of elimination

    PubMed Central

    Klymenko, Iryna; Kampf, Günter

    2015-01-01

    Aim: Every year, millions of people around the world suffer from different infectious diseases, a considerable part of which are hospital-acquired infections. WHO considers hand hygiene a priority measure aimed at reducing the level of infection. We evaluated various aspects related to situational behavior and prioritization regarding hand hygiene measures among the healthcare workers of Ukraine. Method: Identification of systemic mistakes in hand hygiene was carried out primarily by direct and indirect observation of the activities of medical and pharmaceutical personnel in their everyday practice as well as during their participation in trainings on routine hand hygiene. Questionnaires were also used to estimate the level of hand hygiene compliance among participants of the study. During this period, 112 training courses and 315 master classes and presentations on proper hand hygiene were conducted. The target audience included healthcare workers of medical centers, clinics, maternity hospitals and healthcare organizations, and staff of pharmacies and pharmaceutical manufacturing enterprises in all regions of Ukraine. A total of 638 respondents took part in an anonymous survey on hand hygiene practice. Results: The most common mistakes were to regard hand washing and hand disinfection as equivalent, to wash hands before performing hand disinfection, to neglect the five moments for hand hygiene, and to ignore hand hygiene before and after wearing protective gloves. Practitioners, medical attendants, and pharmacy and pharmaceutical industry workers highlighted the need for practical and understandable instructions for the various hand hygiene procedures, including clarification of possible technical mistakes. This became the basis for us to create individual master classes on hand hygiene for each cluster of healthcare workers. Conclusions: Changing hand hygiene behavior and attitude is possible by beginning to observe clinical practice and by involving healthcare workers in teaching and training.

  20. Requirements for imaging vulnerable plaque in the coronary artery using a coded aperture imaging system

    NASA Astrophysics Data System (ADS)

    Tozian, Cynthia

    A coded aperture plate was employed on a conventional gamma camera for 3D single photon emission computed tomography (SPECT) imaging on small animal models. The coded aperture design was selected to improve the spatial resolution and decrease the minimum detectable activity (MDA) required to image plaque formation in the APoE (apolipoprotein E) gene deficient mouse model when compared to conventional SPECT techniques. The pattern that was tested was a no-two-holes-touching (NTHT) modified uniformly redundant array (MURA) having 1,920 pinholes. The number of pinholes combined with the thin sintered tungsten plate was designed to increase the efficiency of the imaging modality over conventional gamma camera imaging methods while improving spatial resolution and reducing noise in the image reconstruction. The MDA required to image the vulnerable plaque in a human cardiac-torso mathematical phantom was simulated with a Monte Carlo code and evaluated to determine the optimum plate thickness by receiver operating characteristic (ROC) analysis, yielding the lowest possible MDA and highest area under the curve (AUC). A partial 3D expectation maximization (EM) reconstruction was developed to improve signal-to-noise ratio (SNR), dynamic range, and spatial resolution over the linear correlation method of reconstruction. This improvement was evaluated by imaging a mini hot rod phantom, simulating the dynamic range, and by performing a bone scan of the C-57 control mouse. Results of the experimental and simulated data as well as other plate designs were analyzed for use as a small animal and potentially human cardiac imaging modality for a radiopharmaceutical developed at Bristol-Myers Squibb Medical Imaging Company, North Billerica, MA, for diagnosing vulnerable plaques. If left untreated, these plaques may rupture causing sudden, unexpected coronary occlusion and death. The results of this research indicated that imaging and reconstructing with this new partial 3D algorithm improved
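    The NTHT MURA pattern described above belongs to a well-known family of masks built from quadratic residues modulo a prime. As a hedged illustration (toy prime p = 5, not the 1,920-pinhole plate from the study), a basic MURA mask can be generated as follows:

    ```python
    # Generate a p x p MURA (modified uniformly redundant array) mask from
    # quadratic residues mod p, per the standard construction:
    #   row 0 is closed; column 0 (below row 0) is open; elsewhere a cell is
    #   open iff i and j are both residues or both non-residues.
    # p = 5 is a toy size for illustration only.

    def mura(p):
        """p must be prime.  Returns a p x p list of 0/1 (1 = open pinhole)."""
        residues = {(x * x) % p for x in range(1, p)}
        c = [1 if i in residues else -1 for i in range(p)]
        mask = [[0] * p for _ in range(p)]
        for i in range(p):
            for j in range(p):
                if i == 0:
                    mask[i][j] = 0
                elif j == 0:
                    mask[i][j] = 1
                elif c[i] * c[j] == 1:
                    mask[i][j] = 1
        return mask

    for row in mura(5):
        print(row)
    ```

    An NTHT variant then upsamples this pattern so that no two open cells share an edge, which simplifies fabrication of the sintered tungsten plate and reduces collimation cross-talk.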