Sample records for consequence code system

  1. Methods for nuclear air-cleaning-system accident-consequence assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  2. The Therapy Process Observational Coding System for Child Psychotherapy Strategies Scale

    ERIC Educational Resources Information Center

    McLeod, Bryce D.; Weisz, John R.

    2010-01-01

    Most everyday child and adolescent psychotherapy does not follow manuals that document the procedures. Consequently, usual clinical care has remained poorly understood and rarely studied. The Therapy Process Observational Coding System for Child Psychotherapy-Strategies scale (TPOCS-S) is an observational measure of youth psychotherapy procedures…

  3. Evaluation of severe accident risks: Quantification of major input parameters: MAACS (MELCOR Accident Consequence Code System) input

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprung, J.L.; Jow, H-N; Rollstin, J.A.

    1990-12-01

    Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.

  4. Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.

    PubMed

    Padula, William V; McQueen, Robert Brett; Pronovost, Peter J

    2017-11-01

    The Second Panel on Cost-Effectiveness in Health and Medicine recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open source code interface for economic models. The possibility of making economic model source code openly available could be positive and progressive for the field; however, several unintended consequences of such a system should first be considered before its complete implementation. First, there is the concern regarding the intellectual property rights that modelers have to their analyses. Second, the open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of the code's use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming toward the teaching of cost-effectiveness analysis in medical and health services education, so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These types of unintended consequences need to be fully considered before the field moves forward into an era of model transparency with open source code.

  5. Ultra Safe And Secure Blasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, M M

    2009-07-27

    The Ultra is a blasting system that is designed for special applications where the risk and consequences of unauthorized demolition or blasting are so great that the use of an extraordinarily safe and secure blasting system is justified. Such a blasting system would be connected and logically welded together through digital code-linking as part of the blasting system set-up and initialization process. The Ultra's security is so robust that it will defeat even the people who designed and built the components in any attempt at unauthorized detonation. Anyone attempting to gain unauthorized control of the system by substituting components or tapping into communications lines will be thwarted by their inability to provide encrypted authentication. Authentication occurs through the use of codes that are generated by the system during initialization code-linking, and the codes remain unknown to anyone, including the authorized operator. Once code-linked, a closed system has been created. To function and blast, the system requires all components to be connected as they were during initialization, as well as a unique code entered by the operator.
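
    As an illustration of the code-linking idea described above, the following Python sketch shows one way such authentication could work in principle: a link key generated at initialization, never exposed to an operator, and message authentication codes that a substituted component or tapped line cannot reproduce. This is a hypothetical, simplified stand-in written for this summary, not the Ultra's actual protocol.

      import hmac, hashlib, secrets

      class LinkedComponent:
          """Hypothetical blasting-system component that only obeys authenticated commands."""

          def __init__(self):
              self.link_key = None          # established once, during code-linking

          def code_link(self, key: bytes):
              if self.link_key is not None:
                  raise RuntimeError("already code-linked; system is closed")
              self.link_key = key

          def execute(self, command: bytes, tag: bytes) -> bool:
              # Recompute the MAC and compare in constant time; reject anything else.
              expected = hmac.new(self.link_key, command, hashlib.sha256).digest()
              return hmac.compare_digest(expected, tag)

      # System set-up: the controller generates a key that no human ever sees.
      key = secrets.token_bytes(32)
      detonator = LinkedComponent()
      detonator.code_link(key)

      # Authorized command from the controller holding the same link key.
      cmd = b"ARM"
      tag = hmac.new(key, cmd, hashlib.sha256).digest()
      print(detonator.execute(cmd, tag))          # True

      # A substituted controller without the link key cannot authenticate.
      forged = hmac.new(secrets.token_bytes(32), cmd, hashlib.sha256).digest()
      print(detonator.execute(cmd, forged))       # False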

  6. Non-coding variants contribute to the clinical heterogeneity of TTR amyloidosis.

    PubMed

    Iorio, Andrea; De Lillo, Antonella; De Angelis, Flavio; Di Girolamo, Marco; Luigetti, Marco; Sabatelli, Mario; Pradotto, Luca; Mauro, Alessandro; Mazzeo, Anna; Stancanelli, Claudia; Perfetto, Federico; Frusconi, Sabrina; My, Filomena; Manfellotto, Dario; Fuciarelli, Maria; Polimanti, Renato

    2017-09-01

    Coding mutations in the TTR gene cause a rare hereditary form of systemic amyloidosis, which has a complex genotype-phenotype correlation. We investigated the role of non-coding variants in regulating TTR gene expression and, consequently, amyloidosis symptoms. We evaluated the genotype-phenotype correlation considering the clinical information of 129 Italian patients with TTR amyloidosis. Then, we conducted a re-sequencing of the TTR gene to investigate how non-coding variants affect TTR expression and, consequently, phenotypic presentation in carriers of amyloidogenic mutations. Polygenic scores for genetically determined TTR expression were constructed using data from our re-sequencing analysis and the GTEx (Genotype-Tissue Expression) project. We confirmed a strong phenotypic heterogeneity across coding mutations causing TTR amyloidosis. Considering the effects of non-coding variants on TTR expression, we identified three patient clusters with specific expression patterns associated with certain phenotypic presentations, including late onset, autonomic neurological involvement, and gastrointestinal symptoms. This study provides novel data regarding the role of non-coding variation and gene expression profiles in patients affected by TTR amyloidosis, and it also puts forth an approach that could be used to investigate the mechanisms underlying the genotype-phenotype correlation of the disease.

  7. New quantum codes derived from a family of antiprimitive BCH codes

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin

    The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q²-ary BCH codes with length n = q^(2m)+1 (also called antiprimitive BCH codes in the literature), where q≥4 is a power of 2 and m≥2. By a detailed analysis of some useful properties of q²-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q²-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
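
    To make the coset machinery concrete, the short Python sketch below (an illustration written for this summary, not code from the paper) builds the q²-ary cyclotomic cosets modulo n = q^(2m)+1 and applies the standard Hermitian dual-containing test, namely that the defining set T is disjoint from -qT (mod n); the example defining set is chosen arbitrarily.

      def cyclotomic_cosets(q2, n):
          """Partition {0, 1, ..., n-1} into q2-ary cyclotomic cosets modulo n."""
          seen, cosets = set(), []
          for s in range(n):
              if s in seen:
                  continue
              coset, x = [], s
              while x not in coset:
                  coset.append(x)
                  x = (x * q2) % n
              seen.update(coset)
              cosets.append(sorted(coset))
          return cosets

      def hermitian_dual_containing(defining_set, q, n):
          """True if T and -q*T (mod n) are disjoint, the usual dual-containing test."""
          minus_qT = {(-q * t) % n for t in defining_set}
          return not (set(defining_set) & minus_qT)

      q, m = 4, 2                      # q >= 4 a power of 2, m >= 2, as in the paper
      q2, n = q * q, q ** (2 * m) + 1  # antiprimitive length n = q^(2m) + 1
      cosets = cyclotomic_cosets(q2, n)

      # Example defining set: the cosets containing the "roots" 1..4
      # (purely illustrative; the paper's non-narrow-sense constructions differ).
      T = sorted({x for c in cosets for x in c if set(c) & {1, 2, 3, 4}})
      print("n =", n, "dual-containing:", hermitian_dual_containing(T, q, n))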

  8. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    PubMed

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16% of cases. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow for medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  9. Edge-diffraction effects in RCS predictions and their importance in systems analysis

    NASA Astrophysics Data System (ADS)

    Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker

    1996-06-01

    In developing RCS prediction codes, a variety of physical effects, such as edge diffraction, have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a systems viewpoint and to rank them according to their magnitude. This paper tries to evaluate the importance for systems analysis of RCS predictions containing an edge-diffracted field. A double dihedral with a strong depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data of the scattered field are generated by the RCS computer code SIGMA with and without including edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.

  10. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  11. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  12. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael; Jonlin, Duane; Nadel, Steven

    Today’s building energy codes focus on prescriptive requirements for features of buildings that are directly controlled by the design and construction teams and verifiable by municipal inspectors. Although these code requirements have had a significant impact, they fail to influence a large slice of the building energy use pie – including not only miscellaneous plug loads, cooking equipment and commercial/industrial processes, but the maintenance and optimization of the code-mandated systems as well. Currently, code compliance is verified only through the end of construction, and there are no limits or consequences for the actual energy use in an occupied building. In the future, our suite of energy regulations will likely expand to include building efficiency, energy use or carbon emission budgets over their full life cycle. Intelligent building systems, extensive renewable energy, and a transition from fossil fuel to electric heating systems will likely be required to meet ultra-low-energy targets. This paper lays out the authors’ perspectives on how buildings may evolve over the course of the 21st century and the roles that codes and regulations will play in shaping those buildings of the future.

  14. Dynamics on Networks of Manifolds

    NASA Astrophysics Data System (ADS)

    DeVille, Lee; Lerman, Eugene

    2015-03-01

    We propose a precise definition of a continuous time dynamical system made up of interacting open subsystems. The interconnections of subsystems are coded by directed graphs. We prove that the appropriate maps of graphs, called graph fibrations, give rise to maps of dynamical systems. Consequently, surjective graph fibrations give rise to invariant subsystems and injective graph fibrations give rise to projections of dynamical systems.
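
    The fibration condition can be checked mechanically: a map of directed graphs is a fibration when, for every vertex v, it restricts to a bijection between the edges coming into v and the edges coming into its image. The following self-contained Python sketch (an illustration for this summary, not code from the paper) performs that check on a two-cell network mapped onto a single self-looping cell, a prototypical surjective fibration underlying a synchrony subsystem.

      from collections import Counter

      def is_graph_fibration(edges_G, edges_B, vertex_map):
          """edges_* are lists of (source, target) pairs; vertex_map sends G-vertices to B-vertices.

          Checks that the induced map is a graph morphism and that, for every vertex v of G,
          the in-edges of v map bijectively onto the in-edges of vertex_map[v]."""
          # Morphism condition: every edge of G must map to an edge of B.
          mapped = [(vertex_map[s], vertex_map[t]) for (s, t) in edges_G]
          if any(e not in edges_B for e in mapped):
              return False
          # Local bijection on incoming edges (the lifting condition).
          in_B = {v: Counter() for v in set(vertex_map.values())}
          for (s, t) in edges_B:
              if t in in_B:
                  in_B[t][(s, t)] += 1
          for v in vertex_map:
              lifted = Counter((vertex_map[s], vertex_map[t])
                               for (s, t) in edges_G if t == v)
              if lifted != in_B[vertex_map[v]]:
                  return False
          return True

      # Two cells (a, b) driving each other, mapped onto a single self-looping cell:
      # a surjective fibration, giving rise to the invariant synchrony subsystem x_a = x_b.
      G = [("a", "b"), ("b", "a")]
      B = [("c", "c")]
      print(is_graph_fibration(G, B, {"a": "c", "b": "c"}))   # True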

  15. A low-complexity and high performance concatenated coding scheme for high-speed satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Rhee, Dojun; Rajpal, Sandeep

    1993-01-01

    This report presents a low-complexity and high performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2(exp 8)) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme, in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and d sub free = 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
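
    The two-level structure of such a concatenated scheme, an outer code over 8-bit symbols whose encoded symbols are each protected by a short inner binary code, can be sketched with deliberately trivial component codes. The Python fragment below uses a repetition outer code and a single-parity inner code purely as stand-ins for the NASA-standard RS and RM codes of the report; it is illustrative only.

      def outer_encode(symbols, copies=3):
          # Outer code over 8-bit symbols: a (copies, 1) repetition code per symbol.
          return [s for s in symbols for _ in range(copies)]

      def inner_encode(symbol):
          # Inner binary code: append an even-parity bit to the 8 data bits.
          bits = [(symbol >> i) & 1 for i in range(8)]
          return bits + [sum(bits) % 2]

      def inner_decode(bits):
          data, parity = bits[:8], bits[8]
          ok = (sum(data) % 2) == parity            # parity check flags single-bit errors
          return sum(b << i for i, b in enumerate(data)), ok

      def outer_decode(symbols_with_flags, copies=3):
          decoded = []
          for i in range(0, len(symbols_with_flags), copies):
              group = symbols_with_flags[i:i + copies]
              valid = [s for s, ok in group if ok] or [s for s, _ in group]
              decoded.append(max(set(valid), key=valid.count))   # majority vote
          return decoded

      message = [0x4E, 0x41, 0x53, 0x41]                          # "NASA"
      codeword = [inner_encode(s) for s in outer_encode(message)]
      codeword[0][3] ^= 1                                         # inject a single bit error
      decoded = outer_decode([inner_decode(b) for b in codeword])
      print(bytes(decoded))                                       # b'NASA'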

  16. Electromagnetic code for naval applications

    NASA Astrophysics Data System (ADS)

    Crescimbeni, F.; Bessi, F.; Chiti, S.

    1988-12-01

    The use of an increasing number of electronic apparatus has become vital to meet the high performance required for naval military applications. Thus the number of antennas to be mounted on shipboard has greatly increased. As a consequence of the high antenna density, the complexity of the shipboard environment, and the powers used for communication and radar systems, the EMC (Electro-Magnetic Compatibility) problem plays a leading role in the design of the topside of a ship. The Italian Navy has acquired a numerical code for antenna siting and design. This code, together with experimental data measured at the Italian Navy test range facility, allows for the evaluation of optimal sitings for antenna systems on shipboard and the prediction of their performance in the actual environment. The structure of this code, named Programma Elettromagnetico per Applicazioni Navali (Electromagnetic Code for Naval Applications), is discussed, together with its capabilities and applications. The results obtained in some examples are also presented and compared with the measurements.

  17. The feasibility of QR-code prescription in Taiwan.

    PubMed

    Lin, C-H; Tsai, F-Y; Tsai, W-L; Wen, H-W; Hu, M-L

    2012-12-01

    An ideal health care service is a service system that focuses on patients. Patients in Taiwan have the freedom to fill their prescriptions at any pharmacy contracted with National Health Insurance. Each of these pharmacies uses its own computer system; so far, there are at least ten different systems on the market in Taiwan. Transmitting prescription information from the hospital to the pharmacy accurately and efficiently therefore presents a great challenge. This study used a two-dimensional barcode (QR-code) to capture patient identification and prescription information from the hospitals, and a webcam to read the QR-code and transfer all data to the pharmacy computer system. Two hospitals and 85 community pharmacies participated in the study. During the trial, all participating pharmacies rated the accurate transmission of the prescription information highly. The contents of QR-code prescriptions issued in the Taipei area were picked up efficiently and accurately by pharmacies in the Taichung area (central Taiwan), without software-system or geographic limitations. The QR-code device received a patent (No. M376844, March 2010) from the Intellectual Property Office, Ministry of Economic Affairs, China. Our trial has proven that QR-code prescriptions can provide community pharmacists with an efficient, accurate, and inexpensive device to digitize prescription contents. Consequently, pharmacists can offer a better quality of pharmacy service to patients. © 2012 Blackwell Publishing Ltd.
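
    A minimal sketch of the hospital-side step follows, assuming the third-party Python package qrcode is installed (pip install qrcode[pil]); the field layout and identifiers below are invented for illustration and are not the format used in the Taiwanese trial.

      import json
      import qrcode

      prescription = {
          "patient_id": "A123456789",          # hypothetical identifiers
          "hospital": "Demo General Hospital",
          "drugs": [
              {"code": "AMOX500", "dose": "500 mg", "freq": "TID", "days": 7},
          ],
      }

      payload = json.dumps(prescription, ensure_ascii=False)
      img = qrcode.make(payload)               # build the two-dimensional barcode
      img.save("prescription_qr.png")          # printed on the paper prescription

      # Pharmacy side: a webcam frame would be decoded back to the same JSON payload
      # by a barcode decoder and written into the pharmacy computer system.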

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Athermalization of infrared dual field optical system based on wavefront coding

    NASA Astrophysics Data System (ADS)

    Jiang, Kai; Jiang, Bo; Liu, Kai; Yan, Peipei; Duan, Jing; Shan, Qiu-sha

    2017-02-01

    Wavefront coding is a technology that combines optical design and digital image processing. By inserting a phase mask close to the pupil plane of the optical system, the wavefront of the system is re-modulated and the depth of focus is consequently extended. In essence, the idea is the same as the athermalization theory of infrared optical systems. In this paper, an uncooled infrared dual-field optical system with effective focal lengths of 38mm/19mm, an F-number of 1.2 at both focal lengths, and an operating waveband from 8μm to 12μm was designed. A cubic phase mask was used at the pupil plane to re-modulate the wavefront. The performance of the infrared system was then simulated in CODE V as the environment temperature varied from -40° to 60°. MTF curves of the optical system with the phase mask are compared with those obtained before using the phase mask. The results show that wavefront coding technology can make the system insensitive to thermal defocus and thus realize the athermal design of the infrared optical system.
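
    The defocus insensitivity can be reproduced qualitatively with a few lines of numerics. The sketch below (my own illustration, not the authors' CODE V model) applies a cubic pupil phase exp[j·α(x³+y³)], adds a quadratic defocus term standing in for thermal defocus, and compares a mid-band MTF value with and without the mask.

      import numpy as np

      N = 256
      x = np.linspace(-1.0, 1.0, N)
      X, Y = np.meshgrid(x, x)
      pupil = (X**2 + Y**2 <= 1.0).astype(float)       # circular aperture

      def psf(alpha, defocus_waves):
          # alpha: cubic-mask strength (rad); defocus_waves: thermal defocus (waves).
          phase = alpha * (X**3 + Y**3) + 2 * np.pi * defocus_waves * (X**2 + Y**2)
          field = pupil * np.exp(1j * phase)
          return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

      def mtf_cut(p):
          # 1-D cut of the normalized MTF, used only to compare focus sensitivity.
          otf = np.abs(np.fft.fft2(p))
          return (otf / otf[0, 0])[0, :N // 2]

      for dz in (0.0, 2.0):                             # in focus vs. thermally defocused
          plain = mtf_cut(psf(0.0, dz))
          coded = mtf_cut(psf(60.0, dz))
          print(f"defocus {dz} waves: plain MTF@midband {plain[N//8]:.3f}, "
                f"coded {coded[N//8]:.3f}")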

  20. Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    First, M.W.

    1991-02-01

    Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)

  1. Development of high-fidelity multiphysics system for light water reactor analysis

    NASA Astrophysics Data System (ADS)

    Magedanz, Jeffrey W.

    There has been a tendency in recent years toward greater heterogeneity in reactor cores, due to the use of mixed-oxide (MOX) fuel, burnable absorbers, and longer cycles with consequently higher fuel burnup. The resulting asymmetry of the neutron flux and energy spectrum between regions with different compositions causes a need to account for the directional dependence of the neutron flux, instead of the traditional diffusion approximation. Furthermore, the presence of both MOX and high-burnup fuel in the core increases the complexity of the heat conduction. The heat transfer properties of the fuel pellet change with irradiation, and the thermal and mechanical expansion of the pellet and cladding strongly affect the size of the gap between them, and its consequent thermal resistance. These operational tendencies require higher fidelity multi-physics modeling capabilities, and this need is addressed by the developments performed within this PhD research. The dissertation describes the development of a High-Fidelity Multi-Physics System for Light Water Reactor Analysis. It consists of three coupled codes -- CTF for Thermal Hydraulics, TORT-TD for Neutron Kinetics, and FRAPTRAN for Fuel Performance. It is meant to address these modeling challenges in three ways: (1) by resolving the state of the system at the level of each fuel pin, rather than homogenizing entire fuel assemblies, (2) by using the multi-group Discrete Ordinates method to account for the directional dependence of the neutron flux, and (3) by using a fuel-performance code, rather than a Thermal Hydraulics code's simplified fuel model, to account for the material behavior of the fuel and its feedback to the hydraulic and neutronic behavior of the system. While the first two are improvements, the third, the use of a fuel-performance code for feedback, constitutes an innovation in this PhD project. Also important to this work is the manner in which such coupling is written. While coupling involves combining codes into a single executable, they are usually still developed and maintained separately. It should thus be a design objective to minimize the changes to those codes, and keep the changes to each code free of dependence on the details of the other codes. This will ease the incorporation of new versions of the code into the coupling, as well as re-use of parts of the coupling to couple with different codes. In order to fulfill this objective, an interface for each code was created in the form of an object-oriented abstract data type. Object-oriented programming is an effective method for enforcing a separation between different parts of a program, and clarifying the communication between them. The interfaces enable the main program to control the codes in terms of high-level functionality. This differs from the established practice of a master/slave relationship, in which the slave code is incorporated into the master code as a set of subroutines. While this PhD research continues previous work with a coupling between CTF and TORT-TD, it makes two major original contributions: (1) using a fuel-performance code, instead of a thermal-hydraulics code's simplified built-in models, to model the feedback from the fuel rods, and (2) the design of an object-oriented interface as an innovative method to interact with a coupled code in a high-level, easily-understandable manner. 
The resulting code system will serve as a tool to study the question of under what conditions, and to what extent, these higher-fidelity methods will provide benefits to reactor core analysis. (Abstract shortened by UMI.)
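
    The object-oriented interface idea can be sketched as an abstract data type through which the driver controls each coupled code with a handful of high-level calls. The Python fragment below is only a schematic with hypothetical method names and placeholder values; the actual CTF/TORT-TD/FRAPTRAN interfaces are more detailed.

      from abc import ABC, abstractmethod

      class PhysicsCode(ABC):
          """Abstract data type through which the driver controls one coupled code."""

          @abstractmethod
          def initialize(self, case: str) -> None: ...

          @abstractmethod
          def advance(self, dt: float) -> None:
              """Advance this code's solution by one coupling time step."""

          @abstractmethod
          def exchange(self, fields: dict) -> dict:
              """Take feedback fields from the other codes, return this code's fields."""

      class FuelPerformance(PhysicsCode):
          def initialize(self, case): self.gap_conductance = 5000.0   # W/m^2-K, placeholder
          def advance(self, dt): pass                                  # would call the fuel code
          def exchange(self, fields):
              # Fuel temperatures and gap conductance fed back to thermal hydraulics.
              return {"gap_conductance": self.gap_conductance}

      def run_coupled(codes, dt, steps):
          fields = {}
          for _ in range(steps):
              for code in codes:                 # the driver only sees high-level calls
                  fields.update(code.exchange(fields))
                  code.advance(dt)

      fp = FuelPerformance(); fp.initialize("demo")
      run_coupled([fp], dt=0.01, steps=3)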

  2. Bilingual processing of ASL-English code-blends: The consequences of accessing two lexical representations simultaneously

    PubMed Central

    Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886

  3. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  4. The perceived organizational impact of the gender gap across a Canadian department of medicine and proposed strategies to combat it: a qualitative study.

    PubMed

    Pattani, Reena; Marquez, Christine; Dinyarian, Camellia; Sharma, Malika; Bain, Julie; Moore, Julia E; Straus, Sharon E

    2018-04-10

    Despite the gender parity existing in medical schools for over three decades, women remain underrepresented in academic medical centers, particularly in senior ranks and in leadership roles. This has consequences for patient care, education, research, and workplace culture within healthcare organizations. This study was undertaken to explore the perspectives of faculty members at a single department of medicine on the impact of the existing gender gap on organizational effectiveness and workplace culture, and to identify systems-based strategies to mitigate the gap. The study took place at a large university department of medicine in Toronto, Canada, with six affiliated hospitals. In this qualitative study, semi-structured individual interviews were conducted between May and September 2016 with full-time faculty members who held clinical and university-based appointments. Transcripts of the interviews were analyzed using thematic analysis. Three authors independently reviewed the transcripts to determine a preliminary list of codes and establish a coding framework. A modified audit consensus coding approach was applied; a single analyst reviewed all the transcripts and a second analyst audited 20% of the transcripts in each round of coding. Following each round, inter-rater reliability was determined, discrepancies were resolved through discussion, and modifications were made as needed to the coding framework. The analysis revealed faculty members' perceptions of the gender gap, potential contributing factors, organizational impacts, and possible solutions to bridge the gap. Of the 43 full-time faculty members who participated in the survey (29 of whom self-identified as female), most participants were aware of the existing gender gap within academic medicine. Participants described social exclusion, reinforced stereotypes, and unprofessional behaviors as consequences of the gap on organizational effectiveness and culture. They suggested improvements in (1) the processes for recruitment, hiring, and promotion; (2) inclusiveness of the work environment; (3) structures for mentorship; and (4) ongoing monitoring of the gap. The existing gender gap in academic medicine may have negative consequences for organizational effectiveness and workplace culture but many systems-based strategies to mitigate the gap exist. Although these solutions warrant rigorous evaluation, they are feasible to institute within most healthcare organizations immediately.
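
    For the audited 20% of transcripts, inter-rater agreement of this kind is often summarized with a chance-corrected statistic such as Cohen's kappa. The short Python sketch below (an illustration with invented code labels; the study does not state which statistic it used) computes kappa for two coders.

      from collections import Counter

      def cohens_kappa(rater1, rater2):
          n = len(rater1)
          observed = sum(a == b for a, b in zip(rater1, rater2)) / n
          p1, p2 = Counter(rater1), Counter(rater2)
          expected = sum(p1[c] * p2[c] for c in set(rater1) | set(rater2)) / (n * n)
          return (observed - expected) / (1 - expected)

      # Hypothetical codes assigned to the same five transcript excerpts by two coders.
      analyst = ["exclusion", "stereotype", "mentorship", "exclusion", "promotion"]
      auditor = ["exclusion", "stereotype", "mentorship", "stereotype", "promotion"]
      print(round(cohens_kappa(analyst, auditor), 2))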

  5. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
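
    The propagation step can be illustrated with a drastically simplified stand-in for the consequence codes: sample dispersion parameters from assumed "elicited" lognormal distributions and push them through a ground-level Gaussian plume formula to obtain an output concentration distribution. The sketch below is not MACCS or COSYMA, and all parameter values are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      Q, u, H, x = 1.0, 5.0, 50.0, 1000.0        # source (kg/s), wind (m/s), stack (m), range (m)

      # Hypothetical uncertainty on the power-law dispersion coefficients sigma = a * x**b.
      a_y = rng.lognormal(mean=np.log(0.08), sigma=0.3, size=n)
      a_z = rng.lognormal(mean=np.log(0.06), sigma=0.4, size=n)
      sigma_y, sigma_z = a_y * x**0.9, a_z * x**0.85

      # Ground-level centerline Gaussian plume concentration for each sampled parameter set.
      conc = (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

      print("centerline concentration at 1 km [kg/m^3]:",
            f"median {np.median(conc):.2e},",
            f"5th-95th pct {np.percentile(conc, 5):.2e} - {np.percentile(conc, 95):.2e}")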

  6. New French Regulation for NPPs and Code Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, Claude

    2006-07-01

    In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation first needs to agree with the non-nuclear PE regulation and adds to it some specific requirements, in particular radiation protection requirements. There are different advantages in this proposal: it is more qualitatively risk-oriented, and it establishes an important link with non-nuclear industry. Only a few components are nuclear specific. However, the general philosophy of the existing codes (RCC-M [15], KTA [16] or ASME [17]) has to be improved. For foreign codes, it is planned to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. The consequence is the need to cross all these specifications to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)

  7. LDPC product coding scheme with extrinsic information for bit patterned media recording

    NASA Astrophysics Data System (ADS)

    Jeong, Seongkwon; Lee, Jaejin

    2017-05-01

    Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next generation storage system to achieve an areal density beyond 1 Tb/in2. Each recording bit is stored in a fabricated magnetic island and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in2, the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur. ITI and ISI degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows an improved bit error rate performance compared to that in which one LDPC code is used.

  8. Emergence of Coding and its Specificity as a Physico-Informatic Problem

    NASA Astrophysics Data System (ADS)

    Wills, Peter R.; Nieselt, Kay; McCaskill, John S.

    2015-06-01

    We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.

  9. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  10. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  11. Essays on Information Assurance: Examination of Detrimental Consequences of Information Security, Privacy, and Extreme Event Concerns on Individual and Organizational Use of Systems

    ERIC Educational Resources Information Center

    Park, Insu

    2010-01-01

    The purpose of this study is to explore systems users' behavior on IS under the various circumstances (e.g., email usage and malware threats, online communication at the individual level, and IS usage in organizations). Specifically, the first essay develops a method for analyzing and predicting the impact category of malicious code, particularly…

  12. Interface Control Document for the EMPACT Module that Estimates Electric Power Transmission System Response to EMP-Caused Damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werley, Kenneth Alan; Mccown, Andrew William

    The EPREP code is designed to evaluate the effects of an Electro-Magnetic Pulse (EMP) on the electric power transmission system. The EPREP code embodies an umbrella framework that allows a user to set up analysis conditions and to examine analysis results. The code links to three major physics/engineering modules. The first module describes the EM wave in space and time. The second module evaluates the damage caused by the wave on specific electric power (EP) transmission system components. The third module evaluates the consequence of the damaged network on its (reduced) ability to provide electric power to meet demand. This third module is the focus of the present paper. The EMPACT code serves as the third module. The EMPACT name denotes EMP effects on Alternating Current Transmission systems. The EMPACT algorithms compute electric power transmission network flow solutions under severely damaged network conditions. Initial solutions are often characterized by unacceptable network conditions, including line overloads and bad voltages. The EMPACT code contains algorithms to adjust network parameters optimally to eliminate network problems while minimizing outages. System adjustments include automatically adjusting control equipment (generator V control, variable transformers, and variable shunts), as well as non-automatic control of generator power settings and minimal load shedding. The goal is to evaluate the minimal loss of customer load under equilibrium (steady-state) conditions during peak demand.
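
    The kind of screening the EMPACT algorithms perform can be illustrated with a toy DC power-flow calculation: solve the susceptance system B'θ = P for bus angles, compute line flows, and flag lines loaded beyond their ratings after a damage-induced outage. The Python sketch below is my own illustration with invented network data, not the EMPACT code.

      import numpy as np

      buses = ["gen", "mid", "load"]
      lines = [("gen", "mid", 10.0, 1.5),       # (from, to, susceptance [pu], rating [pu])
               ("mid", "load", 10.0, 1.5),
               ("gen", "load", 10.0, 1.5)]
      injections = {"gen": 2.0, "mid": 0.0, "load": -2.0}   # pu; bus "gen" is the slack

      def dc_flows(active_lines):
          idx = {b: i for i, b in enumerate(buses)}
          B = np.zeros((len(buses), len(buses)))
          for f, t, b, _ in active_lines:
              i, j = idx[f], idx[t]
              B[i, i] += b; B[j, j] += b; B[i, j] -= b; B[j, i] -= b
          P = np.array([injections[b] for b in buses])
          theta = np.zeros(len(buses))
          # Bus 0 is the slack/reference: solve the reduced system for the other angles.
          theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])
          return {(f, t): b * (theta[idx[f]] - theta[idx[t]]) for f, t, b, _ in active_lines}

      damaged = dc_flows(lines[:2])              # EMP damage removes the gen-load line
      for (f, t, _, rating) in lines[:2]:
          flow = damaged[(f, t)]
          status = "OVERLOAD" if abs(flow) > rating else "ok"
          print(f"{f}-{t}: {flow:+.2f} pu (rating {rating}) {status}")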

  13. Sources of financial pressure and up coding behavior in French public hospitals.

    PubMed

    Georgescu, Irène; Hartmann, Frank G H

    2013-05-01

    Drawing upon role theory and the literature concerning unintended consequences of financial pressure, this study investigates the effects of health care decision pressure from the hospital's administration and from the professional peer group on physicians' inclination to engage in up coding. We explore two kinds of up coding, information-related and action-related, and develop hypotheses that connect these kinds of data manipulation to the sources of pressure via the intermediate effect of role conflict. Qualitative data from initial interviews with physicians and subsequent questionnaire evidence from 578 physicians in 14 French hospitals suggest that the source of pressure is a relevant predictor of physicians' inclination to engage in data manipulation. We further find that this effect is partly explained by the extent to which these pressures create role conflict. Given the concern about up coding in treatment-based reimbursement systems worldwide, our analysis adds to the understanding of how the design of the hospital's management control system may encourage this undesired type of behavior. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. The Development of Bimodal Bilingualism: Implications for Linguistic Theory.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2016-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.

  15. The Development of Bimodal Bilingualism: Implications for Linguistic Theory

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen

    2017-01-01

    A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and ‘transfer’ as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair. PMID:28603576

  16. A time series analysis of presentations to Queensland health facilities for alcohol-related conditions, following the increase in 'alcopops' tax.

    PubMed

    Kisely, Steve; Crowe, Elizabeth; Lawrence, David; White, Angela; Connor, Jason

    2013-08-01

    In response to concerns about the health consequences of high-risk drinking by young people, the Australian Government increased the tax on pre-mixed alcoholic beverages ('alcopops') favoured by this demographic. We measured changes in admissions for alcohol-related harm to health throughout Queensland, before and after the tax increase in April 2008. We used data from the Queensland Trauma Register, Hospitals Admitted Patients Data Collection, and the Emergency Department Information System to calculate alcohol-related admission rates per 100,000 people, for 15 - 29 year-olds. We analysed data over 3 years (April 2006 - April 2009), using interrupted time-series analyses. This covered 2 years before, and 1 year after, the tax increase. We investigated both mental and behavioural consequences (via F10 codes), and intentional/unintentional injuries (S and T codes). We fitted an auto-regressive integrated moving average (ARIMA) model, to test for any changes following the increased tax. There was no decrease in alcohol-related admissions in 15 - 29 year-olds. We found similar results for males and females, as well as definitions of alcohol-related harms that were narrow (F10 codes only) and broad (F10, S and T codes). The increased tax on 'alcopops' was not associated with any reduction in hospital admissions for alcohol-related harms in Queensland 15 - 29 year-olds.
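
    The interrupted time-series design can be sketched with synthetic data: an ARIMA model fitted with an exogenous step dummy that switches on at the intervention date, whose coefficient estimates the post-tax level change. The fragment below uses statsmodels and invented monthly data (built with no true step, echoing the null finding); it is not the Queensland analysis.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(1)
      months = pd.date_range("2006-04-01", periods=36, freq="MS")     # Apr 2006 - Mar 2009
      step = (months >= "2008-04-01").astype(float)                   # 1 after the tax change
      rate = 120 + rng.normal(0, 5, 36)          # admissions per 100,000; no true step built in

      model = ARIMA(rate, exog=step, order=(1, 0, 1))
      result = model.fit()
      print(result.params)          # the exogenous coefficient is the estimated step change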

  17. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  18. Thermodynamic consequences of hydrogen combustion within a containment of pressurized water reactor

    NASA Astrophysics Data System (ADS)

    Bury, Tomasz

    2011-12-01

    Gaseous hydrogen may be generated in a nuclear reactor system as an effect of core overheating. This creates a risk of its uncontrolled combustion, which may have destructive consequences, as was observed during the Fukushima nuclear power plant accident. Favorable conditions for hydrogen production occur during severe loss-of-coolant accidents. The author used his own lumped-parameter computer code, called HEPCAL, to perform a set of simulations of large-scale loss-of-coolant accident scenarios within the containment of a second-generation pressurized water reactor. Some simulations resulted in high pressure peaks that seemed irrational. A more detailed analysis, and a comparison with the consequences of the Three Mile Island and Fukushima accidents, allowed interesting conclusions to be drawn.
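
    For orientation, the pressure rise from a hydrogen burn in a closed containment can be bounded with an adiabatic isochoric complete combustion (AICC) estimate. The sketch below assumes ideal-gas behaviour, constant specific heats, and invented containment parameters; it is an order-of-magnitude check, not a substitute for a lumped-parameter code such as HEPCAL.

      R = 8.314                      # J/(mol K)
      V = 50_000.0                   # containment free volume, m^3 (illustrative)
      T1, P1 = 330.0, 1.5e5          # initial temperature (K) and pressure (Pa)

      n_total = P1 * V / (R * T1)
      n_H2 = 0.05 * n_total                      # 5 vol% hydrogen (near the flammability limit)
      n_air = 0.95 * n_total
      n_O2 = 0.21 * n_air

      burned = min(n_H2, 2 * n_O2)               # H2 + 0.5 O2 -> H2O, oxygen-rich here
      dU = 242e3                                 # J per mol H2, approx. heat of combustion
      cv_mix = 25.0                              # J/(mol K), rough mean for the hot mixture

      # Mole balance: O2 consumed, unburned H2 retained, steam produced.
      n2 = n_air - 0.5 * burned + (n_H2 - burned) + burned
      T2 = T1 + burned * dU / (n2 * cv_mix)
      P2 = n2 * R * T2 / V
      print(f"AICC estimate: T2 = {T2:.0f} K, P2 = {P2 / 1e5:.2f} bar")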

  19. A rocket-borne pulse-height analyzer for energetic particle measurements

    NASA Technical Reports Server (NTRS)

    Leung, W.; Smith, L. G.; Voss, H. D.

    1979-01-01

    The pulse-height analyzer basically resembles a time-sharing multiplexing data-acquisition system which acquires analog data (from energetic particle spectrometers) and converts them into digital code. The PHA simultaneously acquires pulse-height information from the analog signals of the four input channels and sequentially multiplexes the digitized data to a microprocessor. The PHA together with the microprocessor form an on-board real-time data-manipulation system. The system processes data obtained during the rocket flight and reduces the amount of data to be sent back to the ground station. Consequently the data-reduction process for the rocket experiments is speeded up. By using a time-sharing technique, the throughput rate of the microprocessor is increased. Moreover, data from several particle spectrometers are manipulated to share one information channel; consequently, the TM capacity is increased.
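
    The multiplexed accumulation of pulse heights into per-channel spectra can be mimicked in a few lines. The Python sketch below (an illustration written for this summary, with invented bin counts and pulse statistics) digitizes simulated pulses from four detectors into 6-bit histograms, the compact form in which data would be telemetered.

      import random

      N_CHANNELS, N_BINS, FULL_SCALE = 4, 64, 5.0           # 4 detectors, 6-bit PHA, 5 V
      spectra = [[0] * N_BINS for _ in range(N_CHANNELS)]

      def digitize(volts):
          # Analog-to-digital conversion of a pulse height into a 6-bit bin number.
          return min(int(volts / FULL_SCALE * N_BINS), N_BINS - 1)

      random.seed(0)
      for _ in range(10_000):                                # time-shared sampling loop
          channel = random.randrange(N_CHANNELS)             # multiplexer selects a detector
          pulse = random.expovariate(1.0) % FULL_SCALE       # fake energetic-particle pulse
          spectra[channel][digitize(pulse)] += 1

      for ch, hist in enumerate(spectra):
          print(f"channel {ch}: {sum(hist)} counts, peak bin {hist.index(max(hist))}")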

  20. Nonlinear ship waves and computational fluid dynamics

    PubMed Central

    MIYATA, Hideaki; ORIHARA, Hideo; SATO, Yohei

    2014-01-01

    Research works undertaken in the first author’s laboratory at the University of Tokyo over the past 30 years are highlighted. Finding of the occurrence of nonlinear waves (named Free-Surface Shock Waves) in the vicinity of a ship advancing at constant speed provided the start-line for the progress of innovative technologies in the ship hull-form design. Based on these findings, a multitude of the Computational Fluid Dynamic (CFD) techniques have been developed over this period, and are highlighted in this paper. The TUMMAC code has been developed for wave problems, based on a rectangular grid system, while the WISDAM code treats both wave and viscous flow problems in the framework of a boundary-fitted grid system. These two techniques are able to cope with almost all fluid dynamical problems relating to ships, including the resistance, ship’s motion and ride-comfort issues. Consequently, the two codes have contributed significantly to the progress in the technology of ship design, and now form an integral part of the ship-designing process. PMID:25311139

  1. Conceptual-driven classification for coding advise in health insurance reimbursement.

    PubMed

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as a reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits for patients, hospitals, and the healthcare system. Copyright © 2010 Elsevier B.V. All rights reserved.
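
    A highly simplified sketch of the advisory idea follows: mid-frequency terms are retained from the summary in the spirit of Zipf's Law, and each candidate ICD code receives a certainty factor from the fraction of its associated terms found in the text. The terms, codes, and scoring rule below are invented for illustration and are not the ICD-AS knowledge base.

      import re
      from collections import Counter

      knowledge_base = {                       # hypothetical term -> code associations
          "I63.9": {"cerebral", "infarction", "hemiparesis"},
          "I61.9": {"intracerebral", "hemorrhage", "hypertension"},
      }

      def keywords(text, low=1, high=0.5):
          tokens = re.findall(r"[a-z]+", text.lower())
          counts = Counter(tokens)
          # Keep mid-frequency terms: drop single occurrences and near-ubiquitous tokens.
          return {t for t, c in counts.items() if c > low and c / len(tokens) <= high}

      def suggest(text):
          terms = keywords(text)
          scored = {code: len(trigger & terms) / len(trigger)
                    for code, trigger in knowledge_base.items()}
          return sorted(scored.items(), key=lambda kv: -kv[1])

      summary = ("Patient admitted with acute cerebral infarction. Cerebral CT confirmed "
                 "infarction of the left MCA territory; right hemiparesis noted. "
                 "Hemiparesis improved. Infarction managed with antiplatelet therapy.")
      print(suggest(summary))   # certainty factors for each candidate ICD code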

  2. Chimeric NP Non Coding Regions between Type A and C Influenza Viruses Reveal Their Role in Translation Regulation

    PubMed Central

    Crescenzo-Chaigne, Bernadette; Barbezange, Cyril; Frigard, Vianney; Poulain, Damien; van der Werf, Sylvie

    2014-01-01

    Exchange of the non coding regions of the NP segment between type A and C influenza viruses was used to demonstrate the importance not only of the proximal panhandle, but also of the initial distal panhandle strength in type specificity. Both elements were found to be compulsory to rescue infectious virus by reverse genetics systems. Interestingly, in type A influenza virus infectious context, the length of the NP segment 5′ NC region once transcribed into mRNA was found to impact its translation, and the level of produced NP protein consequently affected the level of viral genome replication. PMID:25268971

  3. The kinetics of aerosol particle formation and removal in NPP severe accidents

    NASA Astrophysics Data System (ADS)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.

    2016-06-01

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  4. The kinetics of aerosol particle formation and removal in NPP severe accidents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.

    2016-06-08

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  5. Unfiltered Talk--A Challenge to Categories.

    ERIC Educational Resources Information Center

    McCormick, Kay

    A study investigated how and why code switching and mixing occurs between English and Afrikaans in a region of South Africa. In District Six, non-standard Afrikaans seems to be a mixed code, and it is unclear whether non-standard English is a mixed code. Consequently, it is unclear when codes are being switched or mixed. The analysis looks at…

  6. CIRMIS Data system. Volume 2. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.

  7. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 4. Driller's logs, stratigraphic cross section and utility routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.

  8. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based ethics, seems to enjoy good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.

  9. A modified carrier-to-code leveling method for retrieving ionospheric observables and detecting short-term temporal variability of receiver differential code biases

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min

    2018-03-01

    Sensing the ionosphere with the global positioning system involves two sequential tasks, namely ionospheric observable retrieval and ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in the receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL) method, commonly used to accomplish the first task, by assuming the rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among others, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of the ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by the rDCB of a single receiver.
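    For orientation, the sketch below shows the conventional CCL step that MCCL modifies: the precise but ambiguous geometry-free carrier phase is leveled to the noisy geometry-free code by an arc-averaged offset, so any receiver DCB that is stable over the arc stays hidden inside the leveled observable. Sign conventions and the synthetic data are illustrative assumptions (Python with NumPy).

        import numpy as np

        def ccl_level(phase_gf, code_gf):
            """Level the geometry-free phase to the code over one continuous arc."""
            offset = np.mean(code_gf - phase_gf)   # arc-averaged leveling offset
            return phase_gf + offset               # leveled ionospheric observable

        epochs = np.arange(120)
        iono = 2.0 + 0.01 * epochs                           # smooth ionospheric signal (arbitrary units)
        phase_gf = iono + 5.7                                # biased by the unknown ambiguity term
        code_gf = iono + 0.3 * np.random.randn(epochs.size)  # unbiased but noisy
        leveled = ccl_level(phase_gf, code_gf)
        print(float(np.std(leveled - iono)))   # small residual only if the bias really is constant

    MCCL, by contrast, treats the rDCB as unlinked from epoch to epoch, so a time-varying bias is estimated as per-epoch offsets instead of being absorbed into the leveled observable.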

  10. Studies of Planet Formation Using a Hybrid N-Body + Planetesimal Code

    NASA Technical Reports Server (NTRS)

    Kenyon, Scott J.

    2004-01-01

    The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: (1) icy planets: models for icy planet formation will demonstrate how the physical properties of debris disks - including the Kuiper Belt in our solar system - depend on initial conditions and input physics; and (2) terrestrial planets: calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment.

  11. [Seasonal distribution of clinical case codes (DOC study)].

    PubMed

    von Dercks, N; Melz, R; Hepp, P; Theopold, J; Marquass, B; Josten, C

    2017-02-01

    The German diagnosis-related groups remuneration system (G-DRG) was implemented in 2004; patient-related diagnoses and procedures lead to allocation to specific DRGs. This system includes several indices, such as the case mix (CM), the case mix index (CMI) and the number of cases. Seasonal distribution of these indices, as well as the distribution of diagnoses and DRGs, may have logistical consequences for clinical management. From 2004 to 2013 all main diagnoses and DRGs for inpatients were recorded. Monthly and seasonal distributions were analyzed using ANOVA. The average monthly number of cases was 265 ± 25, the average CM was 388.50 ± 51.75 and the average CMI was 1.46 ± 0.15, with no significant seasonal differences (p > 0.1). Concussion was the most frequent main diagnosis (3739 cases), followed by fractures of the humeral head (699). Significant distribution differences could be shown for humeral head fractures in monthly (p = 0.018) and seasonal comparisons (p = 0.006), with a maximum in winter. Radius (p = 0.01) and ankle fractures (p ≤ 0.001) also occurred most frequently in winter. Non-bony lesions of the shoulder were significantly less frequent in spring (p = 0.04). The DRGs showed no evidence of monthly or seasonal clustering (p > 0.1). The significant clustering of injuries in specific months and seasons should have logistical consequences (e.g. operating room slots, availability of nursing and anesthesia staff). For a needs assessment, the analysis of main diagnoses is more appropriate than that of DRGs.
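    Assuming the usual G-DRG definitions, where the case mix (CM) is the sum of DRG cost weights over all inpatient cases and the case mix index (CMI) is CM divided by the number of cases, monthly figures of the kind reported above can be computed from per-case cost weights as in the following sketch; the weights shown are invented.

        def case_mix(cost_weights):
            """CM: sum of DRG cost weights over all cases in the period."""
            return sum(cost_weights)

        def case_mix_index(cost_weights):
            """CMI: case mix divided by the number of cases."""
            return case_mix(cost_weights) / len(cost_weights)

        january_weights = [0.45, 1.20, 2.85, 0.96, 1.10]   # illustrative DRG cost weights
        print(case_mix(january_weights))        # CM for the month
        print(case_mix_index(january_weights))  # CMI for the month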

  12. Momentary Patterns of Covariation between Specific Affects and Interpersonal Behavior: Linking Relationship Science and Personality Assessment

    PubMed Central

    Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.

    2016-01-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786

  13. Performance Analysis of a New Coded TH-CDMA Scheme in Dispersive Infrared Channel with Additive Gaussian Noise

    NASA Astrophysics Data System (ADS)

    Hamdi, Mazda; Kenari, Masoumeh Nasiri

    2013-06-01

    We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links, and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter employs a low-rate convolutional encoder. In this method, the bit interval is divided into Nc chips, and the output of the encoder, along with a PN sequence assigned to the user, determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where the multiple access interference has the dominant effect, the performance improves with the coding gain. But at low transmit power, where an increase of the coding gain leads to a decrease of the chip time, and consequently to more corruption due to the channel dispersion, there exists an optimum value of the coding gain. However, for the matched filter, the performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. Our results show that, for the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, like conventional CDMA and time hopping schemes.
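    The time-hopping rule described above can be sketched as follows: in each bit interval the chip slot carrying the optical pulse is derived from the convolutional-encoder output and the user's PN sequence. The modulo-Nc mapping and all numeric values below are assumptions made for illustration, not necessarily the exact combination used in the paper.

        NC = 16   # assumed number of chips per bit interval

        def chip_position(encoder_symbol, pn_value, nc=NC):
            """Map one encoded symbol and one PN value to a pulse position in [0, nc)."""
            return (encoder_symbol + pn_value) % nc

        encoded_symbols = [3, 7, 1, 12]   # illustrative low-rate encoder output
        pn_sequence = [5, 9, 14, 2]       # illustrative per-user PN values
        positions = [chip_position(s, p) for s, p in zip(encoded_symbols, pn_sequence)]
        print(positions)   # chip indices in which the optical pulses are transmitted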

  14. Momentary patterns of covariation between specific affects and interpersonal behavior: Linking relationship science and personality assessment.

    PubMed

    Ross, Jaclyn M; Girard, Jeffrey M; Wright, Aidan G C; Beeney, Joseph E; Scott, Lori N; Hallquist, Michael N; Lazarus, Sophie A; Stepp, Stephanie D; Pilkonis, Paul A

    2017-02-01

    Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the 2 systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to ensure that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
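    The core of sequential verification, comparing calculations from consecutive code versions (or from a restarted run against a continuous run) and flagging any drift beyond a tight tolerance, can be sketched as below. The data layout and tolerance are assumptions; RELAP5-3D's actual comparison machinery is more elaborate.

        def compare_runs(reference, candidate, rel_tol=1e-12):
            """Return the variables whose values differ beyond the relative tolerance."""
            diffs = []
            for name, ref_val in reference.items():
                cand_val = candidate.get(name)
                if cand_val is None or abs(cand_val - ref_val) > rel_tol * max(abs(ref_val), 1.0):
                    diffs.append(name)
            return diffs

        previous_version = {"pressure": 15.501234567890123, "void_fraction": 0.123456789012345}
        new_version      = {"pressure": 15.501234567890123, "void_fraction": 0.123456789212345}
        print(compare_runs(previous_version, new_version))   # ['void_fraction'] -> unintended change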

  16. Radioactive waste isolation in salt: special advisory report on the status of the Office of Nuclear Waste Isolation's plans for repository performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.

    1983-10-01

    Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.

  17. Molecular Regulatory Pathways Link Sepsis With Metabolic Syndrome: Non-coding RNA Elements Underlying the Sepsis/Metabolic Cross-Talk.

    PubMed

    Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona

    2018-01-01

    Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact for human health and the consequences of concussions. Both represent imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation that indicates shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. Those include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.

  18. Computing element evolution towards Exascale and its impact on legacy simulation codes

    NASA Astrophysics Data System (ADS)

    Colin de Verdière, Guillaume J. L.

    2015-12-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of the next generations of supercomputers. The market analysis underlying this work shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes and programming methods. The problems of dissipated power and memory access are discussed and lead to a vision of what an exascale system should be. To survive, programming languages have had to respond to these hardware evolutions, either by evolving themselves or by giving way to new languages. From the previous elements, we elaborate on why vectorization, multithreading, data locality awareness and hybrid programming will be the keys to reaching the exascale, implying that it is time to start rewriting codes.

  19. Structural Code Considerations for Solar Rooftop Installations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwyer, Stephen F.; Dwyer, Brian P.; Sanchez, Alfred

    2014-12-01

    Residential rooftop solar panel installations are limited in part by the high cost of structural related code requirements for field installation. Permitting solar installations is difficult because there is a belief among residential permitting authorities that typical residential rooftops may be structurally inadequate to support the additional load associated with a photovoltaic (PV) solar installation. Typical engineering methods used to calculate stresses on a roof structure involve simplifying assumptions that reduce a complex non-linear structure to a basic determinate beam. This method of analysis neglects the composite action of the entire roof structure, yielding a conservative analysis based on a single rafter or the top chord of a truss. Consequently, the analysis can result in an overly conservative structural assessment. A literature review was conducted to gain a better understanding of the conservative nature of the regulations and codes governing residential construction and the associated structural system calculations.

  20. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  1. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  2. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  3. 34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...

  4. Bilingual Processing of ASL-English Code-Blends: The Consequences of Accessing Two Lexical Representations Simultaneously

    ERIC Educational Resources Information Center

    Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H.

    2012-01-01

    Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…

  5. A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis

    DTIC Science & Technology

    2014-04-01

    [Fragments recovered from the report's analysis questions and tool-survey table] ...objective? What vulnerabilities exist in the target system? What damage or other consequences are likely? What exploit scripts or other attack... Tool notes: supports the languages C, R, and Python; no response capabilities. JUNG (https://blogs.reucon.com/asterisk-java/tag/visualization/): can create custom layouts and annotate graphs, links, and nodes with any Java data type; requires familiarity with coding in Java to call the routines; no monitoring or response capabilities.

  6. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions. (auth)

  7. Thermal-hydraulic modeling needs for passive reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, J.M.

    1997-07-01

    The U.S. Nuclear Regulatory Commission has received an application for design certification from the Westinghouse Electric Corporation for an Advanced Light Water Reactor design known as the AP600. As part of the design certification process, the USNRC uses its thermal-hydraulic system analysis codes to independently audit the vendor calculations. The focus of this effort has been the small break LOCA transients that rely upon the passive safety features of the design to depressurize the primary system sufficiently so that gravity-driven injection can provide a stable source for long term cooling. Of course, large break LOCAs have also been considered, but as the involved phenomena do not appear to be appreciably different from those of current plants, they were not discussed in this paper. Although the SBLOCA scenario does not appear to threaten core coolability - indeed, heatup is not even expected to occur - there have been concerns as to the performance of the passive safety systems. For example, the passive systems drive flows with small heads, consequently requiring more precision in the analysis methods for passive plants as compared to current plants with active systems. For the analysis of SBLOCAs and operating transients, the USNRC uses the RELAP5 thermal-hydraulic system analysis code. To assure the applicability of RELAP5 to the analysis of these transients for the AP600 design, a four year long program of code development and assessment has been undertaken.

  8. Coding Manual for Continuous Observation of Interactions by Single Subjects in an Academic Setting.

    ERIC Educational Resources Information Center

    Cobb, Joseph A.; Hops, Hyman

    The manual, designed particularly for work with acting-out or behavior problem students, describes coding procedures used in the observation of continuous classroom interactions between the student and his peers and teacher. Peer and/or teacher behaviors antecedent and consequent to the subject's behavior are identified in the coding process,…

  9. Finite element analysis of two disk rotor system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixit, Harsh Kumar

    A finite element model of a simple horizontal rotor system is developed for evaluating its dynamic behaviour. The model is based on the Timoshenko beam element and accounts for the effect of gyroscopic couple and other rotational forces. The present rotor system consists of a single shaft which is supported by bearings at both ends and two disks mounted at different locations. The natural frequencies, mode shapes and orbits of the rotating system for a specific range of rotation speeds are obtained by developing a MATLAB code for solving the finite element equations of the rotating system. Consequently, a Campbell diagram is plotted to find the relationship between the natural whirl frequencies and the rotation speed of the rotor.
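    One way such a Campbell diagram is typically produced, and a plausible reading of the procedure described above, is to assemble mass (M), gyroscopic (G) and stiffness (K) matrices and extract the whirl frequencies from the eigenvalues of the first-order form of M q'' + Omega G q' + K q = 0 at each spin speed. The 2x2 matrices below are an invented toy model, not the two-disk Timoshenko-beam system of the paper (Python with NumPy).

        import numpy as np

        def whirl_frequencies(M, G, K, omega):
            """Whirl frequencies (rad/s) at spin speed omega from the state-space eigenvalues."""
            n = M.shape[0]
            A = np.block([[np.zeros((n, n)), np.eye(n)],
                          [-np.linalg.solve(M, K), -np.linalg.solve(M, omega * G)]])
            freqs = np.abs(np.linalg.eigvals(A).imag)
            return np.sort(freqs[freqs > 1e-9])

        M = np.diag([1.0, 1.0])
        K = np.diag([4.0e4, 4.0e4])
        G = np.array([[0.0, 0.5], [-0.5, 0.0]])   # skew-symmetric gyroscopic coupling

        for spin in (0.0, 100.0, 300.0):          # spin speeds in rad/s
            print(spin, whirl_frequencies(M, G, K, spin))  # forward/backward branches split with speed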

  10. CoreTSAR: Core Task-Size Adapting Runtime

    DOE PAGES

    Scogland, Thomas R. W.; Feng, Wu-chun; Rountree, Barry; ...

    2014-10-27

    Heterogeneity continues to increase at all levels of computing, with the rise of accelerators such as GPUs, FPGAs, and other co-processors into everything from desktops to supercomputers. As a consequence, efficiently managing such disparate resources has become increasingly complex. CoreTSAR seeks to reduce this complexity by adaptively worksharing parallel-loop regions across compute resources without requiring any transformation of the code within the loop. Lastly, our results show performance improvements of up to three-fold over a current state-of-the-art heterogeneous task scheduler as well as linear performance scaling from a single GPU to four GPUs for many codes. In addition, CoreTSAR demonstrates a robust ability to adapt to both a variety of workloads and underlying system configurations.

  11. Abstract feature codes: The building blocks of the implicit learning system.

    PubMed

    Eberhardt, Katharina; Esser, Sarah; Haider, Hilde

    2017-07-01

    According to the Theory of Event Coding (TEC; Hommel, Müsseler, Aschersleben, & Prinz, 2001), action and perception are represented in a shared format in the cognitive system by means of feature codes. In implicit sequence learning research, it is still common to make a conceptual difference between independent motor and perceptual sequences. This supposedly independent learning takes place in encapsulated modules (Keele, Ivry, Mayr, Hazeltine, & Heuer 2003) that process information along single dimensions. These dimensions have remained underspecified so far. It is especially not clear whether stimulus and response characteristics are processed in separate modules. Here, we suggest that feature dimensions as they are described in the TEC should be viewed as the basic content of modules of implicit learning. This means that the modules process all stimulus and response information related to certain feature dimensions of the perceptual environment. In 3 experiments, we investigated by means of a serial reaction time task the nature of the basic units of implicit learning. As a test case, we used stimulus location sequence learning. The results show that a stimulus location sequence and a response location sequence cannot be learned without interference (Experiment 2) unless one of the sequences can be coded via an alternative, nonspatial dimension (Experiment 3). These results support the notion that spatial location is one module of the implicit learning system and, consequently, that there are no separate processing units for stimulus versus response locations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. The Increased Sensitivity of Irregular Peripheral Canal and Otolith Vestibular Afferents Optimizes their Encoding of Natural Stimuli

    PubMed Central

    Schneider, Adam D.; Jamali, Mohsen; Carriot, Jerome; Chacron, Maurice J.

    2015-01-01

    Efficient processing of incoming sensory input is essential for an organism's survival. A growing body of evidence suggests that sensory systems have developed coding strategies that are constrained by the statistics of the natural environment. Consequently, it is necessary to first characterize neural responses to natural stimuli to uncover the coding strategies used by a given sensory system. Here we report for the first time the statistics of vestibular rotational and translational stimuli experienced by rhesus monkeys during natural (e.g., walking, grooming) behaviors. We find that these stimuli can reach intensities as high as 1500 deg/s and 8 G. Recordings from afferents during naturalistic rotational and linear motion further revealed strongly nonlinear responses in the form of rectification and saturation, which could not be accurately predicted by traditional linear models of vestibular processing. Accordingly, we used linear–nonlinear cascade models and found that these could accurately predict responses to naturalistic stimuli. Finally, we tested whether the statistics of natural vestibular signals constrain the neural coding strategies used by peripheral afferents. We found that both irregular otolith and semicircular canal afferents, because of their higher sensitivities, were more optimized for processing natural vestibular stimuli as compared with their regular counterparts. Our results therefore provide the first evidence supporting the hypothesis that the neural coding strategies used by the vestibular system are matched to the statistics of natural stimuli. PMID:25855169
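    A linear-nonlinear cascade of the kind referred to above can be sketched in a few lines: the stimulus is convolved with a linear temporal filter and then passed through a static nonlinearity combining rectification and saturation to predict firing rate. The filter shape, gain, baseline and saturation level are invented for illustration.

        import numpy as np

        def ln_predict(stimulus, kernel, baseline=20.0, gain=1.5, max_rate=500.0):
            """Linear stage: temporal convolution. Nonlinear stage: rectify and saturate."""
            drive = np.convolve(stimulus, kernel, mode="same")
            return np.clip(baseline + gain * drive, 0.0, max_rate)

        t = np.arange(0.0, 0.2, 0.001)
        kernel = np.exp(-t / 0.02) - 0.5 * np.exp(-t / 0.05)   # toy biphasic temporal filter
        stimulus = 400.0 * np.sin(2 * np.pi * 4.0 * np.arange(0.0, 2.0, 0.001))  # deg/s head velocity
        print(ln_predict(stimulus, kernel)[:5])   # predicted firing rates (spikes/s)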

  13. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough users groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  14. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough users groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA--both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  15. Institutional Controls and Educational Research.

    ERIC Educational Resources Information Center

    Homan, Roger

    1990-01-01

    Recognizing tendencies toward contract research and possible consequences, advocates creating a conduct code to regulate educational research and protect its integrity. Reports survey responses from 48 British institutions, showing no systematic code. States confidence in supervisory discretion currently guides research. Proposes a specific code…

  16. Separable concatenated codes with iterative map decoding for Rician fading channels

    NASA Technical Reports Server (NTRS)

    Lodge, J. H.; Young, R. J.

    1993-01-01

    Very efficient signalling in radio channels requires the design of very powerful codes having special structure suitable for practical decoding schemes. In this paper, powerful codes are obtained by combining comparatively simple convolutional codes to form multi-tiered 'separable' convolutional codes. The decoding of these codes, using separable symbol-by-symbol maximum a posteriori (MAP) 'filters', is described. It is known that this approach yields impressive results in non-fading additive white Gaussian noise channels. Interleaving is an inherent part of the code construction, and consequently, these codes are well suited for fading channel communications. Here, simulation results for communications over Rician fading channels are presented to support this claim.
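    To make the construction concrete, the sketch below combines two copies of a simple rate-1/2 convolutional encoder through a block interleaver, in the spirit of the multi-tiered "separable" construction described above. The generator polynomials and interleaver size are standard textbook choices rather than the specific codes of the paper, and only the encoder side is shown; decoding would apply symbol-by-symbol MAP passes over each tier.

        def conv_encode(bits, g1=0b111, g2=0b101):
            """Rate-1/2 convolutional encoder, constraint length 3 (generators 7, 5 octal)."""
            state, out = 0, []
            for b in bits:
                state = ((state << 1) | b) & 0b111
                out += [bin(state & g1).count("1") % 2, bin(state & g2).count("1") % 2]
            return out

        def block_interleave(bits, rows=4):
            """Write row-wise into a rows x cols array, read column-wise."""
            cols = (len(bits) + rows - 1) // rows
            padded = bits + [0] * (rows * cols - len(bits))
            return [padded[r * cols + c] for c in range(cols) for r in range(rows)]

        data = [1, 0, 1, 1, 0, 0, 1, 0]
        tier1 = conv_encode(data)                      # first tier of encoding
        tier2 = conv_encode(block_interleave(tier1))   # interleave, then second tier
        print(len(data), len(tier1), len(tier2))       # 8 -> 16 -> 32 coded bits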

  17. Changing Patient Classification System for Hospital Reimbursement in Romania

    PubMed Central

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-01-01

    Aim: To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Methods: Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). Results: The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians’ knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Conclusion: Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care. PMID:20564769

  18. Changing patient classification system for hospital reimbursement in Romania.

    PubMed

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-06-01

    To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians' knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case-mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case-mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care.

  19. A visual programming environment for the Navier-Stokes computer

    NASA Technical Reports Server (NTRS)

    Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David

    1988-01-01

    The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.

  20. The home care teaching and learning process in undergraduate health care degree courses.

    PubMed

    Hermann, Ana Paula; Lacerda, Maria Ribeiro; Maftum, Mariluci Alves; Bernardino, Elizabeth; Mello, Ana Lúcia Schaefer Ferreira de

    2017-07-01

    Home care, one of the services provided by the health system, requires health practitioners who are capable of understanding its specificities. This study aimed to build a substantive theory that describes experiences of home care teaching and learning during undergraduate degree courses in nursing, pharmacy, medicine, nutrition, dentistry and occupational therapy. A qualitative analysis was performed using the grounded theory approach based on the results of 63 semistructured interviews conducted with final year students, professors who taught subjects related to home care, and recent graduates working with home care, all participants in the above courses. The data was analyzed in three stages - open coding, axial coding and selective coding - resulting in the phenomenon Experiences of home care teaching and learning during the undergraduate health care degree courses. Its causes were described in the category Articulating knowledge of home care, strategies in the category Experiencing the unique nature of home care, intervening conditions in the category Understanding the multidimensional characteristics of home care, consequences in the category Changing thinking about home care training, and context in the category Understanding home care in the health system. Home care contributes towards the decentralization of hospital care.

  1. Work-family balance by women GP specialist trainees in Slovenia: a qualitative study.

    PubMed

    Petek, Davorina; Gajsek, Tadeja; Petek Ster, Marija

    2016-01-28

    Women physicians face many challenges while balancing their many roles: doctor, specialist trainee, mother and partner. The most opportune biological time for a woman to start a family coincides with a great deal of demands and requirements at work. In this study we explored the options and capabilities of women GP specialist trainees in coordinating their family and career. This is a phenomenological qualitative study. Ten GP specialist trainees from urban and rural areas were chosen by the purposive sampling technique, and semi-structured in-depth interviews were conducted, recorded, transcribed and analysed using a thematic analysis process. Open coding was performed and a book of codes was formed. Finally, we performed the process of code reduction by identifying the themes, which were compared, interpreted and organised into the highest analytical units--categories. One hundred fifty-five codes were identified in the analysis, which were grouped into eleven themes. The identified themes are: types, causes and consequences of burdens; work as pleasure and positive attitude toward self; priorities, planning and help; and understanding of superiors, disburdening and changes in specialisation. The themes were grouped into four large categories: burdens, empowerment, coordination and needs for improvement. Women specialist trainees encounter intense burdens at work and at home due to numerous demands and requirements during their specialisation training. In addition, there is the issue of work-family conflict. There are many consequences of this burden and strain; burnout stands out the most. In contrast, reconciliation of work and family life and needs can be successful. The key element is the empowerment of women doctors. The foremost necessary systemic solution is the reinforcement of general practitioners in primary health care and their understanding of the specialisation training scheme, with more flexible possibilities for time adaptation of specialist training.

  2. Study of steam condensation at sub-atmospheric pressure: setting a basic research using MELCOR code

    NASA Astrophysics Data System (ADS)

    Manfredini, A.; Mazzini, M.

    2017-11-01

    One of the most serious accidents that can occur in the experimental nuclear fusion reactor ITER is the break of one of the headers of the refrigeration system of the first wall of the Tokamak. This results in discharge of a water-steam mixture into the vacuum vessel (VV), with consequent pressurization of this container. To prevent the absolute pressure in the VV from exceeding 150 kPa, a system discharges the steam into a suppression pool at an absolute pressure of 4.2 kPa. The computer codes used to analyze such an incident (e.g. RELAP5 or MELCOR) are not validated experimentally for such conditions. Therefore, we planned a basic research program in order to obtain experimental data useful for validating the heat transfer correlations used in these codes. After a thorough literature search on this topic, ACTA, in collaboration with the staff of ITER, defined the experimental matrix and performed the design of the experimental apparatus. For the thermal-hydraulic design of the experiments, we executed a series of calculations with MELCOR. This code, however, was used in an unconventional mode, with the development of models suited respectively to low and high steam flow-rate tests. The article concludes with a discussion of the placement of the experimental data within the map featuring the phenomenon characteristics, showing the importance of the new knowledge acquired, particularly in the case of chugging.

  3. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care.

    PubMed

    Al-Hablani, Bader

    2017-01-01

    The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. The use of SNOMED CT in CDSSs can be considered to provide an answer to the problem of medical errors as well as to support preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSSs. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique to search SNOMED CT data and, consequently, to help improve preventive health services.

  4. Energy Cost Impact of Non-Residential Energy Code Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.

    2016-08-22

    The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology to develop and refine further energy cost impacts, specific to building type, system type, and climate location, is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so that only the most impactful requirements from an energy cost perspective are verified for every building, while a subset of the less impactful requirements are verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.
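    The ranking logic described above reduces to an expected-value calculation: for each requirement, multiply the probability of each non-compliance level by the energy-cost savings lost at that level and sum over levels. The requirement names, probabilities and dollar values in the sketch are invented for illustration.

        def expected_lost_savings(levels):
            """levels: list of (probability_of_occurrence, lost_savings_usd_per_year)."""
            return sum(p * loss for p, loss in levels)

        requirements = {
            "interior lighting power allowance": [(0.10, 300.0), (0.05, 900.0)],
            "economizer high-limit control":     [(0.20, 150.0), (0.02, 1200.0)],
            "wall insulation R-value":           [(0.08, 80.0)],
        }
        ranked = sorted(requirements.items(),
                        key=lambda kv: expected_lost_savings(kv[1]), reverse=True)
        for name, levels in ranked:
            print(f"{name}: {expected_lost_savings(levels):.0f} $/yr expected lost savings")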

  5. The Use of Automated SNOMED CT Clinical Coding in Clinical Decision Support Systems for Preventive Care

    PubMed Central

    Al-Hablani, Bader

    2017-01-01

    Objective The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. Method PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome Measures Outcome measures were the sensitivity or specificity of SNOMED CT coded data and the positive predictive value or negative predictive value of SNOMED CT coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. Results The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. Conclusion The use of SNOMED CT in CDSS can be considered to provide an answer to the problem of medical errors as well as for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSS. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique to search SNOMED CT data and, consequently, to help improve preventive health services. PMID:28566995

  6. RISKIND : an enhanced computer code for National Environmental Policy Act transportation consequence analysis

    DOT National Transportation Integrated Search

    1996-01-01

    The RISKIND computer program was developed for the analysis of radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel (SNF) or other radioactive ...

  7. Active imaging systems to see through adverse conditions: Light-scattering based models and experimental validation

    NASA Astrophysics Data System (ADS)

    Riviere, Nicolas; Ceolato, Romain; Hespel, Laurent

    2014-10-01

    Onera, the French aerospace lab, develops and models active imaging systems to understand the relevant physical phenomena affecting these systems' performance. As a consequence, efforts have been made on the propagation of a pulse through the atmosphere and on target geometries and surface properties. These imaging systems must operate at night in all ambient illumination and weather conditions in order to perform strategic surveillance for various worldwide operations. We have implemented codes for 2D and 3D laser imaging systems. As we aim to image a scene in the presence of rain, snow, fog or haze, we introduce such light-scattering effects into our numerical models and compare simulated images with measurements provided by commercial laser scanners.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, J.E.; Roussin, R.W.; Gilpin, H.

    A version of the CRAC2 computer code applicable for use in analyses of consequences and risks of reactor accidents in case work for environmental statements has been implemented for use on the Nuclear Regulatory Commission Data General MV/8000 computer system. Input preparation is facilitated through the use of an interactive computer program which operates on an IBM personal computer. The resulting CRAC2 input deck is transmitted to the MV/8000 by using an error-free file transfer mechanism. To facilitate the use of CRAC2 at NRC, relevant background material on input requirements and model descriptions has been extracted from four reports: "Calculations of Reactor Accident Consequences," Version 2, NUREG/CR-2326 (SAND81-1994); "CRAC2 Model Descriptions," NUREG/CR-2552 (SAND82-0342); "CRAC Calculations for Accident Sections of Environmental Statements," NUREG/CR-2901 (SAND82-1693); and "Sensitivity and Uncertainty Studies of the CRAC2 Computer Code," NUREG/CR-4038 (ORNL-6114). When this background information is combined with instructions on the input processor, this report provides a self-contained guide for preparing CRAC2 input data with a specific orientation toward applications on the MV/8000. 8 refs., 11 figs., 10 tabs.

  9. The molecular basis for attractive salt-taste coding in Drosophila.

    PubMed

    Zhang, Yali V; Ni, Jinfei; Montell, Craig

    2013-06-14

    Below a certain level, table salt (NaCl) is beneficial for animals, whereas excessive salt is harmful. However, it remains unclear how low- and high-salt taste perceptions are differentially encoded. We identified a salt-taste coding mechanism in Drosophila melanogaster. Flies use distinct types of gustatory receptor neurons (GRNs) to respond to different concentrations of salt. We demonstrated that a member of the newly discovered ionotropic glutamate receptor (IR) family, IR76b, functioned in the detection of low salt and was a Na(+) channel. The loss of IR76b selectively impaired the attractive pathway, leaving salt-aversive GRNs unaffected. Consequently, low salt became aversive. Our work demonstrated that the opposing behavioral responses to low and high salt were determined largely by an elegant bimodal switch system operating in GRNs.

  10. Sanctions Connected to Dress Code Violations in Secondary School Handbooks

    ERIC Educational Resources Information Center

    Workman, Jane E.; Freeburg, Elizabeth W.; Lentz-Hees, Elizabeth S.

    2004-01-01

    This study identifies and evaluates sanctions for dress code violations in secondary school handbooks. Sanctions, or consequences for breaking rules, vary along seven interrelated dimensions: source, formality, retribution, obtrusiveness, magnitude, severity, and pervasiveness. A content analysis of handbooks from 155 public secondary schools…

  11. Natural Language Interface for Safety Certification of Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2011-01-01

    Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.

  12. [Increasingly appropriate depiction of rheumatology for G-DRG reimbursement 2006].

    PubMed

    Lakomek, H J; Fiori, W; Buscham, K; Hülsemann, J; Köneke, N; Liman, W; Märker-Hermann, E; Roeder, N

    2006-02-01

    Starting with the second year of the so-called "convergence period", specialized rheumatological treatment is now represented by a specific DRG (197Z) in the German G-DRG system. The definition of this DRG is based on the procedure codes for the complex and multimodal treatment of rheumatological inpatients (OPS 8-983 and 8-986). This will result in a more appropriate reimbursement of rheumatological treatment. The implementation of specialized rheumatological treatment can be regarded as exemplary for the incorporation of medical specializations into DRG systems. The first step is the definition of the characteristics by procedure codes, which can consequently be utilized within the grouping algorithm. After an inadequate representation of a medical specialization within the DRG system has been demonstrated, a new DRG will be established. As no cost data were available, the calculation of a cost weight for the new G-DRG 197Z is not yet possible for 2006. Hence, reimbursement has to be negotiated between the individual hospital and the budget commission of the health insurers. In this context, the use of clinical pathways is considered helpful.

  13. Decoding the complex genetic causes of heart diseases using systems biology.

    PubMed

    Djordjevic, Djordje; Deshpande, Vinita; Szczesnik, Tomasz; Yang, Andrian; Humphreys, David T; Giannoulatou, Eleni; Ho, Joshua W K

    2015-03-01

    The pace of disease gene discovery is still much slower than expected, even with the use of cost-effective DNA sequencing and genotyping technologies. It is increasingly clear that many inherited heart diseases have a more complex polygenic aetiology than previously thought. Understanding the role of gene-gene interactions, epigenetics, and non-coding regulatory regions is becoming increasingly critical in predicting the functional consequences of genetic mutations identified by genome-wide association studies and whole-genome or exome sequencing. A systems biology approach is now being widely employed to systematically discover genes that are involved in heart diseases in humans or relevant animal models through bioinformatics. The overarching premise is that the integration of high-quality causal gene regulatory networks (GRNs), genomics, epigenomics, transcriptomics and other genome-wide data will greatly accelerate the discovery of the complex genetic causes of congenital and complex heart diseases. This review summarises state-of-the-art genomic and bioinformatics techniques that are used in accelerating the pace of disease gene discovery in heart diseases. Accompanying this review, we provide an interactive web-resource for systems biology analysis of mammalian heart development and diseases, CardiacCode ( http://CardiacCode.victorchang.edu.au/ ). CardiacCode features a dataset of over 700 pieces of manually curated genetic or molecular perturbation data, which enables the inference of a cardiac-specific GRN of 280 regulatory relationships between 33 regulator genes and 129 target genes. We believe this growing resource will fill an urgent unmet need to fully realise the true potential of predictive and personalised genomic medicine in tackling human heart disease.

  14. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  15. Studies of Planet Formation using a Hybrid N-body + Planetesimal Code

    NASA Technical Reports Server (NTRS)

    Kenyon, Scott J.; Bromley, Benjamin C.; Salamon, Michael (Technical Monitor)

    2005-01-01

    The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: 1) icy planets - models for icy planet formation will demonstrate how the physical properties of debris disks, including the Kuiper Belt in our solar system, depend on initial conditions and input physics; and 2) terrestrial planets - calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment. During the past year, we made progress on each issue. Papers published in 2004 are summarized. Summaries of work to be completed during the first half of 2005 and work planned for the second half of 2005 are included.

  16. Perceptual consequences of disrupted auditory nerve activity.

    PubMed

    Zeng, Fan-Gang; Kong, Ying-Yee; Michalewski, Henry J; Starr, Arnold

    2005-06-01

    Perceptual consequences of disrupted auditory nerve activity were systematically studied in 21 subjects who had been clinically diagnosed with auditory neuropathy (AN), a recently defined disorder characterized by normal outer hair cell function but disrupted auditory nerve function. Neurological and electrophysiological evidence suggests that disrupted auditory nerve activity is due to desynchronized or reduced neural activity or both. Psychophysical measures showed that the disrupted neural activity has minimal effects on intensity-related perception, such as loudness discrimination, pitch discrimination at high frequencies, and sound localization using interaural level differences. In contrast, the disrupted neural activity significantly impairs timing-related perception, such as pitch discrimination at low frequencies, temporal integration, gap detection, temporal modulation detection, backward and forward masking, signal detection in noise, binaural beats, and sound localization using interaural time differences. These perceptual consequences are the opposite of what is typically observed in cochlear-impaired subjects who have impaired intensity perception but relatively normal temporal processing after taking their impaired intensity perception into account. These differences in perceptual consequences between auditory neuropathy and cochlear damage suggest the use of different neural codes in auditory perception: a suboptimal spike count code for intensity processing, a synchronized spike code for temporal processing, and a duplex code for frequency processing. We also proposed two underlying physiological models based on desynchronized and reduced discharge in the auditory nerve to successfully account for the observed neurological and behavioral data. These methods and measures cannot differentiate between these two AN models, but future studies using electric stimulation of the auditory nerve via a cochlear implant might. These results not only show the unique contribution of neural synchrony to sensory perception but also provide guidance for translational research in terms of better diagnosis and management of human communication disorders.

  17. The Medicare Policy of Payment Adjustment for Health Care-Associated Infections: Perspectives on Potential Unintended Consequences

    PubMed Central

    Hartmann, Christine W.; Hoff, Timothy; Palmer, Jennifer A.; Wroe, Peter; Dutta-Linn, M. Maya; Lee, Grace

    2014-01-01

    In 2008, the Centers for Medicare & Medicaid Services introduced a new policy to adjust payment to hospitals for health care-associated infections (HAIs) not present on admission. Interviews with 36 hospital infection preventionists across the United States explored the perspectives of these key stakeholders on the potential unintended consequences of the current policy. Responses were analyzed using an iterative coding process where themes were developed from the data. Participants’ descriptions of unintended impacts of the policy centered around three themes. Results suggest the policy has focused more attention on targeted HAIs and has affected hospital staff; relatively fewer systems changes have ensued. Some consequences of the policy, such as infection preventionists having less time to devote to HAIs other than those in the policy or having less time to implement prevention activities, may have undesirable effects on HAI rates if hospitals do not recognize and react to potential time and resource gaps. PMID:21810797

  18. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  19. Darwinism and ethology. The role of natural selection in animals and humans.

    PubMed

    Gervet, J; Soleilhavoup, M

    1997-11-01

    The role of behaviour in biological evolution is examined within the context of Darwinism. All Darwinian models are based on the distinction between two mechanisms: one that permits faithful transmission of a feature from one generation to another, and another that differentially regulates the degree of this transmission. Behaviour plays a minimal role as an agent of transmission in the greater part of the animal kingdom; by contrast, the forms it may assume strongly influence the mechanisms of selection regulating the different rates of transmission. We consider the decisive feature of the human species to be the existence of a phenotypical system of cultural coding characterized by a precision and reliability that are the distinctive features of genetic coding in animals. We examine the consequences for the application of the Darwinian model to human history.

  20. SKIRT: Hybrid parallelization of radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.

    2017-07-01

    We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.

  1. Information theory of adaptation in neurons, behavior, and mood.

    PubMed

    Sharpee, Tatyana O; Calhoun, Adam J; Chalasani, Sreekanth H

    2014-04-01

    The ability to make accurate predictions of future stimuli and of the consequences of one's actions is crucial for survival and appropriate decision-making. These predictions are constantly being made at different levels of the nervous system. This is evidenced by adaptation to stimulus parameters in sensory coding, and in learning of an up-to-date model of the environment at the behavioral level. This review will discuss recent findings that actions of neurons and animals are selected based on detailed stimulus history in such a way as to maximize information for achieving the task at hand. Information maximization dictates not only how sensory coding should adapt to various statistical aspects of stimuli, but also that reward function should adapt to match the predictive information from past to future. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Hybrid finite element/waveguide mode analysis of passive RF devices

    NASA Astrophysics Data System (ADS)

    McGrath, Daniel T.

    1993-07-01

    A numerical solution for time-harmonic electromagnetic fields in two-port passive radio frequency (RF) devices has been developed, implemented in a computer code, and validated. Vector finite elements are used to represent the fields in the device interior, and field continuity across waveguide apertures is enforced by matching the interior solution to a sum of waveguide modes. Consequently, the mesh may end at the aperture instead of extending into the waveguide. The report discusses the variational formulation and its reduction to a linear system using Galerkin's method. It describes the computer code, including its interface to commercial CAD software used for geometry generation. It presents validation results for waveguide discontinuities, coaxial transitions, and microstrip circuits. They demonstrate that the method is an effective and versatile tool for predicting the performance of passive RF devices.
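
    As a minimal illustration of reducing a variational formulation to a linear system with Galerkin's method, the sketch below assembles and solves a one-dimensional Poisson problem with linear elements. It is only a toy analogue of the vector finite-element and waveguide mode-matching formulation described above, not the reported code.

```python
# Minimal 1D Galerkin illustration (not the 3D vector-FEM code from the report):
# solve -u'' = 1 on (0, 1) with u(0) = u(1) = 0 using piecewise-linear elements.
import numpy as np

n = 20                      # number of elements
h = 1.0 / n                 # uniform element size
nodes = np.linspace(0.0, 1.0, n + 1)

# Assemble the global stiffness matrix K and load vector f from element contributions.
K = np.zeros((n + 1, n + 1))
f = np.zeros(n + 1)
for e in range(n):
    ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
    fe = (h / 2.0) * np.array([1.0, 1.0])                   # element load (source = 1)
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += ke
    f[idx] += fe

# Apply homogeneous Dirichlet boundary conditions by restricting to interior nodes.
interior = slice(1, n)
u = np.zeros(n + 1)
u[interior] = np.linalg.solve(K[interior, interior], f[interior])

exact = 0.5 * nodes * (1.0 - nodes)          # analytic solution for comparison
print("max nodal error:", np.abs(u - exact).max())
```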

  3. Force-free electrodynamics in dynamical curved spacetimes

    NASA Astrophysics Data System (ADS)

    McWilliams, Sean

    2015-04-01

    We present results on our study of force-free electrodynamics in curved spacetimes. Specifically, we present several improvements to what has become the established set of evolution equations, and we apply these to study the nonlinear stability of analytically known force-free solutions for the first time. We implement our method in a new pseudo-spectral code built on top of the SpEC code for evolving dynamic spacetimes. We then revisit these known solutions and attempt to clarify some interesting properties that render them analytically tractable. Finally, we preview some new work that similarly revisits the established approach to solving another problem in numerical relativity: the post-merger recoil from asymmetric gravitational-wave emission. These new results may have significant implications for the parameter dependence of recoils, and consequently for the statistical expectations for recoil velocities of merged systems.

  4. 25 CFR 11.1212 - Consequences of disobedience or interference.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 25 (Indians), Vol. 1, revised as of April 1, 2010. Bureau of Indian Affairs, Department of the Interior; Law and Order: Courts of Indian Offenses and Law and Order Code; Child Protection and Domestic Violence Procedures, § 11.1212 Consequences of disobedience or interference...

  5. 25 CFR 11.1212 - Consequences of disobedience or interference.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    Title 25 (Indians), Vol. 1, revised as of April 1, 2011. Bureau of Indian Affairs, Department of the Interior; Law and Order: Courts of Indian Offenses and Law and Order Code; Child Protection and Domestic Violence Procedures, § 11.1212 Consequences of disobedience or interference...

  6. [Penal treatment and rehabilitation of the convict in the new Penal Code of San Marino. Juridical and criminological aspects].

    PubMed

    Sclafani, F; Starace, A

    1978-01-01

    The Republic of San Marino adopted a new Penal Code which came into force on 1st January 1975; it replaced the former one of 15th Sept. 1865. After having stated the typical aspects of the Penal Procedure System therein enforceable, the Authors examine the rules concerning criminal responsibility and the danger of committing new crimes. They point out and criticize the relevant contradictions. In explaining the measures regarding punishment and educational rehabilitation provided for by San Marino's legal system, the Authors later consider them from a juridical and criminological viewpoint. While some reforms deserve approval (for example: biopsychical inquiry into the accused, probation, week-end imprisonment, fines according to the income of the condemned, etc.), the Authors stress that some legal provisions may appear useless and unrealistic when one considers the environmental conditions of the little Republic. The Authors conclude that Penal Procedure Law is not in accordance with Penal Law and, consequently, they hope that a new reform will be grounded on the needs arising from the crimes perpetrated in loco. It shall, however, be necessary to plan a co-ordination between the two Codes within a framework of de-criminalization of many acts which are now punishable as crimes.

  7. The Vestibular System Implements a Linear–Nonlinear Transformation In Order to Encode Self-Motion

    PubMed Central

    Massot, Corentin; Schneider, Adam D.; Chacron, Maurice J.; Cullen, Kathleen E.

    2012-01-01

    Although it is well established that the neural code representing the world changes at each stage of a sensory pathway, the transformations that mediate these changes are not well understood. Here we show that self-motion (i.e. vestibular) sensory information encoded by VIIIth nerve afferents is integrated nonlinearly by post-synaptic central vestibular neurons. This response nonlinearity was characterized by a strong (∼50%) attenuation in neuronal sensitivity to low frequency stimuli when presented concurrently with high frequency stimuli. Using computational methods, we further demonstrate that a static boosting nonlinearity in the input-output relationship of central vestibular neurons accounts for this unexpected result. Specifically, when low and high frequency stimuli are presented concurrently, this boosting nonlinearity causes an intensity-dependent bias in the output firing rate, thereby attenuating neuronal sensitivities. We suggest that nonlinear integration of afferent input extends the coding range of central vestibular neurons and enables them to better extract the high frequency features of self-motion when embedded with low frequency motion during natural movements. These findings challenge the traditional notion that the vestibular system uses a linear rate code to transmit information and have important consequences for understanding how the representation of sensory information changes across sensory pathways. PMID:22911113

  8. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  9. Summary of evidence for an anticodonic basis for the origin of the genetic code

    NASA Technical Reports Server (NTRS)

    Lacey, J. C., Jr.; Mullins, D. W., Jr.

    1981-01-01

    This article summarizes data supporting the hypothesis that the genetic code origin was based on relationships (probably affinities) between amino acids and their anticodon nucleotides. Selective activation seems to follow from selective affinity and consequently, incorporation of amino acids into peptides can also be selective. It is suggested that these selectivities in affinity and activation, coupled with the base pairing specificities, allowed the origin of the code and the process of translation.

  10. Anticipatory anxiety disrupts neural valuation during risky choice.

    PubMed

    Engelmann, Jan B; Meyer, Friederike; Fehr, Ernst; Ruff, Christian C

    2015-02-18

    Incidental negative emotions unrelated to the current task, such as background anxiety, can strongly influence decisions. This is most evident in psychiatric disorders associated with generalized emotional disturbances. However, the neural mechanisms by which incidental emotions may affect choices remain poorly understood. Here we study the effects of incidental anxiety on human risky decision making, focusing on both behavioral preferences and their underlying neural processes. Although observable choices remained stable across affective contexts with high and low incidental anxiety, we found a clear change in neural valuation signals: during high incidental anxiety, activity in ventromedial prefrontal cortex and ventral striatum showed a marked reduction in (1) neural coding of the expected subjective value (ESV) of risky options, (2) prediction of observed choices, (3) functional coupling with other areas of the valuation system, and (4) baseline activity. At the same time, activity in the anterior insula showed an increase in coding the negative ESV of risky lotteries, and this neural activity predicted whether the risky lotteries would be rejected. This pattern of results suggests that incidental anxiety can shift the focus of neural valuation from possible positive consequences to anticipated negative consequences of choice options. Moreover, our findings show that these changes in neural value coding can occur in the absence of changes in overt behavior. This suggests a possible pathway by which background anxiety may lead to the development of chronic reward desensitization and a maladaptive focus on negative cognitions, as prevalent in affective and anxiety disorders. Copyright © 2015 the authors 0270-6474/15/353085-15$15.00/0.

  11. SecPop Version 4: Sector Population Land Fraction and Economic Estimation Program: Users' Guide, Model Manual and Verification Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia

    In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was released in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop version 4.3, which uses 2010 population data and both 2007 and 2012 economic data. It is also compatible with 2000 census and 2002 economic data. At the time of this writing, the current version of SecPop is 4.3.0, and that version is described herein. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. This report contains appendices which describe the development of the 2010 census file, 2007 county file, and 2012 county file. Finally, an appendix is included that describes the validation assessments performed.
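
    The essential idea behind a user-defined polar site grid is to accumulate population into (radial ring, angular sector) cells centred on the site. The sketch below illustrates that binning with hypothetical ring edges, sector count, and population points; it is not SecPop's actual algorithm, data, or file format.

```python
# Illustrative polar-grid binning (hypothetical data; not SecPop's algorithm or
# site-file format): accumulate population points into radial rings and
# compass sectors centred on a site.
import math
from collections import defaultdict

ring_edges_km = [1.0, 2.0, 5.0, 10.0]      # outer radius of each ring (assumed)
n_sectors = 16                              # number of compass sectors (assumed)

# Hypothetical population points: (east_km, north_km, persons) relative to the site.
points = [(0.4, 0.2, 120), (-3.0, 1.5, 800), (6.0, -7.0, 45), (0.1, -0.9, 300)]

grid = defaultdict(int)                     # (ring_index, sector_index) -> population
for east, north, persons in points:
    r = math.hypot(east, north)
    ring = next((i for i, edge in enumerate(ring_edges_km) if r <= edge), None)
    if ring is None:
        continue                            # point lies beyond the outermost ring
    azimuth = math.degrees(math.atan2(east, north)) % 360.0   # 0 deg = due north
    sector = int(azimuth // (360.0 / n_sectors))
    grid[(ring, sector)] += persons

for (ring, sector), pop in sorted(grid.items()):
    print(f"ring {ring}, sector {sector}: {pop} persons")
```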

  12. The effects of collateral consequences of criminal involvement on employment, use of Temporary Assistance for Needy Families, and health

    PubMed Central

    Kneipp, Shawn M.

    2017-01-01

    Criminal convictions are often associated with collateral consequences that limit access to the forms of employment and social services on which disadvantaged women most frequently rely – regardless of the severity of the offense. These consequences may play an important role in perpetuating health disparities by socioeconomic status and gender. We examined the extent to which research studies to date have assessed whether a criminal conviction might influence women’s health by limiting access to Temporary Assistance for Needy Families (TANF) and employment, as a secondary, or “collateral” criminal conviction-related consequence. We reviewed 434 peer-reviewed journal articles retrieved from three electronic article databases and 197 research reports from three research organizations. Two reviewers independently extracted data from each eligible article or report using a standardized coding scheme. Of the sixteen eligible studies included in the review, most were descriptive. None explored whether receiving TANF modified health outcomes, despite its potential to do so. Researchers to date have not fully examined the causal pathways that could link employment, receiving TANF, and health, especially for disadvantaged women. Future research is needed to address this gap and to understand better the potential consequences of the criminal justice system involvement on the health of this vulnerable population. PMID:25905904

  13. The Effects of Collateral Consequences of Criminal Involvement on Employment, Use of Temporary Assistance for Needy Families, and Health.

    PubMed

    Sheely, Amanda; Kneipp, Shawn M

    2015-01-01

    Criminal convictions are often associated with collateral consequences that limit access to the forms of employment and social services on which disadvantaged women most frequently rely--regardless of the severity of the offense. These consequences may play an important role in perpetuating health disparities by socioeconomic status and gender. We examined the extent to which research studies to date have assessed whether a criminal conviction might influence women's health by limiting access to Temporary Assistance for Needy Families (TANF) and employment, as a secondary, or "collateral" criminal conviction-related consequence. We reviewed 434 peer-reviewed journal articles retrieved from three electronic article databases and 197 research reports from three research organizations. Two reviewers independently extracted data from each eligible article or report using a standardized coding scheme. Of the sixteen eligible studies included in the review, most were descriptive. None explored whether receiving TANF modified health outcomes, despite its potential to do so. Researchers to date have not fully examined the causal pathways that could link employment, receiving TANF, and health, especially for disadvantaged women. Future research is needed to address this gap and to understand better the potential consequences of the criminal justice system involvement on the health of this vulnerable population.

  14. Triangulating case-finding tools for patient safety surveillance: a cross-sectional case study of puncture/laceration.

    PubMed

    Taylor, Jennifer A; Gerwin, Daniel; Morlock, Laura; Miller, Marlene R

    2011-12-01

    To evaluate the need for triangulating case-finding tools in patient safety surveillance. This study applied four case-finding tools to error-associated patient safety events to identify and characterise the spectrum of events captured by these tools, using puncture or laceration as an example for in-depth analysis. Retrospective hospital discharge data were collected for calendar year 2005 (n=48,418) from a large, urban medical centre in the USA. The study design was cross-sectional and used data linkage to identify the cases captured by each of four case-finding tools. Three case-finding tools (International Classification of Diseases external (E) and nature (N) of injury codes, Patient Safety Indicators (PSI)) were applied to the administrative discharge data to identify potential patient safety events. The fourth tool was Patient Safety Net, a web-based voluntary patient safety event reporting system. The degree of mutual exclusion among detection methods was substantial. For example, when linking puncture or laceration on unique identifiers, out of 447 potential events, 118 were identical between PSI and E-codes, 152 were identical between N-codes and E-codes and 188 were identical between PSI and N-codes. Only 100 events that were identified by PSI, E-codes and N-codes were identical. Triangulation of multiple tools through data linkage captures potential patient safety events most comprehensively. Existing detection tools target patient safety domains differently, and consequently capture different occurrences, necessitating the integration of data from a combination of tools to fully estimate the total burden.
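
    The triangulation step reduces to linking the events flagged by each tool on a unique identifier and counting the overlaps. The sketch below shows that linkage with hypothetical identifiers; the IDs and counts are placeholders, not the study data.

```python
# Illustrative triangulation of case-finding tools by linking flagged discharges
# on a unique identifier (hypothetical IDs; not the study data).
psi_cases = {"a01", "a02", "a03", "a05"}        # flagged by Patient Safety Indicators
e_code_cases = {"a02", "a03", "a04"}            # flagged by ICD external-cause codes
n_code_cases = {"a01", "a02", "a04", "a06"}     # flagged by ICD nature-of-injury codes

all_three = psi_cases & e_code_cases & n_code_cases    # identified by every tool
any_tool = psi_cases | e_code_cases | n_code_cases     # identified by at least one tool

print("PSI and E-codes:", len(psi_cases & e_code_cases))
print("E-codes and N-codes:", len(e_code_cases & n_code_cases))
print("PSI and N-codes:", len(psi_cases & n_code_cases))
print("all three tools:", len(all_three))
print("captured by any tool:", len(any_tool))
```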

  15. Quality of head injury coding from autopsy reports with AIS © 2005 update 2008.

    PubMed

    Schick, Sylvia; Humrich, Anton; Graw, Matthias

    2018-02-28

    Objective: Coding injuries from autopsy reports of traffic accident victims according to the Abbreviated Injury Scale AIS © 2005 update 2008 [1] is quite time-consuming. The suspicion arose that many issues leading to discussion between coder and control reader were based on information required by the AIS that was not documented in the autopsy reports. To quantify this suspicion, we introduced an AIS-detail-indicator (AIS-DI). To each injury in the AIS Codebook one letter from A to N was assigned indicating the level of detail. Rules were formulated to obtain repeatable assignments. This scheme was applied to a selection of 149 multiply injured traffic fatalities. The frequencies of "not A" codes were calculated for each body region and it was analysed why the most detailed level A had not been coded. As a first finding, the results of the head region are presented. 747 AIS head injury codes were found in 137 traffic fatalities, and 60% of these injuries were coded with an AIS-DI of level A. There are three different explanations for codes of AIS-DI "not A": Group 1 "Missing information in autopsy report" (5%), Group 2 "Clinical data required by AIS" (20%), and Group 3 "AIS system determined" (15%). Groups 1 and 2 show consequences for the ISS in 25 cases. Other body regions might perform differently. The AIS-DI can indicate the quality of the underlying data basis and, depending on the aims of different AIS users, it can be a helpful tool for quality checks.
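
    The reported percentages come from a simple tally of AIS-DI letters per body region. A minimal sketch of that tally is shown below with a handful of hypothetical (region, detail-letter) records; it does not reproduce the study's data.

```python
# Illustrative tally of AIS detail-indicator (AIS-DI) letters by body region
# (hypothetical records, not the study data): share of codes below level "A".
from collections import Counter, defaultdict

records = [("head", "A"), ("head", "C"), ("head", "A"), ("thorax", "B"),
           ("head", "N"), ("thorax", "A"), ("abdomen", "A")]

by_region = defaultdict(Counter)
for region, detail in records:
    by_region[region][detail] += 1

for region, counts in by_region.items():
    total = sum(counts.values())
    not_a = total - counts["A"]
    print(f"{region}: {total} codes, {100.0 * not_a / total:.0f}% below detail level A")
```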

  16. Revision of seismic design codes corresponding to building damages in the ``5.12'' Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yayong

    2010-06-01

    A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.

  17. An efficient MPI/OpenMP parallelization of the Hartree–Fock–Roothaan method for the first generation of Intel® Xeon Phi™ processor architecture

    DOE PAGES

    Mironov, Vladimir; Moskovsky, Alexander; D’Mello, Michael; ...

    2017-10-04

    The Hartree-Fock (HF) method in the quantum chemistry package GAMESS represents one of the most irregular algorithms in computation today. Major steps in the calculation are the irregular computation of electron repulsion integrals (ERIs) and the building of the Fock matrix. These are the central components of the main Self Consistent Field (SCF) loop, the key hotspot in Electronic Structure (ES) codes. By threading the MPI ranks in the official release of the GAMESS code, we not only speed up the main SCF loop (4x to 6x for large systems), but also achieve a significant (>2x) reduction in the overall memory footprint. These improvements are a direct consequence of memory access optimizations within the MPI ranks. We benchmark our implementation against the official release of the GAMESS code on the Intel® Xeon Phi™ supercomputer. Here, scaling numbers are reported on up to 7,680 cores on Intel Xeon Phi coprocessors.

  18. From chemical metabolism to life: the origin of the genetic coding process

    PubMed Central

    2017-01-01

    Looking for origins is so deeply rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question to account for life is to understand how chemical metabolism that began with amino acids was progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces to carry over the basic metabolic pathways that drive the pursuit of life. PMID:28684991

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  20. Mechanisms and consequences of alternative polyadenylation

    PubMed Central

    Di Giammartino, Dafne Campigli; Nishida, Kensei; Manley, James L.

    2011-01-01

    Summary Alternative polyadenylation (APA) is emerging as a widespread mechanism used to control gene expression. Like alternative splicing, usage of alternative poly(A) sites allows a single gene to encode multiple mRNA transcripts. In some cases, this changes the mRNA coding potential; in other cases, the code remains unchanged but the 3’UTR length is altered, influencing the fate of mRNAs in several ways, for example, by altering the availability of RNA binding protein sites and microRNA binding sites. The mechanisms governing both global and gene-specific APA are only starting to be deciphered. Here we review what is known about these mechanisms and the functional consequences of alternative polyadenylation. PMID:21925375

  1. Ancient DNA sequence revealed by error-correcting codes.

    PubMed

    Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo

    2015-07-10

    A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.
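
    As a loose illustration of the error-correcting-code idea that underlies such generator algorithms, the sketch below maps nucleotides to 2-bit symbols, protects a dinucleotide with a Hamming(7,4) code, and corrects a single simulated "mutation". It is a generic textbook construction, not the published DNA-SGA.

```python
# Loose illustration of applying an error-correcting code to a DNA fragment
# (generic Hamming(7,4) example; not the published DNA-SGA algorithm).
NUC_TO_BITS = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}
BITS_TO_NUC = {v: k for k, v in NUC_TO_BITS.items()}

def hamming_encode(d):
    """Encode 4 data bits as the 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Correct a single-bit error (if any) and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # parity check over positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3        # 0 means no error detected
    if error_pos:
        c = list(c)
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

dinucleotide = "AG"
bits = [b for nuc in dinucleotide for b in NUC_TO_BITS[nuc]]
codeword = hamming_encode(bits)
codeword[5] ^= 1                             # simulate a single "mutation" in the codeword
restored = hamming_decode(codeword)
decoded = "".join(BITS_TO_NUC[tuple(restored[i:i + 2])] for i in (0, 2))
print(decoded)                               # prints "AG" after correction
```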

  2. Ancient DNA sequence revealed by error-correcting codes

    PubMed Central

    Brandão, Marcelo M.; Spoladore, Larissa; Faria, Luzinete C. B.; Rocha, Andréa S. L.; Silva-Filho, Marcio C.; Palazzo, Reginaldo

    2015-01-01

    A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code. PMID:26159228

  3. Quartz crystal microbalance detection of DNA single-base mutation based on monobase-coded cadmium tellurium nanoprobe.

    PubMed

    Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo

    2011-01-01

    A new method for the detection of point mutation in DNA based on monobase-coded cadmium tellurium nanoprobes and the quartz crystal microbalance (QCM) technique was reported. A point mutation (single-base, adenine, thymine, cytosine, and guanine, namely, A, T, C and G, mutation in DNA strand, respectively) DNA QCM sensor was fabricated by immobilizing single-base mutation DNA modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe(3)O(4)/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was distinguished by detecting the decreases of the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. This proposed detection strategy for point mutation in DNA proved to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry

  4. Development of Northeast Asia Nuclear Power Plant Accident Simulator.

    PubMed

    Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff

    2017-06-15

    A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate consequences from a major accident that could occur at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training and potentially for operational support in Korea's national emergency preparedness and response program. The system of codes, the Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
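
    The three-module structure (source term, then dispersion, then dose) can be pictured as a simple pipeline. The toy sketch below chains three placeholder functions in that order; the models, function names, and numbers are all hypothetical illustrations, not NANAS internals.

```python
# Toy sketch of a source-term -> dispersion -> dose pipeline, mirroring the
# three-module structure described for NANAS; all functions and values here
# are hypothetical placeholders, not the simulator's actual methods.
import math

def estimate_source_term(core_inventory_bq, release_fraction):
    """Module 1: released activity (Bq) for a hypothetical accident scenario."""
    return core_inventory_bq * release_fraction

def dispersion_factor(sigma_y, sigma_z, wind_speed):
    """Module 2: crude centreline Gaussian-plume chi/Q (s/m^3); the sigmas would
    normally come from stability-class curves at the downwind distance."""
    return 1.0 / (math.pi * sigma_y * sigma_z * wind_speed)

def dose_estimate(released_bq, chi_q, dose_coeff_sv_per_bq_s_m3):
    """Module 3: dose (Sv) from the time-integrated air concentration."""
    return released_bq * chi_q * dose_coeff_sv_per_bq_s_m3

release = estimate_source_term(core_inventory_bq=1.0e18, release_fraction=1.0e-4)
chi_q = dispersion_factor(sigma_y=5000.0, sigma_z=800.0, wind_speed=5.0)
dose = dose_estimate(release, chi_q, dose_coeff_sv_per_bq_s_m3=1.0e-17)
print(f"hypothetical dose estimate: {dose:.3e} Sv")
```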

  5. Clinical potential of oligonucleotide-based therapeutics in the respiratory system.

    PubMed

    Moschos, Sterghios A; Usher, Louise; Lindsay, Mark A

    2017-01-01

    The discovery of an ever-expanding plethora of coding and non-coding RNAs with nodal and causal roles in the regulation of lung physiology and disease is reinvigorating interest in the clinical utility of the oligonucleotide therapeutic class. This is strongly supported through recent advances in nucleic acids chemistry, synthetic oligonucleotide delivery and viral gene therapy that have succeeded in bringing to market at least three nucleic acid-based drugs. As a consequence, multiple new candidates such as RNA interference modulators, antisense, and splice switching compounds are now progressing through clinical evaluation. Here, manipulation of RNA for the treatment of lung disease is explored, with emphasis on robust pharmacological evidence aligned to the five pillars of drug development: exposure to the appropriate tissue, binding to the desired molecular target, evidence of the expected mode of action, activity in the relevant patient population and commercially viable value proposition. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. High-Frequency Network Oscillations in Cerebellar Cortex

    PubMed Central

    Middleton, Steven J.; Racca, Claudia; Cunningham, Mark O.; Traub, Roger D.; Monyer, Hannah; Knöpfel, Thomas; Schofield, Ian S.; Jenkins, Alistair; Whittington, Miles A.

    2016-01-01

    SUMMARY Both cerebellum and neocortex receive input from the somatosensory system. Interaction between these regions has been proposed to underpin the correct selection and execution of motor commands, but it is not clear how such interactions occur. In neocortex, inputs give rise to population rhythms, providing a spatiotemporal coding strategy for inputs and consequent outputs. Here, we show that similar patterns of rhythm generation occur in cerebellum during nicotinic receptor subtype activation. Both gamma oscillations (30–80 Hz) and very fast oscillations (VFOs, 80–160 Hz) were generated by intrinsic cerebellar cortical circuitry in the absence of functional glutamatergic connections. As in neocortex, gamma rhythms were dependent on GABAA receptor-mediated inhibition, whereas VFOs required only nonsynaptically connected intercellular networks. The ability of cerebellar cortex to generate population rhythms within the same frequency bands as neocortex suggests that they act as a common spatiotemporal code within which corticocerebellar dialog may occur. PMID:18549787

  7. RNA editing in Drosophila melanogaster: new targets and functional consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapleton, Mark; Carlson, Joseph W.; Celniker, Susan E.

    2006-09-05

    Adenosine deaminases that act on RNA (ADARs) catalyze the site-specific conversion of adenosine to inosine in primary mRNA transcripts. These re-coding events affect coding potential, splice-sites, and stability of mature mRNAs. ADAR is an essential gene and studies in mouse, C. elegans, and Drosophila suggest its primary function is to modify adult behavior by altering signaling components in the nervous system. By comparing the sequence of isogenic cDNAs to genomic DNA, we have identified and experimentally verified 27 new targets of Drosophila ADAR. Our analyses lead us to identify new classes of genes whose transcripts are targets of ADAR including components of the actin cytoskeleton, and genes involved in ion homeostasis and signal transduction. Our results indicate that editing in Drosophila increases the diversity of the proteome, and does so in a manner that has direct functional consequences on protein function.

  8. Simulation of a beam rotation system for a spallation source

    NASA Astrophysics Data System (ADS)

    Reiss, Tibor; Reggiani, Davide; Seidel, Mike; Talanov, Vadim; Wohlmuther, Michael

    2015-04-01

    With a nominal beam power of nearly 1 MW on target, the Swiss Spallation Neutron Source (SINQ) ranks among the world's most powerful spallation neutron sources. The proton beam transport to the SINQ target is carried out exclusively by means of linear magnetic elements. In the transport line to SINQ, the beam is scattered in two meson production targets; as a consequence, at the SINQ target entrance the beam shape can be described by Gaussian distributions in transverse x and y directions with tails cut short by collimators. This leads to a highly nonuniform power distribution inside the SINQ target, giving rise to thermal and mechanical stresses. In view of a future proton beam intensity upgrade, the possibility of homogenizing the beam distribution by means of a fast beam rotation system is currently under investigation. Important aspects which need to be studied are the impact of a rotating proton beam on the resulting neutron spectra, spatial flux distributions and additional—previously not present—proton losses causing unwanted activation of accelerator components. Hence a new source description method was developed for the radiation transport code MCNPX. This new feature makes direct use of the results from the proton beam optics code TURTLE. Its advantage over existing MCNPX source options is that all phase space information and correlations of each primary beam particle computed with TURTLE are preserved and transferred to MCNPX. Simulations of the different beam distributions together with their consequences in terms of neutron production are presented in this publication. Additionally, a detailed description of the coupling method between TURTLE and MCNPX is provided.

  9. Violence and its injury consequences in American movies

    PubMed Central

    McArthur, David L; Peek-Asa, Corinne; Webb, Theresa; Fisher, Kevin; Cook, Bernard; Browne, Nick; Kraus, Jess

    2000-01-01

    Objectives To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Methods Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10986175

  10. Violence and its injury consequences in American movies: a public health perspective.

    PubMed

    McArthur, D L; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J

    2000-09-01

    To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.

  11. Multicarrier airborne ultrasound transmission with piezoelectric transducers.

    PubMed

    Ens, Alexander; Reindl, Leonhard M

    2015-05-01

    In decentralized localization systems, the received signal has to be assigned to the sender. Therefore, longrange airborne ultrasound communication enables the transmission of an identifier of the sender within the ultrasound signal to the receiver. Further, in areas with high electromagnetic noise or electromagnetic free areas, ultrasound communication is an alternative. Using code division multiple access (CDMA) to transmit data is ineffective in rooms due to high echo amplitudes. Further, piezoelectric transducers generate a narrow-band ultrasound signal, which limits the data rate. This work shows the use of multiple carrier frequencies in orthogonal frequency division multiplex (OFDM) and differential quadrature phase shift keying modulation with narrowband piezoelectric devices to achieve a packet length of 2.1 ms. Moreover, the adapted channel coding increases data rate by correcting transmission errors. As a result, a 2-carrier ultrasound transmission system on an embedded system achieves a data rate of approximately 5.7 kBaud. Within the presented work, a transmission range up to 18 m with a packet error rate (PER) of 13% at 10-V supply voltage is reported. In addition, the transmission works up to 22 m with a PER of 85%. Moreover, this paper shows the accuracy of the frame synchronization over the distance. Consequently, the system achieves a standard deviation of 14 μs for ranges up to 10 m.
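
    To make the modulation scheme concrete, the sketch below shows a differential QPSK symbol mapper and a two-carrier burst assembler of the general kind described in the abstract. The Gray mapping, carrier frequencies, sampling rate, and symbol length are assumptions for illustration, not the authors' actual modem parameters.

      import numpy as np

      def dqpsk_encode(bits):
          # Gray-coded dibit -> phase increment (assumed mapping): 00->0, 01->pi/2, 11->pi, 10->3pi/2
          mapping = {(0, 0): 0.0, (0, 1): np.pi / 2, (1, 1): np.pi, (1, 0): 3 * np.pi / 2}
          phase, symbols = 0.0, []
          for dibit in zip(bits[0::2], bits[1::2]):
              phase = (phase + mapping[dibit]) % (2 * np.pi)   # information is carried in the phase difference
              symbols.append(np.exp(1j * phase))
          return np.array(symbols)

      def two_carrier_burst(sym0, sym1, f0, f1, fs, sym_len):
          # each pair of symbols rides on its own narrowband carrier for sym_len samples
          n = np.arange(sym_len)
          chunks = [np.real(s0 * np.exp(2j * np.pi * f0 * n / fs) +
                            s1 * np.exp(2j * np.pi * f1 * n / fs))
                    for s0, s1 in zip(sym0, sym1)]
          return np.concatenate(chunks)

      bits = [1, 0, 1, 1, 0, 0, 1, 1]
      s = dqpsk_encode(bits)
      burst = two_carrier_burst(s, s[::-1], f0=40e3, f1=45e3, fs=500e3, sym_len=250)  # assumed values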

  12. Dual Coding Theory and Computer Education: Some Media Experiments To Examine the Effects of Different Media on Learning.

    ERIC Educational Resources Information Center

    Alty, James L.

    Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…

  13. An Exploratory Study of the Impact of Self-Efficacy and Learning Engagement in Coding Learning Activities in Italian Middle School

    ERIC Educational Resources Information Center

    Banzato, Monica; Tosato, Paolo

    2017-01-01

    In Italy, teaching coding at primary and secondary levels is emerging as a major educational issue, particularly in light of the recent reforms now being implemented. Consequently, there has been increased research on how to introduce information technology in lower secondary schools. This paper presents an exploratory survey, carried out through…

  14. Weighted SAW reflector gratings for orthogonal frequency coded SAW tags and sensors

    NASA Technical Reports Server (NTRS)

    Puccio, Derek (Inventor); Malocha, Donald (Inventor)

    2011-01-01

    Weighted surface acoustic wave reflector gratings for coding identification tags and sensors to enable unique sensor operation and identification for a multi-sensor environment. In an embodiment, the weighted reflectors are variable while in another embodiment the reflector gratings are apodized. The weighting technique allows the designer to decrease reflectivity and allows for more chips to be implemented in a device and, consequently, more coding diversity. As a result, more tags and sensors can be implemented using a given bandwidth when compared with uniform reflectors. Use of weighted reflector gratings with OFC makes various phase-shifting schemes possible, such as in-phase and quadrature implementations of coded waveforms, resulting in reduced device size and increased coding.

  15. Spatial Disorientation in Military Vehicles: Causes, Consequences and Cures (Desorientation spaiale dans les vehicules militaires: causes, consequences et remedes)

    DTIC Science & Technology

    2003-02-01

    service warfighters (Training devices and protocols, Onboard equipment, Cognitive and sensorimotor aids, Visual and auditory symbology, Peripheral visual...vestibular stimulation causing a decrease in cerebral blood pressure with the consequent reduction in G-tolerance and increased likelihood of ALOC or GLOC...tactile stimulators (e.g. one providing a sensation of movement) or of displays with a more complex coding (e.g. by increase in the number of tactors, or

  16. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

    Motivated by the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research on cognitive function.

  17. Energy coding in biological neural networks

    PubMed Central

    Zhang, Zhikang

    2007-01-01

    Motivated by the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research on cognitive function. PMID:19003513

  18. Is my study system good enough? A case study for identifying maternal effects.

    PubMed

    Holand, Anna Marie; Steinsland, Ingelin

    2016-06-01

    In this paper, we demonstrate how simulation studies can be used to answer questions about identifiability and the consequences of omitting effects from a model. The methodology is presented through a case study where the identifiability of genetic and/or individual (environmental) maternal effects is explored. Our study system is a wild house sparrow (Passer domesticus) population with known pedigree. We fit pedigree-based (generalized) linear mixed models (animal models), with and without additive genetic and individual maternal effects, and use the deviance information criterion (DIC) for choosing between these models. Pedigree and R code for the simulations are available. For this study system, the simulation studies show that only large maternal effects can be identified. The genetic maternal effect (and similarly for the individual maternal effect) has to be at least half of the total genetic variance to be identified. The consequences of omitting a maternal effect when it is present are explored. Our results indicate that the total (genetic and individual) variance is accounted for. When an individual (environmental) maternal effect is omitted from the model, this only influences the estimated (direct) individual (environmental) variance. When a genetic maternal effect is omitted from the model, both the (direct) genetic and (direct) individual variance estimates are overestimated.
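
    The kind of simulation study described above can be sketched in a few lines. The following Python fragment is not the authors' published R code; the design sizes and variance components are assumed for illustration. It simulates phenotypes as a direct effect plus a dam-level maternal effect plus residual noise, then recovers the maternal variance with one-way ANOVA logic.

      import numpy as np

      rng = np.random.default_rng(1)
      n_dams, k = 500, 5                      # number of dams and offspring per dam (assumed design)
      var_a, var_m, var_e = 1.0, 0.5, 1.0     # direct genetic, maternal, residual variances (assumed)

      dam_effect = rng.normal(0.0, np.sqrt(var_m), n_dams)
      y = np.empty((n_dams, k))
      for d in range(n_dams):
          # direct genetic and residual effects lumped together; pedigree structure ignored here
          y[d] = dam_effect[d] + rng.normal(0.0, np.sqrt(var_a + var_e), k)

      within = y.var(axis=1, ddof=1).mean()               # expected to be close to var_a + var_e
      between = y.mean(axis=1).var(ddof=1) - within / k   # expected to be close to var_m
      print(f"recovered within-dam variance : {within:.2f}")
      print(f"recovered maternal variance   : {between:.2f}")

    Shrinking var_m toward zero in such a sketch shows how a small maternal effect becomes indistinguishable from sampling noise, which is the identifiability question the paper addresses with full animal models and DIC.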

  19. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  20. Deconstructing processing speed deficits in schizophrenia: application of a parametric digit symbol coding test.

    PubMed

    Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C

    2010-05-01

    Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  1. Study on detecting spatial distribution of neutrons and gamma rays using a multi-imaging plate system.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Endo, Satoru; Takada, Jun

    2014-06-01

    In order to measure the spatial distributions of neutrons and gamma rays separately using the imaging plate, the requirement for the converter to enhance a specific component was investigated with the PHITS code. Consequently, enhancing fast neutrons using recoil protons from epoxy resin was not effective due to the high sensitivity of the imaging plate to gamma rays. However, the converter of epoxy resin doped with (10)B was found to have potential for thermal and epithermal neutrons, and graphite for gamma rays. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. [Benefit assessment of medical services in German health service - legal framework, historical and international perspective].

    PubMed

    Windeler, Jürgen; Lange, Stefan

    2015-03-01

    The term benefit describes the (positive) causal, patient-relevant consequences of medical interventions, whether diagnostic or therapeutic. Benefit assessments form the basis of rational decision-making within a health care system. They are based on clinical trials that are able to provide valid answers to the question regarding the relevant benefit or harm that can be caused by an intervention. In Germany, evidence-based benefit assessments are mandated by law, namely the Social Code Book V. The application and the practical impact of these assessments could be improved.

  3. Active imaging systems to perform the strategic surveillance of an aircraft environment in bad weather conditions

    NASA Astrophysics Data System (ADS)

    Riviere, Nicolas; Hespel, Laurent; Ceolato, Romain; Drouet, Florence

    2011-11-01

    Onera, the French Aerospace Lab, develops and models active imaging systems to understand the relevant physical phenomena impacting their performance. As a consequence, efforts have been devoted both to the propagation of a pulse through the atmosphere (scintillation and turbulence effects) and to target geometries and their surface properties (radiometric and speckle effects). These imaging systems must, however, operate at night, in all ambient illuminations and weather conditions, in order to perform the strategic surveillance of the environment for various worldwide operations or to perform the enhanced navigation of an aircraft. Onera has implemented codes for 2D and 3D laser imaging systems. As we aim to image a scene even in the presence of rain, snow, fog or haze, Onera introduces such meteorological effects into these numerical models and compares simulated images with measurements provided by commercial imaging systems.

  4. Workflow for Integrating Mesoscale Heterogeneities in Materials Structure with Process Simulation of Titanium Alloys (Postprint)

    DTIC Science & Technology

    2014-10-01

    offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models...process/performance simulation codes (e.g., crystal plasticity finite element method). Subject terms: ICME; microstructure informatics; higher...(iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., crystal plasticity

  5. Lattice surgery on the Raussendorf lattice

    NASA Astrophysics Data System (ADS)

    Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco

    2018-07-01

    Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that ease physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable to braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.

  6. Violence and its injury consequences in American movies: a public health perspective

    PubMed Central

    McArthur, D.; Peek-Asa, C.; Webb, T.; Fisher, K.; Cook, B.; Browne, N.; Kraus, J.

    2000-01-01

    Objectives—The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Methods—Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results—The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions—Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10875668

  7. Violence and its injury consequences in American movies: a public health perspective.

    PubMed

    McArthur, D; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J

    2000-06-01

    The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.

  8. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.

  9. Analysis of the influence of the heat transfer phenomena on the late phase of the ThAI Iod-12 test

    NASA Astrophysics Data System (ADS)

    Gonfiotti, B.; Paci, S.

    2014-11-01

    Iodine is one of the major contributors to the source term during a severe accident in a Nuclear Power Plant because of its volatility and high radiological consequences. Therefore, large efforts have been made to describe iodine behaviour during an accident, especially in the containment system. Because of the lack of experimental data, in recent years many attempts have been made to fill the gaps in the knowledge of iodine behaviour. In this framework, two tests (ThAI Iod-11 and Iod-12) were carried out inside a multi-compartment steel vessel. A quite complex transient characterizes these two tests; therefore they are also suitable for thermal-hydraulic benchmarks. The two tests were originally released for a benchmark exercise during the SARNET2 EU Project. At the end of this benchmark a report covering the main findings was issued, stating that the codes commonly employed in severe accident studies were able to simulate the tests, but with large discrepancies. The present work therefore applies the new versions of the ASTEC and MELCOR codes to carry out a new code-to-code comparison against the ThAI Iod-12 experimental data, focusing on the influence of the heat exchanges with the outer environment, which seems to be one of the most challenging issues to cope with.

  10. Prompt Radiation Protection Factors

    DTIC Science & Technology

    2018-02-01

    ...radiation was performed using the three-dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection factors (ratio of dose in the open to...by detonation of a nuclear device have placed renewed emphasis on evaluation of the consequences in case of such an event. The Defense Threat

  11. Evaluation of EIT system performance.

    PubMed

    Yasin, Mamatjan; Böhm, Stephan; Gaggero, Pascal O; Adler, Andy

    2011-07-01

    An electrical impedance tomography (EIT) system images internal conductivity from surface electrical stimulation and measurement. Such systems necessarily comprise multiple design choices, from cables and hardware design to calibration and image reconstruction. In order to compare EIT systems and study the consequences of changes in system performance, this paper describes a systematic approach to evaluating the performance of EIT systems. The system to be tested is connected to a saline phantom in which calibrated contrasting test objects are systematically positioned using a position controller. A set of evaluation parameters is proposed which characterize (i) data and image noise, (ii) data accuracy, (iii) detectability of single contrasts and distinguishability of multiple contrasts, and (iv) accuracy of the reconstructed image (amplitude, resolution, position and ringing). Using this approach, we evaluate three different EIT systems and illustrate the use of these tools to evaluate and compare performance. In order to facilitate the use of this approach, all details of the phantom, test objects and position controller design are made publicly available, including the source code of the evaluation and reporting software.

  12. The moral code in Islam and organ donation in Western countries: reinterpreting religious scriptures to meet utilitarian medical objectives.

    PubMed

    Rady, Mohamed Y; Verheijde, Joseph L

    2014-06-02

    End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life.

  13. The moral code in Islam and organ donation in Western countries: reinterpreting religious scriptures to meet utilitarian medical objectives

    PubMed Central

    2014-01-01

    End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life. PMID:24888748

  14. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
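
    The code-verification pattern described above, comparing a numerical solution against a closed-form analytical solution and applying a pass/fail criterion, can be illustrated with a minimal sketch. The problem, grid, time step, and tolerance below are assumptions for illustration and are not taken from the PFLOTRAN QA suite.

      import numpy as np

      # 1D diffusion equation u_t = D u_xx on [0, 1] with u(0) = u(1) = 0 and u(x, 0) = sin(pi x);
      # the exact solution is u(x, t) = exp(-D pi^2 t) sin(pi x)
      D, nx, dt, t_end, tol = 1.0, 101, 2.5e-5, 0.01, 1e-3
      x = np.linspace(0.0, 1.0, nx)
      dx = x[1] - x[0]
      u = np.sin(np.pi * x)

      t = 0.0
      while t < t_end - 1e-12:
          # explicit Euler step (stable here: D*dt/dx**2 = 0.25)
          u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
          t += dt

      u_exact = np.exp(-D * np.pi**2 * t) * np.sin(np.pi * x)
      err = np.max(np.abs(u - u_exact))
      print(f"max error {err:.2e} -> {'PASS' if err < tol else 'FAIL'}")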

  15. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
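
    The definition/use idea behind such safety policies can be illustrated with a toy checker. The sketch below is a Python illustration of an "initialization safety" pattern (a read before any assignment in straight-line code); it is not AutoCert's analysis, generates no certificates, and will flag names defined outside the function (for example builtins) as false positives.

      import ast

      def reads_before_assignment(src):
          """Flag names read before any assignment inside each function (straight-line code only)."""
          findings = []
          for fn in [n for n in ast.walk(ast.parse(src)) if isinstance(n, ast.FunctionDef)]:
              defined = {a.arg for a in fn.args.args}          # parameters count as defined
              for stmt in fn.body:
                  # reads in a statement happen before its own assignments take effect
                  reads = [n for n in ast.walk(stmt)
                           if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Load)]
                  writes = {n.id for n in ast.walk(stmt)
                            if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Store)}
                  findings += [(fn.name, r.id, r.lineno) for r in reads if r.id not in defined]
                  defined |= writes
          return findings

      print(reads_before_assignment("def f(n):\n    total = total + n\n    return total\n"))
      # -> [('f', 'total', 2)]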

  16. Governance Challenges in Telecoupled Food Systems

    NASA Astrophysics Data System (ADS)

    Eakin, H.; Mahanti, A.; Rueda, X.

    2015-12-01

    Distal connections increasingly influence food systems' governance: social actors in one geographic context produce intended and unintended consequences in distant locations, coupling the dynamics of socio-ecological systems. While these linkages are not new, economic globalization, urbanization and the dynamics of consumer preferences have accentuated these 'telecoupled' relationships in food systems. Telecoupling refers to the unexpected causal interactions among human-environment systems that were otherwise not linked. This paper presents a framework for evaluating telecoupling processes and outcomes in food systems, focusing on how the disparate scales of drivers and outcomes, diverse values of actors involved, and spatial and social distance combine to challenge socio-ecological system governance. We draw from two examples of food systems (coffee and maize) to argue that telecoupling, as a challenge for food systems, emerges when the institutions and mechanisms of governance acting over one system do not account for the consequences and interactions involving a different system. Telecoupling can stimulate new forms of governance, such as the development of codes of conduct and certification schemes, with positive impacts on food and livelihood security. Our cases suggest that the emergence of new governance arrangements is at least partially contingent on the prior existence of alternative social networks, which cultivate shared values, meanings and goals in food systems, as well as the capacity of affected actors to mobilize political influence and demonstrate plausible causal links. In the absence of such networks and associated capacities, the prior governance arrangements, although poorly adjusted to the new circumstances, are likely to persist, reinforcing existing power relations and the probability of undesirable social and ecological outcomes.

  17. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    NASA Astrophysics Data System (ADS)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of some comments of Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
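
    For readers who want a quick numerical cross-check of regular associated Legendre values on the cut without the Fortran code, SciPy's lpmv covers the regular case Plm(x) for -1 <= x <= 1; the degree, order, and spot check below are chosen arbitrarily. The irregular functions Qlm(x) and the |x| > 1 region handled by the Fortran program are not covered by this routine.

      import numpy as np
      from scipy.special import lpmv   # regular associated Legendre function P_l^m(x) on the cut

      l, m = 5, 2
      x = np.linspace(-0.99, 0.99, 5)
      print(lpmv(m, l, x))

      # spot check against the closed form P_2^2(x) = 3 (1 - x^2)
      print(np.allclose(lpmv(2, 2, x), 3.0 * (1.0 - x**2)))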

  18. [How do first codes of medical ethics inspire contemporary physicians?].

    PubMed

    Paprocka-Lipińska, Anna; Basińska, Krystyna

    2014-02-01

    The first codes of medical ethics appeared between the 18th and 19th centuries. Their formation was inspired by changes in medicine that were positive in general but had some negative setbacks. Those negative consequences revealed the need to codify all those ethical duties which were formerly passed from generation to generation by word of mouth and the individual example of master physicians. 210 years have passed since the publication of "Medical Ethics" by Thomas Percival, yet the essential ethical guidelines remain the same. Similarly, ethical codes published in Poland in the 19th century can still be an inspiration to modern physicians.

  19. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
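
    The basic idea, propagating random material, geometric, and load variables through a structural response model to estimate a failure probability, can be shown with a deliberately simple sketch. The uniaxial stress model and all distribution parameters below are assumptions for illustration; PSAM itself works with probabilistic finite element models.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000

      # assumed input scatter: applied load, cross-sectional area, and material yield strength
      force = rng.normal(1.0e5, 1.0e4, n)             # [N]
      area = rng.normal(5.0e-4, 2.5e-5, n)            # [m^2]
      yield_strength = rng.normal(2.5e8, 2.0e7, n)    # [Pa]

      stress = force / area                           # response model: uniaxial stress
      p_fail = np.mean(stress > yield_strength)       # probability that demand exceeds capacity
      print(f"estimated probability of failure: {p_fail:.3e}")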

  20. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Space Flight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.
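
    For readers unfamiliar with the building block being optimized here, the sketch below encodes a bit stream with a textbook rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 in octal). This is a generic example, not the unit-memory code or the specific codes evaluated in the study.

      def conv_encode(bits, g1=0b111, g2=0b101, k=3):
          """Rate-1/2 convolutional encoder: shift each input bit into a k-bit register and
          emit one parity bit per generator polynomial."""
          state, out = 0, []
          for b in bits:
              state = ((state << 1) | b) & ((1 << k) - 1)
              out.append(bin(state & g1).count("1") % 2)
              out.append(bin(state & g2).count("1") % 2)
          return out

      print(conv_encode([1, 0, 1, 1, 0, 0]))   # 12 coded bits for 6 input bits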

  1. Illustration of Some Consequences of the Indistinguishability of Electrons

    ERIC Educational Resources Information Center

    Moore, John W.; Davies, William G.

    1976-01-01

    Discusses how color-coded overhead transparencies of computer-generated dot-density diagrams can be used to illustrate hybrid orbitals and the principle of the indistinguishability of electrons. (MLH)

  2. Fuel cycle for a fusion neutron source

    NASA Astrophysics Data System (ADS)

    Ananyev, S. S.; Spitsyn, A. V.; Kuteev, B. V.

    2015-12-01

    The concept of a tokamak-based stationary fusion neutron source (FNS) for scientific research (neutron diffraction, etc.), tests of structural materials for future fusion reactors, nuclear waste transmutation, fission reactor fuel production, and control of subcritical nuclear systems (fusion-fission hybrid reactor) is being developed in Russia. The fuel cycle system is one of the most important systems of the FNS; it provides circulation and reprocessing of the deuterium-tritium fuel mixture in all fusion reactor systems: the vacuum chamber, neutral injection system, cryogenic pumps, tritium purification system, separation system, storage system, and tritium-breeding blanket. The existing technologies need to be significantly upgraded since the engineering solutions adopted in the ITER project can be only partially used in the FNS (considering the capacity factor higher than 0.3, tritium flow up to 200 m3Pa/s, and temperature of reactor elements up to 650°C). The deuterium-tritium fuel cycle of the stationary FNS is considered. The TC-FNS computer code developed for estimating the tritium distribution in the systems of the FNS is described. The code calculates tritium flows and inventory in tokamak systems (vacuum chamber, cryogenic pumps, neutral injection system, fuel mixture purification system, isotope separation system, tritium storage system) and takes into account tritium loss in the fuel cycle due to thermonuclear burnup and β decay. For the two facility versions considered, FNS-ST and DEMO-FNS, the amount of fuel mixture needed for uninterrupted operation of all fuel cycle systems is 0.9 and 1.4 kg, respectively, and the tritium consumption is 0.3 and 1.8 kg per year, including 35 and 55 g/yr, respectively, due to tritium decay.

  3. [Self-regulation systems to control tobacco advertising. An empirical analysis].

    PubMed

    Martín, Marta; Quiles, M del Carmen; López, Carmen

    2004-01-01

    Against the background of the debate aroused by the tobacco advertising ban as a result of Directive 98/43/EC and of the Proposed Directive of 5/9/2001, we aimed to evaluate how self-regulation of tobacco advertising systems has worked in the last 5 years and to evaluate its effectiveness and relevance as a potential tool in public health prevention. We performed a content and discourse analysis of all advertisements appearing in the Sunday supplements of the three weekly newspapers with the widest circulation in Spain (El Pais, El Mundo, and ABC) between January 1995 and January 2000 to detect infractions of the norms of the self-regulation code of the Spanish Tobacco Association (Asociacion Espanola de Tabaco [AET]) regarding: a) the identity of models used in advertising; b) direct or indirect claims for the therapeutic properties of smoking; c) depiction of cigarettes in advertisements, and d) printed warnings on advertisements. We examined 910 banners and 369 advertisements. Very few advertisements displayed rational arguments on elements such as price (13%) or product components (7%). Although the AET's code was generally respected, the advertisements displayed a series of subtleties that allowed the industry to get around the code: 10 of the 369 advertisements reviewed depicted famous people (mainly pilots and artists) and one third of them used iconic personages (Joe Camel or Marlboro Man); one advertisement suggested the therapeutic properties of tobacco and almost all linked smoking with social success and leisure. Although cigarettes were not depicted, 18% of the advertisements showed substitutes for cigarettes in various places (12%) and a large percentage infringed the code's recommendations on printed warnings. The industry's use of creative subtleties infringing its self-imposed norms begs the question of how far self-regulation is viable when a failure in the system can have serious consequences for public health.

  4. A rapid response air quality analysis system for use in projects having stringent quality assurance requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, A.W.

    1990-04-01

    This paper describes an approach to solve air quality problems which frequently occur during iterations of the baseline change process. From a schedule standpoint, it is desirable to perform this evaluation in as short a time as possible, while budgetary pressures limit the size of the staff available to do the work. Without a method in place to deal with baseline change proposal requests, the environmental analysts may not be able to produce the analysis results in the time frame expected. Using a concept called the Rapid Response Air Quality Analysis System (RAAS), the problems of timing and cost become tractable. The system could be adapted to assess other atmospheric pathway impacts, e.g., acoustics or visibility. The air quality analysis system used to perform the environmental assessment (EA) analysis for the Salt Repository Project (part of the Civilian Radioactive Waste Management Program), and later to evaluate the consequences of proposed baseline changes, consists of three components: Emission source data files; Emission rates contained in spreadsheets; Impact assessment model codes. The spreadsheets contain user-written codes (macros) that calculate emission rates from (1) emission source data (e.g., numbers and locations of sources, detailed operating schedules, and source specifications including horsepower, load factor, and duty cycle); (2) emission factors such as those published by the U.S. Environmental Protection Agency; and (3) control efficiencies.
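
    The spreadsheet macro logic described above amounts to multiplying source counts, power ratings, operating schedules, emission factors, and control efficiencies. The sketch below shows that calculation in Python; every number, including the emission factor, is a made-up placeholder rather than an EPA value.

      # (name, count, horsepower, load_factor, hours_per_day) -- all values are illustrative placeholders
      sources = [
          ("scraper",    4, 450, 0.6, 10),
          ("haul_truck", 6, 300, 0.5,  8),
      ]
      EF_NOX = 0.014      # assumed emission factor, lb NOx per hp-hr (placeholder, not an EPA figure)
      CONTROL_EFF = 0.20  # assumed 20% control efficiency

      for name, count, hp, load, hours in sources:
          lb_per_day = count * hp * load * hours * EF_NOX * (1.0 - CONTROL_EFF)
          print(f"{name:10s} NOx: {lb_per_day:7.1f} lb/day")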

  5. From assessment to improvement of elderly care in general practice using decision support to increase adherence to ACOVE quality indicators: study protocol for randomized control trial

    PubMed Central

    2014-01-01

    Background: Previous efforts such as Assessing Care of Vulnerable Elders (ACOVE) provide quality indicators for assessing the care of elderly patients, but thus far little has been done to leverage this knowledge to improve care for these patients. We describe a clinical decision support system to improve general practitioner (GP) adherence to ACOVE quality indicators and a protocol for investigating the impact on GPs' adherence to the rules. Design: We propose two randomized controlled trials among a group of Dutch GP teams on adherence to ACOVE quality indicators. In both trials a clinical decision support system provides unintrusive feedback appearing as a color-coded, dynamically updated list of items needing attention. The first trial pertains to real-time automatically verifiable rules. The second trial concerns non-automatically verifiable rules (adherence cannot be established by the clinical decision support system itself, but the GPs report whether they will adhere to the rules). In both trials we will randomize teams of GPs caring for the same patients into two groups, A and B. For the automatically verifiable rules, group A GPs receive support only for a specific inter-related subset of rules, and group B GPs receive support only for the remainder of the rules. For non-automatically verifiable rules, group A GPs receive feedback framed as actions with positive consequences, and group B GPs receive feedback framed as inaction with negative consequences. GPs indicate whether they adhere to non-automatically verifiable rules. In both trials, the main outcome measure is mean adherence, automatically derived or self-reported, to the rules. Discussion: We relied on active end-user involvement in selecting the rules to support, and on a model for providing feedback displayed as color-coded real-time messages concerning the patient visiting the GP at that time, without interrupting the GP's workflow with pop-ups. While these aspects are believed to increase clinical decision support system acceptance and its impact on adherence to the selected clinical rules, systems with these properties have not yet been evaluated. Trial registration: Controlled Trials NTR3566. PMID:24642339

  6. A comparison of mapped and measured total ionospheric electron content using global positioning system and beacon satellite observations

    NASA Technical Reports Server (NTRS)

    Lanyi, Gabor E.; Roth, Titus

    1988-01-01

    Total ionospheric electron contents (TEC) were measured by global positioning system (GPS) dual-frequency receivers developed by the Jet Propulsion Laboratory. The measurements included P-code (precise ranging code) and carrier phase data for six GPS satellites during multiple five-hour observing sessions. A set of these GPS TEC measurements was mapped from the GPS lines of sight to the line of sight of a Faraday beacon satellite by statistically fitting the TEC data to a simple model of the ionosphere. The mapped GPS TEC values were compared with the Faraday rotation measurements. Because GPS transmitter offsets are different for each satellite and because some GPS receiver offsets were uncalibrated, the sums of the satellite and receiver offsets were estimated simultaneously with the TEC in a least squares procedure. The accuracy of this estimation procedure is evaluated, indicating that the error of the GPS-determined line-of-sight TEC can be at or below 1 x 10^16 el/cm^2. Consequently, the current level of accuracy is comparable to the Faraday rotation technique; however, GPS provides superior sky coverage.
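
    The simultaneous estimation of TEC and combined satellite-plus-receiver offsets can be sketched as an ordinary least-squares problem. The toy model below uses a single vertical TEC, a simple slab mapping function, and synthetic data; the real analysis fits a spatial and temporal ionosphere model, so every parameter here is an illustrative assumption.

      import numpy as np

      rng = np.random.default_rng(2)
      n_obs, n_sat = 300, 6
      sat = rng.integers(0, n_sat, n_obs)                    # which satellite produced each observation
      elev = rng.uniform(np.radians(20), np.radians(85), n_obs)
      mapping = 1.0 / np.sin(elev)                           # slant-to-vertical mapping (simple slab model)

      true_vtec = 30.0                                       # TEC units
      true_bias = rng.normal(0.0, 3.0, n_sat)                # combined satellite + receiver offsets
      slant = mapping * true_vtec + true_bias[sat] + rng.normal(0.0, 0.5, n_obs)

      # design matrix: one column for vertical TEC, one indicator column per satellite offset
      A = np.zeros((n_obs, 1 + n_sat))
      A[:, 0] = mapping
      A[np.arange(n_obs), 1 + sat] = 1.0
      est, *_ = np.linalg.lstsq(A, slant, rcond=None)
      print("estimated vTEC:", round(est[0], 2), " offset errors:", np.round(est[1:] - true_bias, 2))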

  7. Supersonics Project - Airport Noise Tech Challenge

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2010-01-01

    The Airport Noise Tech Challenge research effort under the Supersonics Project is reviewed. While the goal of "Improved supersonic jet noise models validated on innovative nozzle concepts" remains the same, the success of the research effort has caused the thrust of the research to be modified going forward in time. The main activities from FY06-10 focused on development and validation of jet noise prediction codes. This required innovative diagnostic techniques to be developed and deployed, extensive jet noise and flow databases to be created, and computational tools to be developed and validated. Furthermore, in FY09-10 systems studies commissioned by the Supersonics Project showed that viable supersonic aircraft were within reach using variable cycle engine architectures if exhaust nozzle technology could provide 3-5dB of suppression. The Project then began to focus on integrating the technologies being developed in its Tech Challenge areas to bring about successful system designs. Consequently, the Airport Noise Tech Challenge area has shifted efforts from developing jet noise prediction codes to using them to develop low-noise nozzle concepts for integration into supersonic aircraft. The new plan of research is briefly presented by technology and timelines.

  8. Hauser-Feshbach calculations in deformed nuclei

    DOE PAGES

    Grimes, S. M.

    2013-08-22

    Hauser-Feshbach calculations for deformed nuclei are typically done with level densities appropriate for deformed nuclei but with Hauser-Feshbach codes which enforce spherical symmetry by not including K as a parameter in the decay sums. A code has been written which does allow the full K dependence to be included. Calculations with the code have been compared with those from a conventional Hauser-Feshbach code. The evaporation portion (continuum) is only slightly affected by this change, but the cross sections to individual (resolved) levels are changed substantially. It is found that cross sections to neighboring levels with the same J but differing K are not the same. The predicted consequences of K mixing will also be discussed.

  9. Life Experience of Patients With Unilateral Vocal Fold Paralysis.

    PubMed

    Francis, David O; Sherman, Ariel E; Hovis, Kristen L; Bonnet, Kemberlee; Schlundt, David; Garrett, C Gaelyn; Davies, Louise

    2018-05-01

    Clinicians and patients benefit when they have a clear understanding of how medical conditions influence patients' life experiences. Patients' perspectives on life with unilateral vocal fold paralysis have not been well described. To promote patient-centered care by characterizing the patient experiences of living with unilateral vocal fold paralysis. This study used mixed methods: surveys using the voice and dysphagia handicap indexes (VHI and DHI) and semistructured interviews with adults with unilateral vocal cord paralysis recruited from a tertiary voice center. Recorded interviews were transcribed, coded using a hierarchical coding system, and analyzed using an iterative inductive-deductive approach. Symptom domains of the patient experience. In 36 patients (26 [72%] were female, and the median age and interquartile range [IQR] were 63 years [48-68 years]; median interview duration, 42 minutes), median VHI and DHI scores were 96 (IQR, 77-108) and 55.5 (IQR, 35-89) at the time of interviews, respectively. Frustration, isolation, fear, and altered self-identity were primary themes permeating patients' experiences. Frustrations related to limitations in communication, employment, and the medical system. Sources of fear included a loss of control, fear of further dysfunction or permanent disability, concern for health consequences (eg, aspiration pneumonia), and/or an inability to call for help in emergency situations. These experiences were modified by the following factors: resilience, self-efficacy, perceived sense of control, and social support systems. Effects of unilateral vocal fold paralysis extend beyond impaired voice and other somatic symptoms. Awareness of the extent to which these patients experience frustration, isolation, fear, and altered self-identity is important. A patient-centered approach to optimizing unilateral vocal fold paralysis treatment is enhanced by an understanding of both the physical dimension of this condition and how patients cope with the considerable emotional and social consequences. Recognizing the psychosocial dimensions of disease allows clinicians to communicate more effectively, be more empathetic, and to better personalize treatment plans, which may lead to improved patient care and patient satisfaction.

  10. High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.

    PubMed

    Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel

    2018-06-19

    Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative, relatively simple, and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm^2 and 8 colors can encode as many as 10^45 combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or, for nitrocellulose, by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
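
    The quoted code capacity follows directly from counting colorings: n code elements with c colors give c^n distinct codes. The short check below illustrates the scaling; the abstract does not state exactly how many elements correspond to the 10^45 figure, so the element counts here are illustrative.

      import math

      c = 8                      # colors
      for n in (16, 32, 50):     # number of code elements (illustrative)
          print(f"{n:2d} elements -> 10^{n * math.log10(c):.1f} combinations")
      # with 8 colors, roughly 50 elements already exceed 10**45 combinations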

  11. Blast and the Consequences on Traumatic Brain Injury-Multiscale Mechanical Modeling of Brain

    DTIC Science & Technology

    2011-02-17

    LS-DYNA, as an explicit FE code, has been employed to model the air-blast simulation and the resulting multi-material fluid-structure interaction problem for a 3-D head model. The report also covers a biomechanics study of influencing parameters for the brain under impact, including the effect of cerebrospinal fluid.

  12. Variable thickness transient ground-water flow model. Volume 3. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of 3 volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model - second level (intermediate complexity) two-dimensional saturated groundwater flow.

  13. COOL: A code for Dynamic Monte Carlo Simulation of molecular dynamics

    NASA Astrophysics Data System (ADS)

    Barletta, Paolo

    2012-02-01

    Cool is a program to simulate evaporative and sympathetic cooling for a mixture of two gases co-trapped in a harmonic potential. The collisions involved are assumed to be exclusively elastic, and losses are due to evaporation from the trap. Each particle is followed individually in its trajectory; consequently, properties such as spatial densities or energy distributions can be readily evaluated. The code can be used sequentially, by employing one output as input for another run. The code can be easily generalised to describe more complicated processes, such as the inclusion of inelastic collisions, or the possible presence of more than two species in the trap. New version program summary. Program title: COOL Catalogue identifier: AEHJ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHJ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1 097 733 No. of bytes in distributed program, including test data, etc.: 18 425 722 Distribution format: tar.gz Programming language: C++ Computer: Desktop Operating system: Linux RAM: 500 Mbytes Classification: 16.7, 23 Catalogue identifier of previous version: AEHJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 388 Does the new version supersede the previous version?: Yes Nature of problem: Simulation of the sympathetic process occurring for two molecular gases co-trapped in a deep optical trap. Solution method: The Direct Simulation Monte Carlo method exploits the decoupling, over a short time period, of the inter-particle interaction from the trapping potential. The particle dynamics is thus exclusively driven by the external optical field. The rare inter-particle collisions are considered with an acceptance/rejection mechanism, that is, by comparing a random number to the collisional probability defined in terms of the inter-particle cross section and centre-of-mass energy. All particles in the trap are individually simulated so that at each time step a number of useful quantities, such as the spatial densities or the energy distributions, can be readily evaluated. Reasons for new version: A number of issues made the old version very difficult to port to different architectures, and impossible to compile on Windows. Furthermore, the test run results could only be replicated poorly, as a consequence of the simulations being very sensitive to the machine background noise. In practice, as the particles are simulated for billions and billions of steps, the consequence of a small difference in the initial conditions due to the finiteness of double precision reals can have macroscopic effects in the output. This is not a problem in its own right, but a feature of such simulations. However, for the sake of completeness we have introduced a quadruple precision version of the code which yields the same results independently of the software used to compile it, or the hardware architecture where the code is run. Summary of revisions: A number of bugs in the dynamic memory allocation have been detected and removed, mostly in the cool.cpp file. All files have been renamed with a .cpp ending, rather than .c++, to make them compatible with Windows. The Random Number Generator routine, which is the computational core of the algorithm, has been re-written in C++, and there is no longer any need for cross FORTRAN-C++ compilation.
A quadruple precision version of the code is provided alongside the original double precision one. The makefile allows the user to choose which one to compile by setting the switch PRECISION to either double or quad. The source code and header files have been organised into directories to make the code file system look neater. Restrictions: The in-trap motion of the particles is treated classically. Running time: The running time is relatively short, 1-2 hours. However, it is convenient to replicate each simulation several times with different initialisations of the random sequence.
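
    The acceptance/rejection step described under "Solution method" can be sketched in a few lines. The following is a minimal illustration of a DSMC-style collision test and an isotropic elastic collision; the cross-section model, names and parameter values are placeholders and are not taken from the COOL source.

        import numpy as np

        rng = np.random.default_rng(1)

        def accept_collision(v1, v2, sigma, sigma_v_max):
            """DSMC acceptance/rejection: the candidate pair collides if a uniform
            random number falls below sigma(|v_rel|) * |v_rel|, normalised by the
            maximum value used when selecting candidate pairs."""
            v_rel = np.linalg.norm(v1 - v2)
            return rng.random() < sigma(v_rel) * v_rel / sigma_v_max

        def elastic_collision(v1, v2, m1, m2):
            """Post-collision velocities: the centre-of-mass velocity is conserved
            and the relative velocity is rotated to a random isotropic direction."""
            v_cm = (m1 * v1 + m2 * v2) / (m1 + m2)
            g = np.linalg.norm(v1 - v2)
            cos_t = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            sin_t = np.sqrt(1.0 - cos_t**2)
            g_new = g * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
            return v_cm + m2 / (m1 + m2) * g_new, v_cm - m1 / (m1 + m2) * g_new

        # toy usage with a constant (hard-sphere-like) cross section
        v1, v2 = np.array([10.0, 0.0, 0.0]), np.array([-5.0, 2.0, 0.0])
        if accept_collision(v1, v2, sigma=lambda g: 1.0e-18, sigma_v_max=2.0e-17):
            v1, v2 = elastic_collision(v1, v2, m1=1.0, m2=2.0)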

  14. Modeling emission lag after photoexcitation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Kevin L.; Petillo, John J.; Ovtchinnikov, Serguei

    A theoretical model of delayed emission following photoexcitation from metals and semiconductors is given. Its numerical implementation is designed for beam optics codes used to model photocathodes in rf photoinjectors. The model extends the Moments approach for predicting photocurrent and mean transverse energy as moments of an emitted electron distribution by incorporating time of flight and scattering events that result in emission delay on a sub-picosecond level. The model accounts for a dynamic surface extraction field and changes in the energy distribution and time of emission as a consequence of the laser penetration depth and multiple scattering events during transport. Usage in the Particle-in-Cell code MICHELLE to predict the bunch shape and duration with or without laser jitter is given. The consequences of delayed emission effects for ultra-short pulses are discussed.

  15. Modeling emission lag after photoexcitation

    DOE PAGES

    Jensen, Kevin L.; Petillo, John J.; Ovtchinnikov, Serguei; ...

    2017-10-28

    A theoretical model of delayed emission following photoexcitation from metals and semiconductors is given. Its numerical implementation is designed for beam optics codes used to model photocathodes in rf photoinjectors. The model extends the Moments approach for predicting photocurrent and mean transverse energy as moments of an emitted electron distribution by incorporating time of flight and scattering events that result in emission delay on a sub-picosecond level. The model accounts for a dynamic surface extraction field and changes in the energy distribution and time of emission as a consequence of the laser penetration depth and multiple scattering events during transport. Usage in the Particle-in-Cell code MICHELLE to predict the bunch shape and duration with or without laser jitter is given. The consequences of delayed emission effects for ultra-short pulses are discussed.

  16. Evaluation and implementation of QR Code Identity Tag system for Healthcare in Turkey.

    PubMed

    Uzun, Vassilya; Bilgin, Sami

    2016-01-01

    For this study, we designed a QR Code Identity Tag system to integrate into the Turkish healthcare system. This system provides QR code-based medical identification alerts and an in-hospital patient identification system. Every member of the medical system is assigned a unique QR Code Tag; to facilitate medical identification alerts, the QR Code Identity Tag can be worn as a bracelet or necklace or carried as an ID card. Patients must always possess the QR Code Identity bracelets within hospital grounds. These QR code bracelets link to the QR Code Identity website, where detailed information is stored; a smartphone or standalone QR code scanner can be used to scan the code. The design of this system allows authorized personnel (e.g., paramedics, firefighters, or police) to access more detailed patient information than the average smartphone user: emergency service professionals are authorized to access patient medical histories to improve the accuracy of medical treatment. In Istanbul, we tested the self-designed system with 174 participants. To analyze the QR Code Identity Tag system's usability, the participants completed the System Usability Scale questionnaire after using the system.
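
    A minimal sketch of how such a tag could be generated is shown below. It assumes the third-party Python qrcode package and a hypothetical URL scheme for the identity website; neither detail comes from the study, and access to the full record behind the link would have to be enforced server-side, as the abstract describes.

        # pip install qrcode[pil]   (third-party package, assumed available)
        import qrcode

        def make_identity_tag(person_id: str, base_url: str = "https://example.org/id/") -> None:
            """Render a QR Code Identity Tag linking to a hypothetical record page.
            The QR code only carries the URL; what a scanner sees behind that URL
            is decided server-side, according to authorization level."""
            img = qrcode.make(base_url + person_id)   # returns a PIL-backed image
            img.save(f"tag_{person_id}.png")

        make_identity_tag("TR-000174")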

  17. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solution (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solution (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficients. After that, we provide a solution of Burgers' equation in a novel setup. Proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of tests' complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. Examples in the suite start by testing a simple case of unidirectional advection, move to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into an ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors which were not detectable with routine verification techniques.
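
    A mesh-convergence study of the kind referred to above is usually summarised by the observed order of accuracy computed from errors on successively refined grids. The following sketch shows that standard calculation with illustrative numbers; it is not code from the test suite.

        import math

        def observed_order(err_coarse: float, err_fine: float, refinement: float) -> float:
            """Observed order of accuracy from errors on two grids, where the
            fine grid has spacing `refinement` times smaller:
                p = log(err_coarse / err_fine) / log(refinement)
            """
            return math.log(err_coarse / err_fine) / math.log(refinement)

        # Illustrative numbers: halving the grid spacing drops the L2 error
        # from 4.0e-3 to 1.1e-3, giving p ~ 1.9, close to second order.
        print(observed_order(4.0e-3, 1.1e-3, 2.0))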

  18. A mechanistic explanation of popularity: genes, rule breaking, and evocative gene-environment correlations.

    PubMed

    Burt, Alexandra

    2009-04-01

    Previous work has suggested that the serotonergic system plays a key role in "popularity" or likeability. A polymorphism within the 5-HT2A serotonin receptor gene (-G1438A) has also been associated with popularity, suggesting that genes may predispose individuals to particular social experiences. However, because genes cannot code directly for others' reactions, any legitimate association should be mediated via the individual's behavior (i.e., genes-->behaviors-->social consequences), a phenomenon referred to as an evocative gene-environment correlation (rGE). The current study aimed to identify one such mediating behavior. The author focused on rule breaking given its prior links to both the serotonergic system and to increased popularity during adolescence. Two samples of previously unacquainted late-adolescent boys completed a peer-based interaction paradigm designed to assess their popularity. Analyses revealed that rule breaking partially mediated the genetic effect on popularity, thereby furthering our understanding of the biological mechanisms that underlie popularity. Moreover, the present results represent the first meaningfully explicated evidence that genes predispose individuals not only to particular behaviors but also to the social consequences of those behaviors. (c) 2009 APA, all rights reserved.

  19. System Design Description for the TMAD Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finfrock, S.H.

    This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.

  20. Towards a complete map of the human long non-coding RNA transcriptome.

    PubMed

    Uszczynska-Ratajczak, Barbara; Lagarde, Julien; Frankish, Adam; Guigó, Roderic; Johnson, Rory

    2018-05-23

    Gene maps, or annotations, enable us to navigate the functional landscape of our genome. They are a resource upon which virtually all studies depend, from single-gene to genome-wide scales and from basic molecular biology to medical genetics. Yet present-day annotations suffer from trade-offs between quality and size, with serious but often unappreciated consequences for downstream studies. This is particularly true for long non-coding RNAs (lncRNAs), which are poorly characterized compared to protein-coding genes. Long-read sequencing technologies promise to improve current annotations, paving the way towards a complete annotation of lncRNAs expressed throughout a human lifetime.

  1. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

    A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only hull girder modes are presented in this paper. Code requirements for other modes will be presented in a future publication. A specific provision of the code will be a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence of failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.

  2. Error-correction coding for digital communications

    NASA Astrophysics Data System (ADS)

    Clark, G. C., Jr.; Cain, J. B.

    This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.

  3. Health system's response for physician workforce shortages and the upcoming crisis in Ethiopia: a grounded theory research.

    PubMed

    Assefa, Tsion; Haile Mariam, Damen; Mekonnen, Wubegzier; Derbew, Miliard

    2017-12-28

    A rapid transition from a severe physician workforce shortage to massive production to meet physician workforce demand presents the Ethiopian health care system with a variety of challenges. Therefore, this study explored how the health system's response to the physician workforce shortage, the so-called flooding strategy, was viewed by different stakeholders. The study adopted the grounded theory research approach to explore the causes, contexts, and consequences (at present, in the short and long term) of massive medical student admission to the medical schools on patient care, the medical education workforce, and medical students. Forty-three purposively selected individuals were involved in a semi-structured interview from different settings: academics, the government health care system, and non-governmental organizations (NGOs). Data coding, classification, and categorization were assisted using ATLAS.ti qualitative data analysis scientific software. In relation to the health system response, eight main categories emerged: (1) reasons for rapid medical education expansion; (2) preparation for medical education expansion; (3) the consequences of rapid medical education expansion; (4) massive production/flooding as a human resources for health (HRH) development strategy; (5) cooperation on HRH development; (6) HRH strategies and planning; (7) capacity of the system for HRH development; and (8) institutional continuity for HRH development. The demand for a physician workforce and gaining political acceptance were cited as the main reasons which motivated the government to scale up medical education rapidly. However, the rapid expansion was beyond the capacity of medical schools' human resources, patient flow, and size of teaching hospitals. As a result, there were potential adverse consequences for clinical service delivery and the teaching-learning process at present: "the number should consider the available resources such as number of classrooms, patient flows, medical teachers, library…". In the future, it was anticipated to end in a surplus in the physician workforce, unemployment, inefficiency, and pressure on the system: "…flooding may seem a good strategy superficially but it is a dangerous strategy. It may put the country into crisis, even if good physicians are being produced; they may not get a place where to go…". Massive physician workforce production which is not closely aligned with the training capacity of the medical schools and the absorption of graduates into the health system will end up in unanticipated adverse consequences.

  4. Cracking the code: the accuracy of coding shoulder procedures and the repercussions.

    PubMed

    Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M

    2013-05-01

    Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address reasons behind coding errors for shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that the correct primary diagnosis was assigned in 54 patients (54%), and only 7 patients (7%) had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2, p < 0.0001) and the correct procedure code (odds ratio 310.0, p < 0.0001). Using the proforma resulted in a £28,562 increase in revenue for the 100 patients evaluated relative to the income generated from the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.

  5. Examining the relationship between comprehension and production processes in code-switched language

    PubMed Central

    Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.

    2016-01-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049

  6. Examining the relationship between comprehension and production processes in code-switched language.

    PubMed

    Guzzardo Tamargo, Rosa E; Valdés Kroff, Jorge R; Dussias, Paola E

    2016-08-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish-English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants' comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension.

  7. Novel microscopy-based screening method reveals regulators of contact-dependent intercellular transfer

    PubMed Central

    Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja

    2015-01-01

    Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screening using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicates that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723

  8. FIR Filter of DS-CDMA UWB Modem Transmitter

    NASA Astrophysics Data System (ADS)

    Kang, Kyu-Min; Cho, Sang-In; Won, Hui-Chul; Choi, Sang-Sung

    This letter presents low-complexity digital pulse shaping filter structures of a direct sequence code division multiple access (DS-CDMA) ultra wide-band (UWB) modem transmitter with a ternary spreading code. The proposed finite impulse response (FIR) filter structures using a look-up table (LUT) have the effect of saving the amount of memory by about 50% to 80% in comparison to the conventional FIR filter structures, and consequently are suitable for a high-speed parallel data process.
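
    The memory saving comes from replacing per-sample multiply-accumulate operations with a look-up table indexed by the most recent ternary chips. The sketch below illustrates the idea with made-up taps and a 4-tap window; the actual filter structures and word lengths of the letter are not reproduced here.

        import itertools
        import numpy as np

        TAPS = np.array([0.25, 0.75, 0.75, 0.25])   # illustrative pulse-shaping taps
        CHIPS = (-1, 0, 1)                           # ternary spreading chips
        L = len(TAPS)

        # Precompute the filter output for every possible window of L ternary chips:
        # 3**4 = 81 table entries replace per-sample multiply-accumulate operations.
        LUT = {w: float(np.dot(TAPS, w)) for w in itertools.product(CHIPS, repeat=L)}

        def fir_lut(chips):
            """Pulse-shape a ternary chip stream using table look-ups only."""
            padded = [0] * (L - 1) + list(chips)
            # window w[k] = x[n-k], so LUT[w] equals sum_k h[k] * x[n-k]
            return [LUT[tuple(padded[n + L - 1 - k] for k in range(L))]
                    for n in range(len(chips))]

        chips = [1, -1, 0, 1, 1, 0, -1]
        assert np.allclose(fir_lut(chips), np.convolve(chips, TAPS)[:len(chips)])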

  9. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.
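
    The concatenation principle, an inner code whose byte-oriented decisions feed an outer code operating on bytes, can be illustrated with a deliberately simple toy. The sketch below substitutes a 3x repetition inner code and a single XOR parity byte for the unit-memory convolutional and Reed-Solomon codes of the abstract, so it shows only the structure, not the proposed system.

        def inner_encode(data: bytes) -> bytes:
            """Toy 'inner code': repeat every byte three times."""
            return bytes(b for byte in data for b in (byte, byte, byte))

        def inner_decode(coded: bytes) -> bytes:
            """Byte-oriented inner decoding: bitwise majority vote over each triple."""
            out = bytearray()
            for i in range(0, len(coded), 3):
                a, b, c = coded[i:i + 3]
                out.append((a & b) | (a & c) | (b & c))   # per-bit majority
            return bytes(out)

        def outer_encode(data: bytes) -> bytes:
            """Toy 'outer code': append one XOR parity byte (detection only)."""
            parity = 0
            for byte in data:
                parity ^= byte
            return data + bytes([parity])

        def outer_check(data: bytes) -> bool:
            parity = 0
            for byte in data:
                parity ^= byte
            return parity == 0

        msg = b"unit-memory"
        coded = inner_encode(outer_encode(msg))
        # flip a few bits in the channel; the inner majority vote repairs them
        corrupted = bytearray(coded); corrupted[4] ^= 0x10; corrupted[20] ^= 0x01
        decoded = inner_decode(bytes(corrupted))
        assert outer_check(decoded) and decoded[:-1] == msg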

  10. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L. N.

    1976-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte oriented unit memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  11. A proto-architecture for innate directionally selective visual maps.

    PubMed

    Adams, Samantha V; Harris, Chris M

    2014-01-01

    Self-organizing artificial neural networks are a popular tool for studying visual system development, in particular the cortical feature maps present in real systems that represent properties such as ocular dominance (OD), orientation-selectivity (OR) and direction selectivity (DS). They are also potentially useful in artificial systems, for example robotics, where the ability to extract and learn features from the environment in an unsupervised way is important. In this computational study we explore a DS map that is already latent in a simple artificial network. This latent selectivity arises purely from the cortical architecture without any explicit coding for DS and prior to any self-organising process facilitated by spontaneous activity or training. We find DS maps with local patchy regions that exhibit features similar to maps derived experimentally and from previous modeling studies. We explore the consequences of changes to the afferent and lateral connectivity to establish the key features of this proto-architecture that support DS.

  12. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  13. Application of CFX-10 to the Investigation of RPV Coolant Mixing in VVER Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moretti, Fabio; Melideo, Daniele; Terzuoli, Fulvio

    2006-07-01

    Coolant mixing phenomena occurring in the pressure vessel of a nuclear reactor constitute one of the main objectives of investigation by researchers concerned with nuclear reactor safety. For instance, mixing plays a relevant role in reactivity-induced accidents initiated by de-boration or boron dilution events, followed by transport of a de-borated slug into the vessel of a pressurized water reactor. Another example is constituted by temperature mixing, which may sensitively affect the consequences of a pressurized thermal shock scenario. Predictive analysis of mixing phenomena is strongly improved by the availability of computational tools able to cope with the inherent three-dimensionality of such problems, like system codes with three-dimensional capabilities, and Computational Fluid Dynamics (CFD) codes. The present paper deals with numerical analyses of coolant mixing in the reactor pressure vessel of a VVER-1000 reactor, performed by the ANSYS CFX-10 CFD code. In particular, the 'swirl' effect that has been observed to take place in the downcomer of this kind of reactor has been addressed, with the aim of assessing the capability of the codes to predict that effect, and to understand the reasons for its occurrence. Results have been compared against experimental data from the V1000CT-2 Benchmark. Moreover, a boron mixing problem has been investigated, under the hypothesis that a de-borated slug, transported by natural circulation, enters the vessel. Sensitivity analyses have been conducted on some geometrical features, model parameters and boundary conditions. (authors)

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dokhane, A.; Canepa, S.; Ferroukhi, H.

    For stability analyses of the Swiss operating Boiling-Water-Reactors (BWRs), the methodology employed and validated so far at the Paul Scherrer Inst. (PSI) was based on the RAMONA-3 code with a hybrid upstream static lattice/core analysis approach using CASMO-4 and PRESTO-2. More recently, steps were undertaken towards a new methodology based on the SIMULATE-3K (S3K) code for the dynamical analyses combined with the CMSYS system relying on the CASMO/SIMULATE-3 suite of codes and which was established at PSI to serve as framework for the development and validation of reference core models of all the Swiss reactors and operated cycles. This paper presents a first validation of the new methodology on the basis of a benchmark recently organised by a Swiss utility and including the participation of several international organisations with various codes/methods. Now in parallel, a transition from CASMO-4E (C4E) to CASMO-5M (C5M) as basis for the CMSYS core models was also recently initiated at PSI. Consequently, it was considered adequate to address the impact of this transition both for the steady-state core analyses as well as for the stability calculations and to thereby achieve an integral approach for the validation of the new S3K methodology. Therefore, a comparative assessment of C4E versus C5M is also presented in this paper with particular emphasis on the void coefficients and their impact on the downstream stability analysis results. (authors)

  15. Improving the safety of street-vended food.

    PubMed

    Moy, G; Hazzard, A; Käferstein, F

    1997-01-01

    An integrated plan of action for improving street food involving health and other regulatory authorities, vendors and consumers should address not only food safety, but also environmental health management, including consideration of inadequate sanitation and waste management, possible environmental pollution, congestion and disturbances to traffic. However, WHO cautions that, in view of their importance in the diets of urban populations, particularly the socially disadvantaged, every effort should be made to preserve the benefits provided by varied, inexpensive and often nutritious street food. Therefore, authorities concerned with street food management must balance efforts aimed at reducing the negative aspects on the environment with the benefits of street food and its important role in the community. Health authorities charged with responsibility for food safety control should match risk management action to the level of assessed risk. The rigorous application of codes and enforcement of regulations more suited to larger and permanent food service establishments is unlikely to be justifiable. Such rigorous application of codes and regulations may result in disappearance of the trade with consequent aggravation of hunger and malnutrition. Moreover, most codes and regulations have not been based on any systematic identification and assessment of health hazards associated with different types of foods and operations as embodied in the HACCP approach which has been recognized by Codex as the most cost-effective means for promoting food safety. WHO encourages the development of regulations that empower vendors to take greater responsibility for the preparation of safe food, and of codes of practice based on the HACCP system.

  16. Resurrection of DNA Function In Vivo from an Extinct Genome

    PubMed Central

    Pask, Andrew J.; Behringer, Richard R.; Renfree, Marilyn B.

    2008-01-01

    There is a burgeoning repository of information available from ancient DNA that can be used to understand how genomes have evolved and to determine the genetic features that defined a particular species. To assess the functional consequences of changes to a genome, a variety of methods are needed to examine extinct DNA function. We isolated a transcriptional enhancer element from the genome of an extinct marsupial, the Tasmanian tiger (Thylacinus cynocephalus or thylacine), obtained from 100-year-old ethanol-fixed tissues from museum collections. We then examined the function of the enhancer in vivo. Using a transgenic approach, it was possible to resurrect DNA function in transgenic mice. The results demonstrate that the thylacine Col2A1 enhancer directed chondrocyte-specific expression in this extinct mammalian species in the same way as its orthologue does in mice. While other studies have examined extinct coding DNA function in vitro, this is the first example of the restoration of extinct non-coding DNA and examination of its function in vivo. Our method using transgenesis can be used to explore the function of regulatory and protein-coding sequences obtained from any extinct species in an in vivo model system, providing important insights into gene evolution and diversity. PMID:18493600

  17. Intricate and Cell Type-Specific Populations of Endogenous Circular DNA (eccDNA) in Caenorhabditis elegans and Homo sapiens.

    PubMed

    Shoura, Massa J; Gabdank, Idan; Hansen, Loren; Merker, Jason; Gotlib, Jason; Levene, Stephen D; Fire, Andrew Z

    2017-10-05

    Investigations aimed at defining the 3D configuration of eukaryotic chromosomes have consistently encountered an endogenous population of chromosome-derived circular genomic DNA, referred to as extrachromosomal circular DNA (eccDNA). While the production, distribution, and activities of eccDNAs remain understudied, eccDNA formation from specific regions of the linear genome has profound consequences on the regulatory and coding capabilities for these regions. Here, we define eccDNA distributions in Caenorhabditis elegans and in three human cell types, utilizing a set of DNA topology-dependent approaches for enrichment and characterization. The use of parallel biophysical, enzymatic, and informatic approaches provides a comprehensive profiling of eccDNA robust to isolation and analysis methodology. Results in human and nematode systems provide quantitative analysis of the eccDNA loci at both unique and repetitive regions. Our studies converge on and support a consistent picture, in which endogenous genomic DNA circles are present in normal physiological states, and in which the circles come from both coding and noncoding genomic regions. Prominent among the coding regions generating DNA circles are several genes known to produce a diversity of protein isoforms, with mucin proteins and titin as specific examples. Copyright © 2017 Shoura et al.

  18. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
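
    The difference between the two sampling schemes compared in the study can be illustrated with a small sketch. The toy response function below stands in for the MACCS2 consequence results and is purely illustrative; only the sampling logic (stratified Latin hypercube versus simple random sampling) is the point.

        import numpy as np

        rng = np.random.default_rng(42)

        def srs(n: int, dims: int) -> np.ndarray:
            """Simple random sampling on the unit hypercube."""
            return rng.random((n, dims))

        def lhs(n: int, dims: int) -> np.ndarray:
            """Latin hypercube sampling: one point per stratum in every dimension."""
            strata = (np.arange(n)[:, None] + rng.random((n, dims))) / n
            for d in range(dims):
                strata[:, d] = rng.permutation(strata[:, d])
            return strata

        # Toy 'consequence' metric; in the study this role is played by MACCS2 risk results.
        f = lambda x: np.exp(x[:, 0]) + 10.0 * x[:, 1] ** 2 + x[:, 2]

        for name, sampler in (("SRS", srs), ("LHS", lhs)):
            means = [f(sampler(1000, 3)).mean() for _ in range(3)]   # three replicates
            print(name, [round(m, 4) for m in means])   # LHS replicate means typically scatter less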

  19. Arbitrariness is not enough: towards a functional approach to the genetic code.

    PubMed

    Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan

    2017-12-01

    Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently, it is not completely correct to talk about "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than the chains of nucleobases. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we will show that the actual model of the genetic code is not the only one possible, and we will propose a more appropriate model from a semiotic point of view.
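
    The "relational" definition the authors refer to is the familiar codon-to-amino-acid mapping. A few entries of the standard genetic code sketch that relation below; the table is deliberately truncated, and the point of the paper is precisely that this mapping alone says nothing about the function of the resulting protein.

        # A few entries of the standard genetic code (RNA codons -> amino acids).
        CODON_TABLE = {
            "AUG": "Met",                      # also the canonical start codon
            "UUU": "Phe", "UUC": "Phe",
            "GGU": "Gly", "GGC": "Gly",
            "UAA": "Stop", "UAG": "Stop", "UGA": "Stop",
        }

        def translate(rna: str) -> list:
            """Purely relational reading of a coding sequence: map each codon to
            a residue name; '?' marks codons missing from this truncated table."""
            residues = []
            for i in range(0, len(rna) - 2, 3):
                aa = CODON_TABLE.get(rna[i:i + 3], "?")
                if aa == "Stop":
                    break
                residues.append(aa)
            return residues

        print(translate("AUGUUUGGCUAA"))   # ['Met', 'Phe', 'Gly']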

  20. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the IEEE VHDL 2004 effort, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.

  1. A predictive coding account of MMN reduction in schizophrenia.

    PubMed

    Wacongne, Catherine

    2016-04-01

    The mismatch negativity (MMN) is thought to be an index of the automatic activation of a specialized network for active prediction and deviance detection in the auditory cortex. It is consistently reduced in schizophrenic patients and has received a lot of interest as a clinical and translational tool. The main neuronal hypothesis regarding the mechanisms leading to a reduced MMN in schizophrenic patients is a dysfunction of NMDA receptors (NMDA-R). However, this hypothesis has never been implemented in a neuronal model. In this paper, I examine the consequences of NMDA-R dysfunction in a neuronal model of MMN based on the predictive coding principle. I also investigate how predictive processes may interact with synaptic adaptation in MMN generation and examine the consequences of this interaction for the use of MMN paradigms in schizophrenia research. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Efficient preparation of large-block-code ancilla states for fault-tolerant quantum computation

    NASA Astrophysics Data System (ADS)

    Zheng, Yi-Cong; Lai, Ching-Yi; Brun, Todd A.

    2018-03-01

    Fault-tolerant quantum computation (FTQC) schemes that use multiqubit large block codes can potentially reduce the resource overhead to a great extent. A major obstacle is the requirement for a large number of clean ancilla states of different types without correlated errors inside each block. These ancilla states are usually logical stabilizer states of the data-code blocks, which are generally difficult to prepare if the code size is large. Previously, we have proposed an ancilla distillation protocol for Calderbank-Shor-Steane (CSS) codes by classical error-correcting codes. It was assumed that the quantum gates in the distillation circuit were perfect; however, in reality, noisy quantum gates may introduce correlated errors that are not treatable by the protocol. In this paper, we show that additional postselection by another classical error-detecting code can be applied to remove almost all correlated errors. Consequently, the revised protocol is fully fault tolerant and capable of preparing a large set of stabilizer states sufficient for FTQC using large block codes. At the same time, the yield rate can be boosted from O(t⁻²) to O(1) in practice for an [[n, k, d = 2t+1]] code.

  3. Non-coding landscapes of colorectal cancer

    PubMed Central

    Ragusa, Marco; Barbagallo, Cristina; Statello, Luisa; Condorelli, Angelo Giuseppe; Battaglia, Rosalia; Tamburello, Lucia; Barbagallo, Davide; Di Pietro, Cinzia; Purrello, Michele

    2015-01-01

    For two decades Vogelstein’s model has been the paradigm for describing the sequence of molecular changes within protein-coding genes that would lead to overt colorectal cancer (CRC). This model is now too simplistic in the light of recent studies, which have shown that our genome is pervasively transcribed in RNAs other than mRNAs, denominated non-coding RNAs (ncRNAs). The discovery that mutations in genes encoding these RNAs [i.e., microRNAs (miRNAs), long non-coding RNAs, and circular RNAs] are causally involved in cancer phenotypes has profoundly modified our vision of tumour molecular genetics and pathobiology. By exploiting a wide range of different mechanisms, ncRNAs control fundamental cellular processes, such as proliferation, differentiation, migration, angiogenesis and apoptosis: these data have also confirmed their role as oncogenes or tumor suppressors in cancer development and progression. The existence of a sophisticated RNA-based regulatory system, which dictates the correct functioning of protein-coding networks, has relevant biological and biomedical consequences. Different miRNAs involved in neoplastic and degenerative diseases exhibit potential predictive and prognostic properties. Furthermore, the key roles of ncRNAs make them very attractive targets for innovative therapeutic approaches. Several recent reports have shown that ncRNAs can be secreted by cells into the extracellular environment (i.e., blood and other body fluids): this suggests the existence of extracellular signalling mechanisms, which may be exploited by cells in physiology and pathology. In this review, we will summarize the most relevant issues on the involvement of cellular and extracellular ncRNAs in disease. We will then specifically describe their involvement in CRC pathobiology and their translational applications to CRC diagnosis, prognosis and therapy. PMID:26556998

  4. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple, yet adequate for the consequence analysis required by the Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the individual models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from the public literature or in-house developed software and tailored to be easy to use and fast to run while still providing realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by the Italian authorities to verify the Safety Reports for LPG installations. A brief description of the theoretical basis of each model implemented in Atlantide and an example application are included in the paper.

  5. Multi-phase model development to assess RCIC system capabilities under severe accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirkland, Karen Vierow; Ross, Kyle; Beeny, Bradley

    The Reactor Core Isolation Cooling (RCIC) System is a safety-related system that provides makeup water for core cooling of some Boiling Water Reactors (BWRs) with a Mark I containment. The RCIC System consists of a steam-driven Terry turbine that powers a centrifugal, multi-stage pump for providing water to the reactor pressure vessel. The Fukushima Dai-ichi accidents demonstrated that the RCIC System can play an important role under accident conditions in removing core decay heat. The unexpectedly sustained, good performance of the RCIC System in the Fukushima reactor demonstrates, firstly, that its capabilities are not well understood, and secondly, that the system has high potential for extended core cooling in accident scenarios. Better understanding and analysis tools would allow for more options to cope with a severe accident situation and to reduce the consequences. The objectives of this project were to develop physics-based models of the RCIC System, incorporate them into a multi-phase code and validate the models. This Final Technical Report details the progress throughout the project duration and the accomplishments.

  6. Channel coding in the space station data system network

    NASA Technical Reports Server (NTRS)

    Healy, T.

    1982-01-01

    A detailed discussion of the use of channel coding for error correction, privacy/secrecy, channel separation, and synchronization is presented. Channel coding, in one form or another, is an established and common element in data systems. No analysis and design of a major new system would fail to consider ways in which channel coding could make the system more effective. The presence of channel coding on TDRS, Shuttle, the Advanced Communication Technology Satellite Program system, the JSC-proposed Space Operations Center, and the proposed 30/20 GHz Satellite Communication System strongly supports the requirement for the utilization of coding for the communications channel. The designers of the space station data system have to consider the use of channel coding.

  7. Energy-efficient neural information processing in individual neurons and neuronal networks.

    PubMed

    Yu, Lianchun; Yu, Yuguo

    2017-11-01

    Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to the energy-efficient neural code for processing input signals. These factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy costs and space. Individual neurons within a network may encode independent stimulus components to allow a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.

  8. Assessment of the effects and limitations of the 1998 to 2008 Abbreviated Injury Scale map using a large population-based dataset.

    PubMed

    Palmer, Cameron S; Franklyn, Melanie

    2011-01-07

    Trauma systems should consistently monitor a given trauma population over a period of time. The Abbreviated Injury Scale (AIS) and derived scores such as the Injury Severity Score (ISS) are commonly used to quantify injury severities in trauma registries. To reflect contemporary trauma management and treatment, the most recent version of the AIS (AIS08) contains many codes which differ in severity from their equivalents in the earlier 1998 version (AIS98). Consequently, the adoption of AIS08 may impede comparisons between data coded using different AIS versions. It may also affect the number of patients classified as major trauma. The entire AIS98-coded injury dataset of a large population based trauma registry was retrieved and mapped to AIS08 using the currently available AIS98-AIS08 dictionary map. The percentage of codes which had increased or decreased in severity, or could not be mapped, was examined in conjunction with the effect of these changes to the calculated ISS. The potential for free text information accompanying AIS coding to improve the quality of AIS mapping was explored. A total of 128280 AIS98-coded injuries were evaluated in 32134 patients, 15471 patients of whom were classified as major trauma. Although only 4.5% of dictionary codes decreased in severity from AIS98 to AIS08, this represented almost 13% of injuries in the registry. In 4.9% of patients, no injuries could be mapped. ISS was potentially unreliable in one-third of patients, as they had at least one AIS98 code which could not be mapped. Using AIS08, the number of patients classified as major trauma decreased by between 17.3% and 30.3%. Evaluation of free text descriptions for some injuries demonstrated the potential to improve mapping between AIS versions. Converting AIS98-coded data to AIS08 results in a significant decrease in the number of patients classified as major trauma. Many AIS98 codes are missing from the existing AIS map, and across a trauma population the AIS08 dataset estimates which it produces are of insufficient quality to be used in practice. However, it may be possible to improve AIS98 to AIS08 mapping to the point where it is useful to established registries.
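
    The ISS referred to above is derived from AIS severities by a fixed rule: the highest severity in each of the three most severely injured body regions is squared and summed, with any severity of 6 conventionally setting ISS to 75. The sketch below shows that calculation and how a severity re-mapping of the kind produced by a dictionary map can move a patient across a major-trauma threshold; the injury lists and the ISS > 15 threshold are illustrative assumptions, not registry data.

        def iss(injuries):
            """Injury Severity Score from (body_region, ais_severity) pairs:
            sum of squares of the highest severity in the three most severely
            injured regions; any severity of 6 sets ISS to the maximum of 75."""
            worst = {}
            for region, severity in injuries:
                worst[region] = max(worst.get(region, 0), severity)
            if any(s == 6 for s in worst.values()):
                return 75
            top3 = sorted(worst.values(), reverse=True)[:3]
            return sum(s * s for s in top3)

        # Hypothetical effect of a severity re-mapping (AIS98 -> AIS08 style):
        # the same injuries, one of which drops a severity level after mapping.
        ais98 = [("head", 4), ("chest", 2)]
        ais08 = [("head", 3), ("chest", 2)]
        print(iss(ais98), iss(ais08))   # 20 vs 13: crosses a commonly used ISS > 15 threshold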

  9. Assessment of the effects and limitations of the 1998 to 2008 Abbreviated Injury Scale map using a large population-based dataset

    PubMed Central

    2011-01-01

    Background Trauma systems should consistently monitor a given trauma population over a period of time. The Abbreviated Injury Scale (AIS) and derived scores such as the Injury Severity Score (ISS) are commonly used to quantify injury severities in trauma registries. To reflect contemporary trauma management and treatment, the most recent version of the AIS (AIS08) contains many codes which differ in severity from their equivalents in the earlier 1998 version (AIS98). Consequently, the adoption of AIS08 may impede comparisons between data coded using different AIS versions. It may also affect the number of patients classified as major trauma. Methods The entire AIS98-coded injury dataset of a large population-based trauma registry was retrieved and mapped to AIS08 using the currently available AIS98-AIS08 dictionary map. The percentage of codes which had increased or decreased in severity, or could not be mapped, was examined in conjunction with the effect of these changes on the calculated ISS. The potential for free text information accompanying AIS coding to improve the quality of AIS mapping was explored. Results A total of 128280 AIS98-coded injuries were evaluated in 32134 patients, of whom 15471 were classified as major trauma. Although only 4.5% of dictionary codes decreased in severity from AIS98 to AIS08, this represented almost 13% of injuries in the registry. In 4.9% of patients, no injuries could be mapped. ISS was potentially unreliable in one-third of patients, as they had at least one AIS98 code which could not be mapped. Using AIS08, the number of patients classified as major trauma decreased by between 17.3% and 30.3%. Evaluation of free text descriptions for some injuries demonstrated the potential to improve mapping between AIS versions. Conclusions Converting AIS98-coded data to AIS08 results in a significant decrease in the number of patients classified as major trauma. Many AIS98 codes are missing from the existing AIS map, and across a trauma population the AIS08 dataset estimates it produces are of insufficient quality to be used in practice. However, it may be possible to improve AIS98 to AIS08 mapping to the point where it is useful to established registries. PMID:21214906

  10. Imitation learning based on an intrinsic motivation mechanism for efficient coding

    PubMed Central

    Triesch, Jochen

    2013-01-01

    A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations and imitation. PMID:24204350
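
    The reward computation described above can be pictured schematically: the learner fits a simple sensory model (here, a two-component principal subspace) to tutor observations and then rewards its own actions by how well that model encodes their sensory consequences. This is an illustrative simplification, not the article's model; all data, dimensions, and vectors below are invented.

        # Schematic only: intrinsic reward = negative reconstruction error under a
        # sensory model learned from tutor demonstrations (hypothetical data).
        import numpy as np

        rng = np.random.default_rng(0)
        tutor_obs = rng.normal(size=(200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.1, 0.1])

        # "Efficient code": top-2 principal components of the tutor's sensory data.
        _, _, vt = np.linalg.svd(tutor_obs - tutor_obs.mean(0), full_matrices=False)
        basis = vt[:2]

        def intrinsic_reward(sensory):
            recon = (sensory @ basis.T) @ basis          # encode, then decode
            return -np.sum((sensory - recon) ** 2)       # easier to encode => higher reward

        tutor_like = np.array([2.5, -1.5, 0.8, 0.05, 0.0])   # resembles tutor data
        novel      = np.array([0.1, 0.0, 0.2, 2.0, -2.5])    # unlike anything demonstrated
        print(intrinsic_reward(tutor_like) > intrinsic_reward(novel))  # True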

  11. Design study of multi-imaging plate system for BNCT irradiation field at Kyoto university reactor.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Kajimoto, Tsuyoshi; Tanaka, Hiroki; Takata, Takushi; Endo, Satoru

    2016-09-01

    The converter configuration for a multi-imaging plate system was investigated for the application of quality assurance of the irradiation field profile in boron neutron capture therapy. This was performed by simulation calculations using the PHITS code for the fields at the Heavy Water Neutron Irradiation Facility of Kyoto University Reactor. The converter constituents investigated were carbon for gamma rays, and polyethylene with and without LiF at varied (6)Li concentration for thermal, epithermal, and fast neutrons. Consequently, potential combinations of the converters were found for two components (gamma rays and thermal neutrons) in the standard thermal neutron mode, and for three components (gamma rays, epithermal neutrons, and thermal or fast neutrons) in the standard mixed or epithermal neutron modes, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Eigenmode multiplexing with SLM for volume holographic data storage

    NASA Astrophysics Data System (ADS)

    Chen, Guanghao; Miller, Bo E.; Takashima, Yuzuru

    2017-08-01

    The cavity supports orthogonal reference beam families as its eigenmodes while enhancing the reference beam power. Such orthogonal eigenmodes are used as an additional degree of freedom to multiplex data pages, consequently increasing storage densities for volume Holographic Data Storage Systems (HDSS) when the maximum number of multiplexed data pages is limited by geometrical factors. Image-bearing holograms are multiplexed by orthogonal phase code multiplexing via Hermite-Gaussian eigenmodes in a Fe:LiNbO3 medium with a 532 nm laser at multiple Bragg angles by using Liquid Crystal on Silicon (LCOS) spatial light modulators (SLMs) in the reference arms. A total of nine holograms are recorded using three angular positions and three eigenmodes.

  13. Treatment Recommendation Actions, Contingencies, and Responses: An Introduction.

    PubMed

    Stivers, Tanya; Barnes, Rebecca K

    2017-08-21

    In the era of patient participation in health care decision making, we know surprisingly little about the ways in which treatment recommendations are made, the contexts that shape their formulation, and the consequences of these formulations. In this article, we introduce a systematic collective investigation of how recommendations for medications are made and responded to in primary versus secondary care, in the US versus the UK, and in contexts where the medication was over the counter versus by prescription. This article provides an overview of the coding system used in this project, including a description of what constitutes a recommendation, the primary action types clinicians use for recommendations, and the types of responses patients provide to recommendations.

  14. Information technology and medication safety: what is the benefit?

    PubMed Central

    Kaushal, R; Bates, D

    2002-01-01

    Medication errors occur frequently and have significant clinical and financial consequences. Several types of information technologies can be used to decrease rates of medication errors. Computerized physician order entry with decision support significantly reduces serious inpatient medication error rates in adults. Other available information technologies that may prove effective for inpatients include computerized medication administration records, robots, automated pharmacy systems, bar coding, "smart" intravenous devices, and computerized discharge prescriptions and instructions. In outpatients, computerization of prescribing and patient oriented approaches such as personalized web pages and delivery of web based information may be important. Public and private mandates for information technology interventions are growing, but further development, application, evaluation, and dissemination are required. PMID:12486992

  15. Field estimates of gravity terrain corrections and Y2K-compatible method to convert from gravity readings with multiple base stations to tide- and long-term drift-corrected observations

    USGS Publications Warehouse

    Plouff, Donald

    2000-01-01

    Gravity observations are directly made or are obtained from other sources by the U.S. Geological Survey in order to prepare maps of the anomalous gravity field and consequently to interpret the subsurface distribution of rock densities and associated lithologic or geologic units. Observations are made in the field with gravity meters at new locations and at reoccupations of previously established gravity "stations." This report illustrates an interactively-prompted series of steps needed to convert gravity "readings" to values that are tied to established gravity datums and includes computer programs to implement those steps. Inasmuch as individual gravity readings have small variations, gravity-meter (instrument) drift may not be smoothly variable, and accommodations may be needed for ties to previously established stations, the reduction process is iterative. Decision-making by the program user is prompted by lists of best values and graphical displays. Notes about irregularities of topography, which affect the value of observed gravity but are not shown in sufficient detail on topographic maps, must be recorded in the field. This report illustrates ways to record field notes (distances, heights, and slope angles) and includes computer programs to convert field notes to gravity terrain corrections. This report includes approaches that may serve as models for other applications, for example: portrayal of system flow; style of quality control to document and validate computer applications; lack of dependence on proprietary software except source code compilation; method of file-searching with a dwindling list; interactive prompting; computer code to write directly in the PostScript (Adobe Systems Incorporated) printer language; and highlighting the four-digit year on the first line of time-dependent data sets for assured Y2K compatibility. Computer source codes provided are written in the Fortran scientific language. In order for the programs to operate, they first must be converted (compiled) into an executable form on the user's computer. Although program testing was done in a UNIX (tradename of American Telephone and Telegraph Company) computer environment, it is anticipated that only a system-dependent date-and-time function may need to be changed for adaptation to other computer platforms that accept standard Fortran code.
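
    As a simplified illustration of the drift-correction step described above, the sketch below removes a linear instrument drift estimated from two occupations of the same base station; the report's actual procedure is iterative, handles multiple base stations, and applies tide corrections separately. All station names and numbers are invented.

        # Linear drift correction between two base-station occupations (illustrative only).
        def drift_corrected(readings, base_first, base_second):
            """readings and base_*: (time_in_hours, meter_reading) pairs from one gravity meter."""
            drift_rate = (base_second[1] - base_first[1]) / (base_second[0] - base_first[0])
            return [(t, r - drift_rate * (t - base_first[0])) for t, r in readings]

        # Base station read 0.020 units higher after 5 hours -> remove 0.004 units/hour of drift.
        stations = [(1.0, 1234.567), (3.0, 1235.100)]
        print(drift_corrected(stations, (0.0, 1000.000), (5.0, 1000.020)))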

  16. Coupling between a multi-physics workflow engine and an optimization framework

    NASA Astrophysics Data System (ADS)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content in each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among a number of algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single and multiple criteria optimization. In addition to their robustness, GAs can handle invalid data which may appear during the optimization. Consequently, GAs work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and the evaluation of large samples. A test has shown a good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), which is a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
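
    A minimal sketch of the socket-based exchange between the optimizer and the workflow engine is given below, with a socket pair standing in for the two frameworks running in one process; the message format and the placeholder evaluation are invented for illustration and are not SYCOMORE's or URANIE's actual interface.

        # Toy exchange: optimizer sends a candidate design, workflow side replies with
        # figures of merit (both ends shown in one process via a socket pair).
        import json, socket

        opt_end, wf_end = socket.socketpair()

        # Optimizer side: propose a candidate design point (hypothetical parameters).
        candidate = {"major_radius_m": 9.0, "magnetic_field_T": 5.5}
        opt_end.sendall((json.dumps(candidate) + "\n").encode())

        # Workflow-engine side: read it, evaluate a trivial stand-in model, reply.
        received = json.loads(wf_end.makefile().readline())
        merit = {"net_electric_MW": 80.0 * received["magnetic_field_T"]}  # placeholder, not real physics
        wf_end.sendall((json.dumps(merit) + "\n").encode())

        # Optimizer side: collect the evaluation that the GA would use as a fitness value.
        print(json.loads(opt_end.makefile().readline()))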

  17. The cost of implementing inpatient bar code medication administration.

    PubMed

    Sakowski, Julie Ann; Ketchel, Alan

    2013-02-01

    To calculate the costs associated with implementing and operating an inpatient bar-code medication administration (BCMA) system in the community hospital setting and to estimate the cost per harmful error prevented. This is a retrospective, observational study. Costs were calculated from the hospital perspective and a cost-consequence analysis was performed to estimate the cost per preventable adverse drug event averted. Costs were collected from financial records and key informant interviews at 4 not-for-profit community hospitals. Costs included direct expenditures on capital, infrastructure, additional personnel, and the opportunity costs of time for existing personnel working on the project. The number of adverse drug events prevented using BCMA was estimated by multiplying the number of doses administered using BCMA by the rate of harmful errors prevented by interventions in response to system warnings. Our previous work found that BCMA identified and intercepted medication errors in 1.1% of doses administered, 9% of which potentially could have resulted in lasting harm. The cost of implementing and operating BCMA including electronic pharmacy management and drug repackaging over 5 years is $40,000 (range: $35,600 to $54,600) per BCMA-enabled bed and $2000 (range: $1800 to $2600) per harmful error prevented. BCMA can be an effective and potentially cost-saving tool for preventing the harm and costs associated with medication errors.
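
    The cost-consequence arithmetic above can be reproduced as a back-of-envelope sketch; only the 1.1% intercept rate, the 9% harm share, and the $40,000-per-bed figure come from the abstract, while the dose count and bed count below are hypothetical.

        # Hypothetical 5-year volumes; rates and per-bed cost taken from the abstract above.
        doses_administered = 5_000_000      # invented dose count over 5 years
        beds               = 250            # invented number of BCMA-enabled beds
        intercept_rate     = 0.011          # errors intercepted per dose (from abstract)
        harmful_share      = 0.09           # intercepted errors with potential lasting harm (from abstract)
        cost_per_bed       = 40_000         # USD per BCMA-enabled bed over 5 years (from abstract)

        harmful_errors_prevented = doses_administered * intercept_rate * harmful_share
        print(round(cost_per_bed * beds / harmful_errors_prevented))  # roughly 2000 USD per harmful error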

  18. Olfaction

    PubMed Central

    Pinto, Jayant M.

    2011-01-01

    Olfaction represents an ancient, evolutionarily critical physiologic system. In humans, chemosensation mediates safety, nutrition, sensation of pleasure, and general well-being. Factors that affect human olfaction include structural aspects of the nasal cavity that can modulate airflow and therefore odorant access to the olfactory cleft, and inflammatory disease, which can affect both airflow and olfactory nerve function. After signals are generated, olfactory information is processed and coded in the olfactory bulb and disseminated to several areas in the brain. The discovery of olfactory receptors by Axel and Buck sparked greater understanding of the molecular basis of olfaction. However, the precise mechanisms used by this system are still under great scrutiny due to the complexity of understanding how an enormous number of chemically diverse odorant molecules are coded into signals understood by the brain. Additionally, it has been challenging to dissect olfactory sensation due to the multiple areas of the brain that receive and modulate this information. Consequently, our knowledge of olfactory dysfunction in humans remains primitive. Aging represents the major cause of loss of smell, although a number of clinical and environmental factors are thought to affect chemosensory function. Treatment options focus on reducing sinonasal inflammation when present, ruling out other treatable causes, and counseling patients on safety measures. PMID:21364221

  19. Multiphase, multi-electrode Joule heat computations for glass melter and in situ vitrification simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowery, P.S.; Lessor, D.L.

    Waste glass melter and in situ vitrification (ISV) processes represent the combination of electrical, thermal, and fluid flow phenomena to produce a stable waste-form product. Computational modeling of the thermal and fluid flow aspects of these processes provides a useful tool for assessing the potential performance of proposed system designs. These computations can be performed at a fraction of the cost of experiment. Consequently, computational modeling of vitrification systems can also provide an economical means for assessing the suitability of a proposed process application. The computational model described in this paper employs finite difference representations of the basic continuum conservation laws governing the thermal, fluid flow, and electrical aspects of the vitrification process -- i.e., conservation of mass, momentum, energy, and electrical charge. The resulting code is a member of the TEMPEST family of codes developed at the Pacific Northwest Laboratory (operated by Battelle for the US Department of Energy). This paper provides an overview of the numerical approach employed in TEMPEST. In addition, results from several TEMPEST simulations of sample waste glass melter and ISV processes are provided to illustrate the insights to be gained from computational modeling of these processes. 3 refs., 13 figs.
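
    In simplified form, the electrical-thermal coupling described above amounts to solving a charge-conservation equation for the electric potential and feeding the resulting Joule heat into the energy equation; this is an illustrative textbook form only, and TEMPEST's full multiphase, multi-electrode formulation contains considerably more physics.

        % Simplified governing relations for Joule-heated melt flow (illustrative form only)
        \nabla \cdot \left( \sigma \nabla \phi \right) = 0, \qquad
        Q_J = \sigma \, \lvert \nabla \phi \rvert^{2}, \qquad
        \rho c_p \left( \frac{\partial T}{\partial t} + \mathbf{u} \cdot \nabla T \right)
            = \nabla \cdot \left( k \nabla T \right) + Q_J .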

  20. Why language really is not a communication system: a cognitive view of language evolution

    PubMed Central

    Reboul, Anne C.

    2015-01-01

    While most evolutionary scenarios for language see it as a communication system with consequences on the language-ready brain, there are major difficulties for such a view. First, language has a core combination of features—semanticity, discrete infinity, and decoupling—that makes it unique among communication systems and that raises deep problems for the view that it evolved for communication. Second, extant models of communication systems—the code model of communication (Millikan, 2005) and the ostensive model of communication (Scott-Phillips, 2015) cannot account for language evolution. I propose an alternative view, according to which language first evolved as a cognitive tool, following Fodor’s (1975, 2008) Language of Thought Hypothesis, and was then exapted (externalized) for communication. On this view, a language-ready brain is a brain profoundly reorganized in terms of connectivity, allowing the human conceptual system to emerge, triggering the emergence of syntax. Language as used in communication inherited its core combination of features from the Language of Thought. PMID:26441802

  1. Mean Line Pump Flow Model in Rocket Engine System Simulation

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.; Lavelle, Thomas M.

    2000-01-01

    A mean line pump flow modeling method has been developed to provide a fast capability for modeling turbopumps of rocket engines. Based on this method, a mean line pump flow code PUMPA has been written that can predict the performance of pumps at off-design operating conditions, given the loss of the diffusion system at the design point. The pump code can model axial flow inducers, mixed-flow and centrifugal pumps. The code can model multistage pumps in series. The code features rapid input setup and computer run time, and is an effective analysis and conceptual design tool. The map generation capability of the code provides the map information needed for interfacing with a rocket engine system modeling code. The off-design and multistage modeling capabilities of the code permit parametric design space exploration of candidate pump configurations and provide pump performance data for engine system evaluation. The PUMPA code has been integrated with the Numerical Propulsion System Simulation (NPSS) code and an expander rocket engine system has been simulated. The mean line pump flow code runs as an integral part of the NPSS rocket engine system simulation and provides key pump performance information directly to the system model at all operating conditions.

  2. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.

  3. The Exchange Data Communication System based on Centralized Database for the Meat Industry

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yuichi; Taniguchi, Yoji; Terada, Shuji; Komoda, Norihisa

    We propose applying an EDI system that is based on a centralized database and supports conversion of code data to the meat industry. This system makes it possible to share exchange data on beef among enterprises from producers to retailers by using Web EDI technology. In order to convert codes efficiently, direct conversion of a sender's code to a receiver's code using a code map is employed. The system implementing this function went into operation in September 2004. Twelve enterprises, including retailers, processing traders, and wholesalers, were using the system as of June 2005. In this system, the number of code maps, which determines the introductory cost of the code conversion function, was lower than the theoretical value and close to the case in which a standard code is mediated.
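
    The direct code conversion step can be pictured as a simple keyed lookup, as in the sketch below; the map entries and identifiers are hypothetical and are not the system's actual item codes.

        # Direct sender-to-receiver code conversion via a code map (entries invented).
        code_map = {
            ("wholesaler_A", "BEEF-001"): "RETAIL-45-0001",
            ("wholesaler_A", "BEEF-002"): "RETAIL-45-0002",
        }

        def convert(sender_id, sender_code):
            try:
                return code_map[(sender_id, sender_code)]
            except KeyError:
                raise ValueError(f"no mapping registered for {sender_id}/{sender_code}")

        print(convert("wholesaler_A", "BEEF-001"))  # -> RETAIL-45-0001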

  4. Finite-element three-dimensional ground-water (FE3DGW) flow model - formulation, program listings and users' manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, S.K.; Cole, C.R.; Bond, F.W.

    1979-12-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, a third-level (high-complexity), three-dimensional finite element approach (Galerkin formulation) for saturated groundwater flow.

  5. mRNA changes in nucleus accumbens related to methamphetamine addiction in mice

    NASA Astrophysics Data System (ADS)

    Zhu, Li; Li, Jiaqi; Dong, Nan; Guan, Fanglin; Liu, Yufeng; Ma, Dongliang; Goh, Eyleen L. K.; Chen, Teng

    2016-11-01

    Methamphetamine (METH) is a highly addictive psychostimulant that elicits aberrant changes in the expression of microRNAs (miRNAs) and long non-coding RNAs (lncRNAs) in the nucleus accumbens of mice, indicating a potential role of METH in post-transcriptional regulation. To decipher the potential consequences of these post-transcriptional regulations in response to METH, we performed strand-specific RNA sequencing (ssRNA-Seq) to identify alterations in mRNA expression and their alternative splicing in the nucleus accumbens of mice following exposure to METH. METH-mediated changes in mRNAs were analyzed and correlated with previously reported changes in non-coding RNAs (miRNAs and lncRNAs) to determine the potential functions of the mRNA changes observed here and how non-coding RNAs are involved. A total of 2171 mRNAs were differentially expressed in response to METH, with functions involved in synaptic plasticity, mitochondrial energy metabolism and immune response. 309 and 589 of these mRNAs are potential targets of miRNAs and lncRNAs, respectively. In addition, METH treatment decreases mRNA alternative splicing, and there are 818 METH-specific events not observed in saline-treated mice. Our results suggest that METH-mediated addiction could be attributed to changes in miRNAs and lncRNAs and, consequently, to changes in mRNA alternative splicing and expression. In conclusion, our study reported a methamphetamine-modified nucleus accumbens transcriptome and provided non-coding RNA-mRNA interaction networks possibly involved in METH addiction.

  6. Design of ACM system based on non-greedy punctured LDPC codes

    NASA Astrophysics Data System (ADS)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method which showed good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that increasingly significant coding gain is obtained by the proposed ACM system together with higher throughput.
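
    Rate adaptation in such an ACM scheme can be sketched as a threshold rule on the estimated channel SNR, as below; the thresholds are invented for illustration, and only the 2/3-to-5/6 rate range is taken from the abstract.

        # Illustrative SNR-threshold rate selection for rate-compatible punctured LDPC codes.
        RATE_TABLE = [          # (minimum estimated SNR in dB, code rate); thresholds are invented
            (6.0, "5/6"),
            (4.5, "4/5"),
            (3.0, "3/4"),
            (0.0, "2/3"),       # lowest (mother-code) rate, most robust
        ]

        def select_rate(snr_db):
            for threshold, rate in RATE_TABLE:
                if snr_db >= threshold:
                    return rate
            return "2/3"        # fall back to the most robust rate

        print([select_rate(s) for s in (7.2, 5.0, 1.5)])  # ['5/6', '4/5', '2/3']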

  7. Visual adaptation and face perception

    PubMed Central

    Webster, Michael A.; MacLeod, Donald I. A.

    2011-01-01

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555

  8. Visual adaptation and face perception.

    PubMed

    Webster, Michael A; MacLeod, Donald I A

    2011-06-12

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces.

  9. Encrypted holographic data storage based on orthogonal-phase-code multiplexing.

    PubMed

    Heanue, J F; Bashaw, M C; Hesselink, L

    1995-09-10

    We describe an encrypted holographic data-storage system that combines orthogonal-phase-code multiplexing with a random-phase key. The system offers the security advantages of random-phase coding but retains the low cross-talk performance and the minimum code storage requirements typical in an orthogonal-phase-code-multiplexing system.
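
    The combination of orthogonal phase codes with a random-phase key can be sketched numerically: pages addressed by Walsh-type phase codes stay separable because the codes are orthogonal, while the shared random key scrambles the reference phase for anyone without it. The sketch below is conceptual only; the code length, key, and page values are invented, and the real system operates on optical fields in a holographic medium.

        # Conceptual sketch: orthogonal phase codes (Walsh rows mapped to 0/pi phases)
        # combined with a random-phase key; readout uses the conjugate of (code * key).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 8
        hadamard = np.array([[(-1) ** bin(i & j).count("1") for j in range(n)] for i in range(n)])
        codes = np.exp(1j * np.pi * (hadamard < 0))      # orthogonal phase codes (phases 0 or pi)
        key = np.exp(1j * 2 * np.pi * rng.random(n))     # random-phase key shared by all pages

        def readout(stored_pages, address):
            # stored field ~ sum_k page_k * (code_k * key); address with the conjugate reference
            field = sum(p * codes[k] * key for k, p in enumerate(stored_pages))
            return np.vdot(codes[address] * key, field).real / n

        pages = [1.0, 0.5, 0.0, 2.0] + [0.0] * 4
        print([round(readout(pages, k), 3) for k in range(4)])  # ~[1.0, 0.5, 0.0, 2.0], low cross-talk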

  10. Interframe vector wavelet coding technique

    NASA Astrophysics Data System (ADS)

    Wus, John P.; Li, Weiping

    1997-01-01

    Wavelet coding is often used to divide an image into multi-resolution wavelet coefficients which are quantized and coded. By 'vectorizing' scalar wavelet coding and combining this with vector quantization (VQ), vector wavelet coding (VWC) can be implemented. Using a finite number of states, finite-state vector quantization (FSVQ) takes advantage of the similarity between frames by incorporating memory into the video coding system. Lattice VQ eliminates the potential mismatch that could occur using pre-trained VQ codebooks. It also eliminates the need for codebook storage in the VQ process, thereby creating a more robust coding system. Therefore, by using the VWC coding method in conjunction with the FSVQ system and lattice VQ, a high-quality, very low bit rate coding system is proposed. A coding system using a simple FSVQ scheme, where the current state is determined by the previous channel symbol only, is developed. To achieve a higher degree of compression, a tree-like FSVQ system is implemented. The groupings are done in this tree-like structure from the lower subbands to the higher subbands in order to exploit the nature of subband analysis in terms of the parent-child relationship. Class A and Class B video sequences from the MPEG-4 testing evaluations are used in the evaluation of this coding method.
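
    The finite-state idea mentioned above, with the current state determined by the previous channel symbol only, can be sketched with a toy one-dimensional example; the codebooks and state rule below are invented, and a real VWC system would quantize vectors of wavelet coefficients rather than scalars.

        # Toy FSVQ sketch: each state has its own small codebook; the next state is a
        # function of the previous channel symbol only, so the decoder can follow along.
        import numpy as np

        codebooks = {
            0: np.array([-3.0, -1.0, 1.0, 3.0]),
            1: np.array([-0.5, 0.0, 0.5, 4.0]),
        }

        def fsvq_encode(samples):
            state, symbols = 0, []
            for x in samples:
                cb = codebooks[state]
                idx = int(np.argmin(np.abs(cb - x)))   # nearest codeword in the current codebook
                symbols.append(idx)
                state = 1 if idx >= 2 else 0           # next state depends only on the previous symbol
            return symbols

        print(fsvq_encode([2.8, 0.4, -0.9, 3.9]))      # [3, 2, 0, 3]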

  11. Ethics and skeptics: what lies behind ethical codes in occupational health.

    PubMed

    Guidotti, Tee L

    2005-02-01

    Ethical codes, or systems, are conditioned and their enforcement is permitted by social processes and attitudes. In occupational health, our efforts to adhere to our own ethical frameworks often are undermined by forces and interests outside the field. Failure to acknowledge the profoundly social nature of ethical codes impedes our ability to anticipate consequences, to legitimate decisions based on utility and benefit, and to find social structures that support, rather than invalidate, our view of ethical behavior. We examine three sets of social philosophies. Jane Jacobs, the visionary urban planner, has written Systems of Survival: A Dialogue on the Moral Foundations of Commerce and Politics, which is a restatement in modern terms of a critical passage in Plato's most important dialogue, the Republic. She (and Plato) postulate two major ethical systems, renamed here the "guardian system," which is characterized by loyalty, cohesiveness, and confidentiality, and the "marketplace system," which is characterized by trade, decentralization, and shared information. Occupational health, in this formulation, often runs afoul of the guardian mentality and also may be subject to inappropriate negotiation and compromise in the marketplace system. George Lakoff, a semiotician, has written Moral Politics: What Conservatives Know That Liberals Don't, which argues that there are two fundamental social paradigms based on concepts of the family. One, which he calls the Strict Father, emphasizes discipline, the positive aspects of taking risks, and the need for individuals to be self-sufficient. The other, which he calls the Nurturing Parent, emphasizes empowerment, the positive aspects of security, and the need for community and relationships. Occupational health practice violates aspects of both and therefore is supported by neither. Classical Chinese thought involved many schools of thought, including Confucianism and Legalism. It has been suggested that Confucianism provides little support for government regulation or occupational health; however, this view is questioned. Occupational health may improve its standing as a social priority by recognizing and maneuvering within social frameworks that accommodate it, rejecting social frameworks that invalidate it, and reinforcing positive cultural trends within society that support it.

  12. High rate concatenated coding systems using bandwidth efficient trellis inner codes

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1989-01-01

    High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.
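
    The erasure mechanism described above, in which decoded symbols with low Viterbi reliability are erased before errors-and-erasures Reed-Solomon decoding, can be sketched as a simple thresholding step; the symbol values, reliabilities, and threshold below are invented for illustration.

        # Conceptual sketch: flag low-reliability positions as erasures for an
        # errors-and-erasures RS decoder (which can then correct more of them than errors).
        def mark_erasures(decoded_symbols, reliabilities, threshold=0.2):
            """Return (symbols, erasure_positions) for an errors-and-erasures decoder."""
            erasures = [i for i, r in enumerate(reliabilities) if r < threshold]
            return decoded_symbols, erasures

        symbols = [17, 230, 5, 99, 42]
        reliab  = [0.9, 0.05, 0.7, 0.15, 0.8]
        print(mark_erasures(symbols, reliab))  # erasure positions [1, 3]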

  13. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  14. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  15. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  16. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  17. 42 CFR 405.512 - Carriers' procedural terminology and coding systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Carriers' procedural terminology and coding systems... Determining Reasonable Charges § 405.512 Carriers' procedural terminology and coding systems. (a) General. Procedural terminology and coding systems are designed to provide physicians and third party payers with a...

  18. A novel concatenated code based on the improved SCG-LDPC code for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xie, Ya; Wang, Lin; Huang, Sheng; Wang, Yong

    2013-01-01

    Based on the optimization and improvement of the construction method for the systematically constructed Gallager (SCG) (4, k) code, a novel SCG low density parity check (SCG-LDPC) (3969,3720) code suitable for optical transmission systems is constructed. The novel SCG-LDPC (6561,6240) code with a code rate of 95.1% is constructed by increasing the length of the SCG-LDPC (3969,3720) code, so that the code rate of LDPC codes can better meet the high requirements of optical transmission systems. A novel concatenated code is then constructed by concatenating the SCG-LDPC (6561,6240) code and the BCH (127,120) code with a code rate of 94.5%. The simulation results and analyses show that the net coding gain (NCG) of the BCH(127,120)+SCG-LDPC(6561,6240) concatenated code is respectively 2.28 dB and 0.48 dB more than those of the classic RS(255,239) code and the SCG-LDPC(6561,6240) code at a bit error rate (BER) of 10^-7.

  19. Trace-shortened Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Solomon, G.

    1994-01-01

    Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of, at most, 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.

  20. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
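
    The outer/inner interaction can be sketched as a simple control loop, shown below with toy stand-ins for the turbo iteration and the RS decoder; this is a schematic of the stopping logic only, not the authors' decoders.

        # Schematic stopping loop: try the outer decoder after every inner iteration.
        def interactive_decode(turbo_iterate, rs_decode, max_iters=8):
            soft_output = None
            for it in range(1, max_iters + 1):
                soft_output = turbo_iterate(soft_output)   # one inner turbo iteration
                ok, data = rs_decode(soft_output)          # reliability-based outer decoding
                if ok:
                    return data, it                        # early termination reduces decoding delay
            return None, max_iters                         # outer decoding never succeeded

        # Toy stand-ins just to exercise the control flow.
        def toy_turbo(prev):   # pretend "reliability" improves by one unit per iteration
            return (prev or 0) + 1
        def toy_rs(soft):      # pretend the outer code succeeds once reliability reaches 3
            return (soft >= 3), ("payload" if soft >= 3 else None)

        print(interactive_decode(toy_turbo, toy_rs))  # ('payload', 3)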

  1. Melting and reactive flow of a volatilized mantle beneath mid-ocean ridges: theory and numerical models

    NASA Astrophysics Data System (ADS)

    Keller, Tobias; Katz, Richard F.

    2015-04-01

    Laboratory experiments indicate that even small concentrations of volatiles (H2O or CO2) in the upper mantle significantly affect the silicate melting behavior [HK96,DH06]. The presence of volatiles stabilizes volatile-rich melt at high pressure, thus vastly increasing the volume of the upper mantle expected to be partially molten [H10,DH10]. These small-degree melts have important consequences for chemical differentiation and could affect the dynamics of mantle flow. We have developed theory and a numerical implementation to simulate thermo-chemically coupled magma/mantle dynamics in terms of a two-phase (rock+melt), three-component (dunite+MORB+volatilized MORB) physical model. The fluid dynamics is based on McKenzie's equations [McK84], while the thermo-chemical formulation of the system is represented by a novel disequilibrium multi-component melting model based on thermodynamic theory [RBS11]. This physical model is implemented as a parallel, two-dimensional, finite-volume code that leverages tools from the PETSc toolkit. Application of this simulation code to a mid-ocean ridge system suggests that the methodology captures the leading-order features of both hydrated and carbonated mantle melting, including deep, low-degree, volatile-rich melt formation. Melt segregation leads to continuous dynamic thermo-chemical dis-equilibration, while phenomenological reaction rates are applied to continually move the system towards re-equilibration. The simulations will be used first to characterize volatile extraction from the MOR system assuming a chemically homogeneous mantle. Subsequently, simulations will be extended to investigate the consequences of heterogeneity in lithology [KW12] and volatile content. These studies will advance our understanding of the role of volatiles in the dynamic and chemical evolution of the upper mantle. Moreover, they will help to gauge the significance of the coupling between the deep carbon cycle and the ocean/atmosphere system. References: [HK96] Hirth & Kohlstedt (1996), Earth Planet Sci Lett; [DH06] Dasgupta & Hirschmann (2006), doi:10.1038/nature04612; [H10] Hirschmann (2010), doi:10.1016/j.pepi.2009.12.003; [DH10] Dasgupta & Hirschmann (2010), doi:10.1016/j.epsl.2010.06.039; [McK84] McKenzie (1984), J Pet; [KW12] Katz & Weatherley (2012), doi:10.1016/j.epsl.2012.04.042; [RBS11] Rudge, Bercovici & Spiegelman (2011), doi:10.1111/j.1365-246X.2010.04870.x.

  2. Hybrid emergency radiation detection: a wireless sensor network application for consequence management of a radiological release

    NASA Astrophysics Data System (ADS)

    Kyker, Ronald D.; Berry, Nina; Stark, Doug; Nachtigal, Noel; Kershaw, Chris

    2004-08-01

    The Hybrid Emergency Radiation Detection (HERD) system is a rapidly deployable ad-hoc wireless sensor network for monitoring the radiation hazard associated with a radiation release. The system is designed for low power, small size, low cost, and rapid deployment in order to provide early notification and minimize exposure. The many design tradeoffs, decisions, and challenges in the implementation of this wireless sensor network design will be presented and compared to the commercial systems available. Our research into a scalable, modular architecture highlights the need for and implementation of a system-level approach that provides flexibility and adaptability for a variety of applications. This approach seeks to minimize power, provide mission-specific specialization, and provide the capability to upgrade the system with the most recent technology advancements by encapsulation and modularity. The implementation of a low-power, widely available Real Time Operating System (RTOS) for multitasking, with improvements in code maintenance, portability, and reuse, will be presented. Finally, future design enhancements and technology trends affecting wireless sensor networks will be presented.

  3. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  4. CRAC2 model description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.

  5. [The quality of patient care under the German DRG system using as example the inguinal hernia repair].

    PubMed

    Rudroff, C; Schweins, M; Heiss, M M

    2008-02-01

    The DRG system in Germany was introduced to improve and at the same time simplify the reimbursement of costs in German hospitals. Cost effectiveness and economic efficiency were the declared goals. Structural changes and increased competition among different hospitals were the consequences. The effect on the quality of patient care has been discussed with some concern. Furthermore, doubts have been expressed about the correct representation of the various diagnoses and treatments in the coding system and the financial revenue. Inguinal hernia repair serves as an example to illustrate some common problems with the reimbursement in the DRG system. Virtual patients were grouped using a "Web Grouper" and analysed using the cost accounting from the G-DRG-Browser of the InEK. Additionally, the reimbursement for ambulant hernia repair was estimated. The DRG coding did not differentiate the various operative procedures for inguinal hernia repair. They all generated the same revenues. For example, the increased costs for bilateral inguinal hernia repair are not represented in the payment. Furthermore, no difference is made between primary and recurrent inguinal hernia. In the case of a short-term hospital stay, part of the revenue is retained. In the case of ambulatory treatment of inguinal hernia, the reimbursement falls far short of the actual costs. The ideal patient in the DRG system suffers from a primary inguinal hernia, undergoes an open hernia repair without mesh, and remains for 2-3 days in hospital. Minimally invasive procedures, repair of bilateral inguinal hernia and ambulant operation are far less profitable, if profitable at all. The current revenues for inguinal hernia repair require improvement and adjustment to reality in order to accomplish the goals at which the German DRG system aims.

  6. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of Systems Assessment Capability performs many functions.

  7. Transient dynamics capability at Sandia National Laboratories

    NASA Technical Reports Server (NTRS)

    Attaway, Steven W.; Biffle, Johnny H.; Sjaardema, G. D.; Heinstein, M. W.; Schoof, L. A.

    1993-01-01

    A brief overview of the transient dynamics capabilities at Sandia National Laboratories, with an emphasis on recent new developments and current research, is presented. In addition, the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS), which is a collection of structural and thermal codes and utilities used by analysts at SNL, is described. The SEACAS system includes pre- and post-processing codes, analysis codes, database translation codes, support libraries, Unix shell scripts for execution, and an installation system. SEACAS is used at SNL on a daily basis as a production, research, and development system for the engineering analysts and code developers. Over the past year, approximately 190 days of CPU time were used by SEACAS codes on jobs running from a few seconds up to two and one-half days of CPU time. SEACAS is running on several different systems at SNL including Cray Unicos, Hewlett Packard HP-UX, Digital Equipment Ultrix, and Sun SunOS. An overview of SEACAS, including a short description of the codes in the system, is presented. Abstracts and references for the codes are listed at the end of the report.

  8. Mathematical fundamentals for the noise immunity of the genetic code.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2018-02-01

    Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are also some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular bases - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows the genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error-detection and error-correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different quantities of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of degeneracy of the genetic code are essential and give evidence of substantial advantages of the natural code over other possible ones. In the present chapter we will present a recent approach to explain the degeneracy of the genetic code by algorithmic methods from bioinformatics, and discuss its biological consequences. The biologists recognised this problem immediately after the detection of the non-overlapping structure of the genetic code, i.e., coding sequences are to be read in a unique way determined by their reading frame. But how does the reading head of the ribosome recognise an error in the grouping of codons, caused by e.g. insertion or deletion of a base, which can be fatal during the translation process and may result in nonfunctional proteins? In this chapter we will discuss possible solutions to the frameshift problem with a focus on the theory of so-called circular codes that were discovered in large gene populations of prokaryotes and eukaryotes in the early 90s. Circular codes allow the detection of a frameshift of one or two positions, and recently a beautiful theory of such codes has been developed using statistics, group theory and graph theory. Copyright © 2017 Elsevier B.V. All rights reserved.
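
    The frameshift-detection idea behind circular codes can be illustrated with a toy codon set (invented for this sketch, not a verified circular code and not the natural 20-codon code): a reading frame is accepted only when every codon in it belongs to the set, so a one-base insertion or deletion shows up as a change in the accepted frame.

        # Toy frameshift detection: which reading-frame shifts consist only of code words?
        TOY_CODE = {"AAC", "ACG", "GAT"}          # illustrative codon set only

        def frames_in_code(seq, code=TOY_CODE):
            ok = []
            for shift in range(3):
                codons = [seq[i:i + 3] for i in range(shift, len(seq) - 2, 3)]
                if codons and all(c in code for c in codons):
                    ok.append(shift)
            return ok

        msg = "AACACGGAT"                       # concatenation of code words, read in frame 0
        print(frames_in_code(msg))              # [0]  -> shifted frames are rejected
        print(frames_in_code("G" + msg[:-1]))   # [1]  -> a one-base insertion is detected as a shift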

  9. Technology and medication errors: impact in nursing homes.

    PubMed

    Baril, Chantal; Gascon, Viviane; St-Pierre, Liette; Lagacé, Denis

    2014-01-01

    The purpose of this paper is to study the impact of a medication distribution technology (MDT) on medication errors reported in public nursing homes in Québec Province. The work was carried out in six nursing homes (800 patients). Medication error data were collected from nursing staff through a voluntary reporting process before and after MDT was implemented. The errors were analysed by: total errors; medication error type; severity; and patient consequences. A statistical analysis verified whether there was a significant difference between the variables before and after introducing MDT. The results show that the MDT detected medication errors. The authors' analysis also indicates that errors are detected more rapidly, resulting in less severe consequences for patients. MDT is a step towards safer and more efficient medication processes. Our findings should convince healthcare administrators to implement technology such as electronic prescriber or bar code medication administration systems to improve medication processes and to provide better healthcare to patients. Few studies have been carried out in long-term healthcare facilities such as nursing homes. The authors' study extends what is known about MDT's impact on medication errors in nursing homes.

  10. The application of coded excitation technology in medical ultrasonic Doppler imaging

    NASA Astrophysics Data System (ADS)

    Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin

    2008-03-01

    Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. The application of coded excitation technology in a medical ultrasonic Doppler imaging system has the potential for higher SNR and deeper penetration depth than a conventional pulse-echo imaging system; it also improves image quality and enhances the sensitivity to feeble signals. Furthermore, a properly chosen coded excitation is beneficial to the received spectrum of the Doppler signal. Firstly, this paper analyzes the application of coded excitation technology in medical ultrasonic Doppler imaging systems in general terms, showing the advantages and bright future of coded excitation technology, and then introduces the principle and theory of coded excitation. Secondly, we compare several coded sequences (including Chirp and fake Chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio and the sensitivity of the Doppler signal, we choose Barker codes as the coded sequence. At last, we design the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement coincided with our expectations, demonstrating the advantage of applying coded excitation technology in the Digital Medical Ultrasonic Doppler Endoscope Imaging System.
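
    The choice of Barker codes can be motivated with a quick check of their autocorrelation: the length-13 Barker sequence compresses to a peak of 13 with all range sidelobes at magnitude one, which is what keeps the range sidelobe level low after pulse compression. The short sketch below verifies this property; it is illustrative only and is not part of the system's excitation circuit.

        # Autocorrelation of the length-13 Barker code: mainlobe 13, peak sidelobe magnitude 1.
        import numpy as np

        barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])
        acf = np.correlate(barker13, barker13, mode="full")
        print(acf.max())                               # 13 (mainlobe)
        print(np.abs(acf[acf != acf.max()]).max())     # 1  (peak range sidelobe level)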

  11. Accuracy and time requirements of a bar-code inventory system for medical supplies.

    PubMed

    Hanson, L B; Weinswig, M H; De Muth, J E

    1988-02-01

    The effects of implementing a bar-code system for issuing medical supplies to nursing units at a university teaching hospital were evaluated. Data on the time required to issue medical supplies to three nursing units at a 480-bed, tertiary-care teaching hospital were collected (1) before the bar-code system was implemented (i.e., when the manual system was in use), (2) one month after implementation, and (3) four months after implementation. At the same times, the accuracy of the central supply perpetual inventory was monitored using 15 selected items. One-way analysis of variance tests were done to determine any significant differences between the bar-code and manual systems. Using the bar-code system took longer than using the manual system because of a significant difference in the time required for order entry into the computer. Multiple-use requirements of the central supply computer system made entering bar-code data a much slower process. There was, however, a significant improvement in the accuracy of the perpetual inventory. Using the bar-code system for issuing medical supplies to the nursing units takes longer than using the manual system. However, the accuracy of the perpetual inventory was significantly improved with the implementation of the bar-code system.

  12. A web-based clinical trial management system for a sham-controlled multicenter clinical trial in depression.

    PubMed

    Durkalski, Valerie; Wenle Zhao; Dillon, Catherine; Kim, Jaemyung

    2010-04-01

    Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need, but what distinguishes one system from another are user needs/requirements and cost. This paper illustrates the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation versus sham for the treatment of patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe the approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow for the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient for addressing project-specific issues than the effort of creating a generic code application. As a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice of a commercially available system or an internally developed system is determined by the requirements of the study and users. Pros and cons of both approaches are discussed. If the intention is to use the system for various trials (single and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.

  13. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding System Code...

  14. 42 CFR 414.502 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... laboratory test for which a new or substantially revised Healthcare Common Procedure Coding System Code is assigned on or after January 1, 2005. Substantially Revised Healthcare Common Procedure Coding System Code...

  15. Channel coding for underwater acoustic single-carrier CDMA communication system

    NASA Astrophysics Data System (ADS)

    Liu, Lanjun; Zhang, Yonglei; Zhang, Pengcheng; Zhou, Lin; Niu, Jiong

    2017-01-01

    CDMA is an effective multiple access protocol for underwater acoustic networks, and channel coding can effectively reduce the bit error rate (BER) of an underwater acoustic communication system. To meet the requirements of underwater acoustic mobile networks based on CDMA, an underwater acoustic single-carrier CDMA communication system (UWA/SCCDMA) based on the direct-sequence spread spectrum is proposed, and its channel coding scheme is studied based on convolutional, RA, Turbo and LDPC coding, respectively. The implementation steps of the Viterbi algorithm for convolutional coding, the BP and minimum-sum algorithms for RA coding, the Log-MAP and SOVA algorithms for Turbo coding, and the sum-product algorithm for LDPC coding are given. A Matlab-based UWA/SCCDMA simulation system is designed. Simulation results show that the UWA/SCCDMA systems based on RA, Turbo and LDPC coding perform well, with a BER below 10^-6 in an underwater acoustic channel at low signal-to-noise ratios (SNR) from -12 dB to -10 dB, which is about two orders of magnitude lower than that of convolutional coding. The system based on Turbo coding with the Log-MAP algorithm has the best performance.
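
    For readers unfamiliar with the decoders listed above, the sketch below implements the simplest of them, hard-decision Viterbi decoding of a rate-1/2 convolutional code over a binary symmetric channel, in Python. The code parameters (constraint length 3, generators 7 and 5 octal) and the 5% crossover probability are illustrative assumptions and do not reproduce the paper's underwater acoustic channel or its RA/Turbo/LDPC decoders.

        # Sketch: rate-1/2 convolutional encoder and hard-decision Viterbi decoder
        # over a binary symmetric channel (BSC).
        import random

        def encode(bits):
            m1 = m2 = 0
            out = []
            for b in bits + [0, 0]:          # two tail bits to flush the encoder
                out += [b ^ m1 ^ m2, b ^ m2]  # generators 7 (111) and 5 (101) octal
                m1, m2 = b, m1
            return out

        def branch_output(state, b):
            m1, m2 = (state >> 1) & 1, state & 1
            return [b ^ m1 ^ m2, b ^ m2]

        def viterbi(rx):
            INF = 10 ** 9
            metric = [0] + [INF] * 3          # start in the all-zero state
            paths = [[] for _ in range(4)]
            for i in range(0, len(rx), 2):
                new_metric = [INF] * 4
                new_paths = [None] * 4
                for s in range(4):
                    if metric[s] == INF:
                        continue
                    for b in (0, 1):
                        ns = (b << 1) | ((s >> 1) & 1)
                        exp = branch_output(s, b)
                        cost = metric[s] + (exp[0] != rx[i]) + (exp[1] != rx[i + 1])
                        if cost < new_metric[ns]:
                            new_metric[ns] = cost
                            new_paths[ns] = paths[s] + [b]
                metric, paths = new_metric, new_paths
            best = min(range(4), key=lambda s: metric[s])
            return paths[best][:-2]           # drop the tail bits

        random.seed(1)
        msg = [random.randint(0, 1) for _ in range(200)]
        tx = encode(msg)
        rx = [b ^ (random.random() < 0.05) for b in tx]   # BSC, 5% crossover
        decoded = viterbi(rx)
        print("bit errors after decoding:", sum(a != b for a, b in zip(msg, decoded)))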

  16. Advanced transportation system studies. Alternate propulsion subsystem concepts: Propulsion database

    NASA Technical Reports Server (NTRS)

    Levack, Daniel

    1993-01-01

    The Advanced Transportation System Studies alternate propulsion subsystem concepts propulsion database interim report is presented. The objective of the database development task is to produce a propulsion database which is easy to use and modify while also being comprehensive in the level of detail available. The database is to be available on the Macintosh computer system. The task is to extend across all three years of the contract. Consequently, a significant fraction of the effort in this first year of the task was devoted to the development of the database structure to ensure a robust base for the following years' efforts. Nonetheless, significant point design propulsion system descriptions and parametric models were also produced. Each of the two propulsion databases, the parametric propulsion database and the propulsion system database, is described. The descriptions include a user's guide to each code, write-ups for models used, and sample output. The parametric database has models for LOX/H2 and LOX/RP liquid engines, solid rocket boosters using three different propellants, a hybrid rocket booster, and a NERVA-derived nuclear thermal rocket engine.

  17. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    PubMed

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.

  18. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  19. NOTE: MCDE: a new Monte Carlo dose engine for IMRT

    NASA Astrophysics Data System (ADS)

    Reynaert, N.; DeSmedt, B.; Coghe, M.; Paelinck, L.; Van Duyse, B.; DeGersem, W.; DeWagter, C.; DeNeve, W.; Thierens, H.

    2004-07-01

    A new accurate Monte Carlo code for IMRT dose computations, MCDE (Monte Carlo dose engine), is introduced. MCDE is based on BEAMnrc/DOSXYZnrc and consequently on the accurate EGSnrc electron transport. DOSXYZnrc is reprogrammed as a component module for BEAMnrc. In this way both codes are interconnected elegantly, while maintaining the BEAM structure, and only minimal changes to BEAMnrc.mortran are necessary. The treatment head of the Elekta SLiplus linear accelerator is modelled in detail. CT grids consisting of up to 200 slices of 512 × 512 voxels can be introduced and up to 100 beams can be handled simultaneously. The beams and CT data are imported from the treatment planning system GRATIS via a DICOM interface. To enable the handling of up to 50 × 10^6 voxels, the system was programmed in Fortran 95 with dynamic memory management. All region-dependent arrays (dose, statistics, transport arrays) were redefined. A scoring grid was introduced and superimposed on the geometry grid, to be able to limit the number of scoring voxels. The whole system uses approximately 200 MB of RAM and runs on a PC cluster consisting of 38 1.0 GHz processors. A set of in-house scripts handles the parallelization and the centralization of the Monte Carlo calculations on a server. As an illustration of MCDE, a clinical example is discussed and compared with collapsed cone convolution calculations. At present, the system is still rather slow and is intended to be a tool for reliable verification of IMRT treatment planning in the presence of tissue inhomogeneities such as air cavities.

  20. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off the shelf system that analyzes programs written in C, C++ or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study is focused on large scale software developed using formal processes. The systems studied are mission critical in nature but some use commodity computer systems.

  1. Trellis coded multilevel DPSK system with doppler correction for mobile satellite channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Simon, Marvin K. (Inventor)

    1991-01-01

    A trellis coded multilevel differential phase shift keyed mobile communication system. The system of the present invention includes a trellis encoder for translating input signals into trellis codes; a differential encoder for differentially encoding the trellis coded signals; a transmitter for transmitting the differentially encoded trellis coded signals; a receiver for receiving the transmitted signals; a differential demodulator for demodulating the received differentially encoded trellis coded signals; and a trellis decoder for decoding the differentially demodulated signals.
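
    The differential encoding/demodulation stage named in the abstract can be illustrated in a few lines: phase information is carried in the difference between consecutive symbols, so a constant phase rotation (for example a residual Doppler offset) cancels out. The Python sketch below shows plain DQPSK only; the trellis coding and the patented Doppler-correction details are not reproduced.

        # Sketch: differential QPSK encoding and differential demodulation.
        import numpy as np

        def dqpsk_modulate(dibits):
            """Map each dibit to a phase increment and accumulate it."""
            inc = {0: 0.0, 1: np.pi / 2, 2: np.pi, 3: 3 * np.pi / 2}
            phase = 0.0
            out = []
            for d in dibits:
                phase += inc[d]
                out.append(np.exp(1j * phase))
            return np.array(out)

        def dqpsk_demodulate(rx):
            """Recover dibits from phase differences between consecutive symbols."""
            prev = 1.0 + 0j                      # reference symbol (phase 0)
            dibits = []
            for s in rx:
                dphi = np.angle(s * np.conj(prev)) % (2 * np.pi)
                dibits.append(int(round(dphi / (np.pi / 2))) % 4)
                prev = s
            return dibits

        data = [0, 1, 3, 2, 2, 0, 1, 3]
        tx = dqpsk_modulate(data)
        rx = tx * np.exp(1j * 0.5)               # unknown constant phase rotation
        print(dqpsk_demodulate(rx) == data)      # True: the rotation cancels out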

  2. Identification of a Novel GJA8 (Cx50) Point Mutation Causes Human Dominant Congenital Cataracts

    NASA Astrophysics Data System (ADS)

    Ge, Xiang-Lian; Zhang, Yilan; Wu, Yaming; Lv, Jineng; Zhang, Wei; Jin, Zi-Bing; Qu, Jia; Gu, Feng

    2014-02-01

    Hereditary cataracts are clinically and genetically heterogeneous lens diseases that cause a significant proportion of visual impairment and blindness in children. Human cataracts have been linked with mutations in two connexin genes, GJA3 and GJA8. To identify the causative mutation in a family with hereditary cataracts, family members were screened for mutations in both genes by PCR. Sequencing the coding region of GJA8, which codes for connexin 50, revealed a C > A transversion at nucleotide 264, which causes the p.P88T mutation. To dissect the molecular consequences of this mutation, plasmids carrying wild-type and mutant mouse ORFs of Gja8 were generated and ectopically expressed in HEK293 cells and human lens epithelial cells, respectively. The recombinant proteins were assessed by confocal microscopy and Western blotting. The results demonstrate that the molecular consequences of the p.P88T mutation in GJA8 include changes in connexin 50 protein localization patterns, accumulation of mutant protein, and increased cell growth.

  3. Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) share the same nomenclatural type (ATCC 13048) on the Approved Lists and are homotypic synonyms, with consequences for the name Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980).

    PubMed

    Tindall, B J; Sutton, G; Garrity, G M

    2017-02-01

    Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) were placed on the Approved Lists of Bacterial Names and were based on the same nomenclatural type, ATCC 13048. Consequently they are to be treated as homotypic synonyms. However, the names of homotypic synonyms at the rank of species normally are based on the same epithet. Examination of the Rules of the International Code of Nomenclature of Bacteria in force at the time indicates that the epithet mobilis in Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) was illegitimate at the time the Approved Lists were published and according to the Rules of the current International Code of Nomenclature of Prokaryotes continues to be illegitimate.

  4. Electronic patient registration and tracking at mass vaccination clinics: a clinical study.

    PubMed

    Billittier, Anthony J; Lupiani, Patrick; Masterson, Gary; Masterson, Tim; Zak, Christopher

    2003-01-01

    To protect the citizens of the United States from the use of dangerous biological agents, the Centers for Disease Control and Prevention (CDC) has been actively preparing to deal with the consequences of such an attack. Its plans include the deployment of mass immunization clinics to handle postevent vaccinations. As part of the planning efforts by the Western New York Public Health Alliance, a Web-based electronic patient registration and tracking system was developed and tested at a recent trial smallpox vaccination clinic. Initial goals were to determine the pitfalls and benefits of using such a system in comparison to other methods of data collection. This exercise demonstrated that use of an electronic system capable of scanning two-dimensional bar codes was superior to both paper-based and optical character recognition (OCR) methods of data collection and management. Major improvements in speed and/or accuracy were evident in all areas of the clinic, especially in patient registration, vaccine tracking and postclinic data analysis.

  5. First-order aerodynamic and aeroelastic behavior of a single-blade installation setup

    NASA Astrophysics Data System (ADS)

    Gaunaa, M.; Bergami, L.; Guntur, S.; Zahle, F.

    2014-06-01

    Limitations on the wind speed at which blade installation can be performed bears important financial consequences. The installation cost of a wind farm could be significantly reduced by increasing the wind speed at which blade mounting operations can be carried out. This work characterizes the first-order aerodynamic and aeroelastic behavior of a single blade installation system, where the blade is grabbed by a yoke, which is lifted by the crane and stabilized by two taglines. A simple engineering model is formulated to describe the aerodynamic forcing on the blade subject to turbulent wind of arbitrary direction. The model is coupled with a schematic aeroelastic representation of the taglines system, which returns the minimum line tension required to compensate for the aerodynamic forcing. The simplified models are in excellent agreement with the aeroelastic code HAWC2, and provide a solid basis for future design of an upgraded single blade installation system able to operate at higher wind speeds.

  6. A Discussion of Using a Reconfigurable Processor to Implement the Discrete Fourier Transform

    NASA Technical Reports Server (NTRS)

    White, Michael J.

    2004-01-01

    This paper presents the design and implementation of the Discrete Fourier Transform (DFT) algorithm on a reconfigurable processor system. While highly applicable to many engineering problems, the DFT is an extremely computationally intensive algorithm. Consequently, the eventual goal of this work is to enhance the execution of a floating-point precision DFT algorithm by offloading the algorithm from the computing system. This computing system, within the context of this research, is a typical high-performance desktop computer with an array of field programmable gate arrays (FPGAs). FPGAs are hardware devices that are configured by software to execute an algorithm. If it is desired to change the algorithm, the software is changed to reflect the modification and then downloaded to the FPGA, which is then itself modified. This paper discusses the methodology for developing the DFT algorithm to be implemented on the FPGA. We will discuss the algorithm, the FPGA code effort, and the results to date.
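
    The DFT itself is compact enough to state directly from its definition, which is what makes its O(N^2) multiply-accumulate structure a natural candidate for FPGA offloading. The following Python sketch is a generic reference implementation checked against numpy's FFT; it is not the FPGA code discussed in the paper.

        # Sketch: the O(N^2) DFT written out from its definition,
        # X[k] = sum_n x[n] * exp(-2j*pi*k*n/N), checked against numpy's FFT.
        import numpy as np

        def dft(x):
            N = len(x)
            n = np.arange(N)
            W = np.exp(-2j * np.pi * np.outer(n, n) / N)   # N x N twiddle-factor matrix
            return W @ x

        x = np.random.default_rng(0).standard_normal(64)
        print(np.allclose(dft(x), np.fft.fft(x)))           # True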

  7. Bridging the gap between the clinician and the patient with cryopyrin-associated periodic syndromes.

    PubMed

    Cantarini, L; Lucherini, O M; Frediani, B; Brizi, M G; Bartolomei, B; Cimaz, R; Galeazzi, M; Rigante, D

    2011-01-01

    Cryopyrin-associated periodic syndromes are categorized as a spectrum of three autoinflammatory diseases, namely familial cold auto-inflammatory syndrome, Muckle-Wells syndrome and chronic infantile neurological cutaneous articular syndrome. All are caused by mutations in the NLRP3 gene coding for cryopyrin and result in active interleukin-1 release: their rarity and shared clinical indicators involving skin, joints, central nervous system and eyes often mean that correct diagnosis is delayed. Onset occurs early in childhood, and life-long therapy with interleukin-1 blocking agents usually leads to tangible clinical remission and inflammatory marker normalization in a large number of patients, justifying the need to facilitate early diagnosis and thus avoid irreversible negative consequences for tissues and organs.

  8. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.

  9. Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is the model-based nature of the RA. Since the RA is model-based, most of its behavior is not hard coded into procedural program code. Instead, engineers specify high-level models of the spacecraft's components from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high-level, modular, and declarative software description allowed some interfaces between RA components and between the RA and the flight software to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. In addition, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft. One does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA it is more time-consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.

  10. SU-D-BRD-03: A Gateway for GPU Computing in Cancer Radiotherapy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, X; Folkerts, M; Shi, F

    Purpose: The Graphics Processing Unit (GPU) has become increasingly important in radiotherapy. However, it is still difficult for general clinical researchers to access GPU codes developed by other researchers, and for developers to objectively benchmark their codes. Moreover, repeated effort is often spent on developing low-quality GPU codes. The goal of this project is to establish an infrastructure for testing GPU codes, cross-comparing them, and facilitating code distribution in the radiotherapy community. Methods: We developed a system called Gateway for GPU Computing in Cancer Radiotherapy Research (GCR2). A number of GPU codes developed by our group and other developers can be accessed via a web interface. To use the services, researchers first upload their test data or use the standard data provided by our system. Then they can select the GPU device on which the code will be executed. Our system offers all mainstream GPU hardware for code benchmarking purposes. After the code run is complete, the system automatically summarizes and displays the computing results. We also released an SDK to allow developers to build their own algorithm implementations and submit their binary codes to the system. The submitted code is then systematically benchmarked using a variety of GPU hardware and representative data provided by our system. The developers can also compare their codes with others and generate benchmarking reports. Results: The developed system is fully functional. Through a user-friendly web interface, researchers are able to test various GPU codes. Developers also benefit from this platform by comprehensively benchmarking their codes on various GPU platforms and representative clinical data sets. Conclusion: We have developed an open platform allowing clinical researchers and developers to access GPUs and GPU codes. This development will facilitate the utilization of GPUs in the radiation therapy field.

  11. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    NASA Astrophysics Data System (ADS)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new two-dimensional spectral/spatial code family, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW) and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.

  12. Performance of data-compression codes in channels with errors. Final report, October 1986-January 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-10-01

    Huffman codes, comma-free codes, and block codes with shift indicators are important candidate message-compression codes for improving the efficiency of communications systems. This study was undertaken to determine whether these codes could be used to increase the throughput of the fixed very-low-frequency (FVLF) communication system. This application involves the use of compression codes in a channel with errors.
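
    The sensitivity of variable-length compression codes to channel errors, the issue this study addresses, is easy to demonstrate: the Python sketch below builds a Huffman code, flips one transmitted bit, and decodes, showing how a single error can desynchronize the codeword boundaries. It is a generic illustration, not the report's analysis of the FVLF system.

        # Sketch: minimal Huffman coder plus a one-bit channel error to show
        # error propagation in a variable-length code.
        import heapq
        from collections import Counter

        def huffman_code(text):
            heap = [[w, i, {c: ""}] for i, (c, w) in enumerate(Counter(text).items())]
            heapq.heapify(heap)
            i = len(heap)
            while len(heap) > 1:
                w1, _, c1 = heapq.heappop(heap)
                w2, _, c2 = heapq.heappop(heap)
                merged = {c: "0" + code for c, code in c1.items()}
                merged.update({c: "1" + code for c, code in c2.items()})
                heapq.heappush(heap, [w1 + w2, i, merged])
                i += 1
            return heap[0][2]

        def decode(bits, book):
            rev, out, cur = {v: k for k, v in book.items()}, [], ""
            for b in bits:
                cur += b
                if cur in rev:
                    out.append(rev[cur])
                    cur = ""
            return "".join(out)

        msg = "send reinforcements we are going to advance"
        book = huffman_code(msg)
        bits = "".join(book[c] for c in msg)

        corrupted = bits[:20] + ("1" if bits[20] == "0" else "0") + bits[21:]
        print(decode(bits, book))        # exact reconstruction
        print(decode(corrupted, book))   # typically garbled after the flipped bit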

  13. The adaption and use of research codes for performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebetrau, A.M.

    1987-05-01

    Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.

  14. The application of LDPC code in MIMO-OFDM system

    NASA Astrophysics Data System (ADS)

    Liu, Ruian; Zeng, Beibei; Chen, Tingting; Liu, Nan; Yin, Ninghao

    2018-03-01

    The combination of MIMO and OFDM technology has become one of the key technologies of fourth-generation mobile communication, which can overcome the frequency-selective fading of the wireless channel, increase the system capacity and improve the frequency utilization. Error-correcting coding introduced into the system can further improve its performance. The LDPC (low-density parity-check) code is a kind of error-correcting code which can improve system reliability and anti-interference ability, and its decoding is simple and easy to implement. This paper mainly discusses the application of the LDPC code in the MIMO-OFDM system.

  15. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  16. Thermal hydraulic-severe accident code interfaces for SCDAP/RELAP5/MOD3.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coryell, E.W.; Siefken, L.J.; Harvego, E.A.

    1997-07-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The code is the result of merging the RELAP5, SCDAP, and COUPLE codes. The RELAP5 portion of the code calculates the overall reactor coolant system thermal-hydraulics and associated reactor system responses. The SCDAP portion of the code describes the response of the core and associated vessel structures. The COUPLE portion of the code describes the response of lower plenum structures and debris and the failure of the lower head. The code uses a modular approach with the overall structure, input/output processing, and data structures following the pattern established for RELAP5. The code uses a building block approach to allow the code user to easily represent a wide variety of systems and conditions through a powerful input processor. The user can represent a wide variety of experiments or reactor designs by selecting fuel rods and other assembly structures from a range of representative core component models, and arrange them in a variety of patterns within the thermal-hydraulic network. The COUPLE portion of the code uses two-dimensional representations of the lower plenum structures and debris beds. The flow of information between the different portions of the code occurs at each system-level time step advancement. The RELAP5 portion of the code describes the fluid transport around the system. These fluid conditions are used as thermal and mass transport boundary conditions for the SCDAP and COUPLE structures and debris beds.

  17. Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise

    NASA Astrophysics Data System (ADS)

    Ahmed, Israa Sh.; Aljunid, Syed A.; Nordin, Junita M.; Dulaimi, Layth A. Khalil Al; Matem, Rima

    2017-11-01

    In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are improved significantly by using code correlation properties. These codes can be used in synchronous optical CDMA systems for multiple access interference cancellation and maximum suppression of the phase-induced intensity noise. The low Psr is due to the reduction of interference noise induced by the 2-D MDW code PIIN suppression. A high data rate increases the BER, requires high effective power and severely deteriorates the system performance. The 2-D W/T MDW code has excellent system performance, where the value of PIIN is suppressed as low as possible at the optimum Psr with a high data bit rate. The 2-D MDW code shows better tolerance to PIIN in comparison to others, with enhanced system performance. We prove by numerical analysis that PIIN is maximally suppressed by the MDW code through the minimizing property of cross-correlation, in comparison to the 2-D PDC and 2-D MQC OCDMA code scheme systems.

  18. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
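
    The orthogonality that Walsh-type codes provide can be shown with the standard Sylvester construction, sketched below in Python. The unipolar transposed modified Walsh code (TMWC) of the paper is a specific derived construction and is not reproduced here; the sketch only illustrates the underlying Walsh-Hadamard code family and its zero cross-correlation.

        # Sketch: Walsh-Hadamard codes via the Sylvester recursion and a check
        # of their mutual orthogonality.
        import numpy as np

        def hadamard(n):
            """Sylvester construction: n must be a power of two."""
            H = np.array([[1]])
            while H.shape[0] < n:
                H = np.block([[H, H], [H, -H]])
            return H

        H = hadamard(8)                      # rows are 8 orthogonal Walsh codes
        print(H @ H.T)                       # 8 * identity: zero cross-correlation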

  19. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it discovered that these models are consequently highly complex, requiring a large number of parameters, not all of which can be easily (or at all) measured and/or identified and which are often associated with large uncertainties, and also requiring from their users deep knowledge of all or most of the implemented physical, mechanical, chemical and biological processes. The real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models are still an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  20. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    PubMed Central

    Młynarski, Wiktor

    2014-01-01

    To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644
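
    The linear efficient-coding transform referred to above, ICA, is available in standard libraries; the Python sketch below unmixes two synthetic one-dimensional sources with scikit-learn's FastICA. The binaural spectrogram data and the hierarchical extension used in the paper are not reproduced, so this is only an illustration of the transform itself.

        # Sketch: recovering two independent sources from linear mixtures with FastICA.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 8, 2000)
        sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]     # two independent sources
        mixing = np.array([[1.0, 0.5], [0.4, 1.2]])
        observed = sources @ mixing.T                               # observed linear mixtures

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(observed)                     # estimated sources
        # cross-correlation between recovered components and the true sources
        print(np.abs(np.corrcoef(recovered.T, sources.T))[0:2, 2:4])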

  1. Preliminary results of consequence assessment of a hypothetical severe accident using Thai meteorological data

    NASA Astrophysics Data System (ADS)

    Silva, K.; Lawawirojwong, S.; Promping, J.

    2017-06-01

    Consequence assessment of a hypothetical severe accident is one of the important elements of the risk assessment of a nuclear power plant. It is widely known that meteorological conditions can significantly influence the outcome of such an assessment, since they determine the results of the calculation of the radionuclide environmental transport. This study aims to assess the impact of the meteorological conditions on the results of the consequence assessment. The consequence assessment code OSCAAR of the Japan Atomic Energy Agency (JAEA) is used for the assessment. The results of the consequence assessment using Thai meteorological data are compared with those using Japanese meteorological data. The Thai case has the following characteristics. Low wind speeds made the radionuclides concentrate near the release point compared to the Japanese case. Squalls induced peaks in the ground concentration distribution. The evacuated land area is larger than in the Japanese case, though the relocated land area is smaller, which is attributed to the concentration of the radionuclides near the release point.
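
    The inverse dependence of ground-level concentration on wind speed that drives the Thai-case behaviour can be illustrated with a textbook Gaussian plume expression. The Python sketch below is a generic illustration with assumed dispersion coefficients and release parameters; it is not the atmospheric transport model implemented in OSCAAR.

        # Sketch: ground-level centreline concentration from a generic Gaussian plume,
        # C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2)),
        # showing the 1/u dependence on wind speed u.
        import numpy as np

        def plume_centreline(Q, u, H, x):
            sigma_y = 0.08 * x ** 0.9          # assumed dispersion coefficients (m)
            sigma_z = 0.06 * x ** 0.9
            return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H ** 2 / (2 * sigma_z ** 2))

        x = np.array([500.0, 1000.0, 2000.0, 5000.0])    # downwind distance (m)
        for u in (1.0, 5.0):                              # low vs moderate wind speed (m/s)
            print(u, plume_centreline(Q=1.0e9, u=u, H=50.0, x=x))   # assumed 1 GBq/s release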

  2. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a) For...

  3. 10 CFR 434.99 - Explanation of numbering system for codes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Explanation of numbering system for codes. 434.99 Section 434.99 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS § 434.99 Explanation of numbering system for codes. (a) For...

  4. 48 CFR 452.219-70 - Size Standard and NAICS Code Information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Size Standard and NAICS Code Information. 452.219-70 Section 452.219-70 Federal Acquisition Regulations System DEPARTMENT OF... System Code(s) and business size standard(s) describing the products and/or services to be acquired under...

  5. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to the bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  6. How To Keep Your Schools Safe and Secure.

    ERIC Educational Resources Information Center

    Gilbert, Christopher B.

    1996-01-01

    Discusses unforeseen costs (including potential litigation expenses), benefits, and consequences of adopting security measures (such as metal detectors, drug dogs, security cameras, campus police, dress codes, crime watch programs, and communication devices) to counter on-campus violence and gang activity. High-tech gadgetry alone is insufficient.…

  7. The Revised 2010 Ethical Standards for School Counselors

    ERIC Educational Resources Information Center

    Huey, Wayne C.

    2011-01-01

    The American School Counselor Association (ASCA) recently revised its ethical code for professional school counselors, the "Ethical Standards for School Counselors," in 2010. Professional school counselors have a unique challenge in counseling minors in that they provide services in an educational setting. Consequently, school counselors not only…

  8. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time with the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.

  9. A novel quantum LSB-based steganography method using the Gray code for colored quantum images

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Farzadnia, Ehsan

    2017-10-01

    As one of the prevalent data-hiding techniques, steganography is defined as the act of concealing secret information imperceptibly in a cover multimedia object (text, image, video or audio) so that interaction between the sender and the receiver is possible while nobody except the receiver can recover the secret data. In this approach, a quantum LSB-based steganography method utilizing the Gray code for quantum RGB images is investigated. The method uses the Gray code to accommodate two secret qubits in the 3 LSBs of each pixel simultaneously according to reference tables. Experimental results, analysed in the MATLAB environment, show that the present scheme performs well and is also more secure and applicable than the previous one currently found in the literature.
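
    A classical (non-quantum) analogue of the embedding idea is sketched below in Python: secret two-bit groups are Gray-mapped before being written into the least significant bits of each pixel. The reference tables and the quantum-circuit construction of the paper are not reproduced; the pixel values and payload are arbitrary illustrative data.

        # Sketch: LSB embedding with a Gray-code mapping of the secret dibits.
        import numpy as np

        GRAY = [0b00, 0b01, 0b11, 0b10]                  # 2-bit binary -> Gray code
        INV = {g: i for i, g in enumerate(GRAY)}

        def embed(pixels, secret_dibits):
            stego = pixels.copy()
            for k, d in enumerate(secret_dibits):
                stego[k] = (int(stego[k]) & 0xFC) | GRAY[d]   # overwrite the two LSBs
            return stego

        def extract(stego, n):
            return [INV[int(p) & 0b11] for p in stego[:n]]

        cover = np.array([120, 37, 201, 64, 90, 175], dtype=np.uint8)   # toy pixel values
        secret = [3, 0, 2, 1]
        stego = embed(cover, secret)
        print(extract(stego, len(secret)) == secret)           # True
        print(np.abs(stego.astype(int) - cover.astype(int)))   # per-pixel distortion <= 3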

  10. Solutions to Three-Dimensional Thin-Layer Navier-Stokes Equations in Rotating Coordinates for Flow Through Turbomachinery

    NASA Technical Reports Server (NTRS)

    Ghosh, Amrit Raj

    1996-01-01

    The viscous Navier-Stokes solver for turbomachinery applications, MSUTC, has been modified to include the rotating frame formulation. The three-dimensional thin-layer Navier-Stokes equations have been cast in a rotating Cartesian frame, enabling the freezing of grid motion. This also allows the flow field associated with an isolated rotor to be viewed as a steady-state problem. Consequently, local time stepping can be used to accelerate convergence. The formulation is validated by running NASA's Rotor 67 as the test case. Results are compared between the rotating frame code and the absolute frame code. The use of the rotating frame approach greatly enhances the performance of the code with respect to savings in computing time, without degradation of the solution.

  11. The Maximal C³ Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses.

    PubMed

    Michel, Christian J

    2017-04-18

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in the reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C³ self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. That method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend this definition here to the gene level. This new statistical approach considers all genes, i.e., of large and small lengths, with the same weight when searching for the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes.

  12. Implementation of ASME Code, Section XI, Code Case N-770, on Alternative Examination Requirements for Class 1 Butt Welds Fabricated with Alloy 82/182

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Edmund J.; Anderson, Michael T.

    In May 2010, the NRC issued a proposed notice of rulemaking that includes a provision to add a new section to its rules to require licensees to implement ASME Code Case N-770, ‘‘Alternative Examination Requirements and Acceptance Standards for Class 1 PWR Piping and Vessel Nozzle Butt Welds Fabricated with UNS N06082 or UNS W86182 Weld Filler Material With or Without the Application of Listed Mitigation Activities, Section XI, Division 1,’’ with 15 conditions. Code Case N-770 contains baseline and inservice inspection (ISI) requirements for unmitigated butt welds fabricated with Alloy 82/182 material and preservice and ISI requirements for mitigated butt welds. The NRC stated that application of ASME Code Case N-770 is necessary because the inspections currently required by the ASME Code, Section XI, were not written to address stress corrosion cracking of Alloy 82/182 butt welds, and the safety consequences of inadequate inspections can be significant. The NRC expects to issue the final rule incorporating this code case into its regulations in the spring 2011 time frame. This paper discusses the new examination requirements, the conditions that the NRC is imposing, and the major concerns with implementation of the new Code Case.

  13. Auto-Regulatory RNA Editing Fine-Tunes mRNA Re-Coding and Complex Behaviour in Drosophila

    PubMed Central

    Savva, Yiannis A.; Jepson, James E.C; Sahin, Asli; Sugden, Arthur U.; Dorsky, Jacquelyn S.; Alpert, Lauren; Lawrence, Charles; Reenan, Robert A.

    2014-01-01

    Auto-regulatory feedback loops are a common molecular strategy used to optimize protein function. In Drosophila many mRNAs involved in neuro-transmission are re-coded at the RNA level by the RNA editing enzyme dADAR, leading to the incorporation of amino acids that are not directly encoded by the genome. dADAR also re-codes its own transcript, but the consequences of this auto-regulation in vivo are unclear. Here we show that hard-wiring or abolishing endogenous dADAR auto-regulation dramatically remodels the landscape of re-coding events in a site-specific manner. These molecular phenotypes correlate with altered localization of dADAR within the nuclear compartment. Furthermore, auto-editing exhibits sexually dimorphic patterns of spatial regulation and can be modified by abiotic environmental factors. Finally, we demonstrate that modifying dAdar auto-editing affects adaptive complex behaviors. Our results reveal the in vivo relevance of auto-regulatory control over post-transcriptional mRNA re-coding events in fine-tuning brain function and organismal behavior. PMID:22531175

  14. The dependence of frequency distributions on multiple meanings of words, codes and signs

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoyong; Minnhagen, Petter

    2018-01-01

    The dependence of the frequency distributions due to multiple meanings of words in a text is investigated by deleting letters. By coding the words with fewer letters the number of meanings per coded word increases. This increase is measured and used as an input in a predictive theory. For a text written in English, the word-frequency distribution is broad and fat-tailed, whereas if the words are only represented by their first letter the distribution becomes exponential. Both distributions are well predicted by the theory, as is the whole sequence obtained by consecutively representing the words by their first L = 6, 5, 4, 3, 2, 1 letters. Comparisons of texts written in Chinese characters and the same texts written in letter codes are made, and the similarity of the corresponding frequency distributions is interpreted as a consequence of the multiple meanings of Chinese characters. This further implies that the difference in shape between the word-frequency distribution for an English text written in letters and a Chinese text written in Chinese characters is due to the coding and not to the language per se.
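
    The letter-deletion experiment described above is straightforward to reproduce on any text. The Python sketch below truncates every word to its first L letters and reports how the number of distinct coded words and the top frequencies change; the file path is a placeholder for whichever text is analysed.

        # Sketch: code each word by its first L letters and recompute the
        # frequency distribution for decreasing L.
        from collections import Counter

        text = open("sample.txt", encoding="utf8").read().lower().split()   # placeholder path

        for L in (6, 3, 1):
            coded = Counter(w[:L] for w in text)
            freqs = sorted(coded.values(), reverse=True)
            print(f"L={L}: {len(coded)} distinct coded words, "
                  f"top-5 frequencies {freqs[:5]}")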

  15. Alternative Pre-mRNA Splicing in Mammals and Teleost Fish: An Effective Strategy for the Regulation of Immune Responses Against Pathogen Infection.

    PubMed

    Chang, Ming Xian; Zhang, Jie

    2017-07-15

    Pre-mRNA splicing is the process by which introns are removed and the protein coding elements assembled into mature mRNAs. Alternative pre-mRNA splicing provides an important source of transcriptome and proteome complexity through selectively joining different coding elements to form mRNAs, which encode proteins with similar or distinct functions. In mammals, previous studies have shown the role of alternative splicing in regulating the function of the immune system, especially in the regulation of T-cell activation and function. As lower vertebrates, teleost fish mainly rely on a large family of pattern recognition receptors (PRRs) to recognize pathogen-associated molecular patterns (PAMPs) from various invading pathogens. In this review, we summarize recent advances in our understanding of alternative splicing of piscine PRRs including peptidoglycan recognition proteins (PGRPs), nucleotide binding and oligomerization domain (NOD)-like receptors (NLRs), retinoic acid-inducible gene-I (RIG-I)-like receptors (RLRs) and their downstream signaling molecules, compared to splicing in mammals. We also discuss what is known and unknown about the function of splicing isoforms in the innate immune responses against pathogens infection in mammals and teleost fish. Finally, we highlight the consequences of alternative splicing in the innate immune system and give our view of important directions for future studies.

  16. Beam dynamic simulations of the CLIC crab cavity and implications on the BDS

    NASA Astrophysics Data System (ADS)

    Shinton, I. R. R.; Burt, G.; Glasman, C. J.; Jones, R. M.; Wolski, A.

    2011-11-01

    The Compact Linear Collider (CLIC) is a proposed electron positron linear collider design aiming to achieve a centre of mass energy of up to 3 TeV. The main accelerating structures in CLIC operate at an X-band frequency of 11.994 GHz with an accelerating gradient of 100 MV/m. The present design requires the beams to collide at a small crossing angle of 10 mrad per line, giving a resultant overall crossing angle of 20 mrad. Transverse deflecting cavities, referred to as "Crab cavities", are installed in the beam delivery system (BDS) of linear collider designs in order to ensure the final luminosity at the interaction point (IP) is comparable to that in a head-on collision. We utilise the beam tracking code PLACET combined with the beam-beam code GUINEA-PIG to calculate the resulting luminosity at the IP. We follow a similar tuning procedure to that used for the design of the ILC crab cavities and anti-crab cavities. However, an unexpected luminosity loss of 10% was observed for the 20 mrad design. It was discovered that the action of the crab cavities can affect the geometric aberrations resulting from the sextupoles used to correct chromatic effects in the beam delivery system. This has direct consequences regarding the design of the present CLIC BDS.

  17. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent, applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We will also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.
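
    The backplane idea described above is language-agnostic even though PCCS itself is written in Perl; the sketch below uses Python purely to illustrate the register-and-dispatch pattern the abstract implies, and the tool names and commands are hypothetical stand-ins rather than PCCS internals.

```python
import subprocess

# Registry mapping a tool name to the command line that runs it.
TOOLS = {}

def register(name, command):
    """Teach the backplane the minimal information needed to use a tool."""
    TOOLS[name] = command

def run_pipeline(stages, input_file):
    """Run the registered tools in order, passing the same input file to each."""
    for stage in stages:
        subprocess.run(TOOLS[stage] + [input_file], check=True)

# Hypothetical helper scripts and hydrocode binary.
register("preprocess", ["python", "prep.py"])
register("hydrocode", ["./hydro"])
register("postprocess", ["python", "post.py"])
run_pipeline(["preprocess", "hydrocode", "postprocess"], "run001.in")
```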

  18. BCH codes for large IC random-access memory systems

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.

    1983-01-01

    In this report some shortened BCH codes for possible applications to large IC random-access memory systems are presented. These codes are given by their parity-check matrices. Encoding and decoding of these codes are discussed.
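
    The report specifies its codes by their parity-check matrices; as a hedged, scaled-down illustration of how such a matrix drives decoding, the Python sketch below performs syndrome decoding with a Hamming(7,4) matrix standing in for the larger, multi-error-correcting shortened BCH codes the report presents.

```python
import numpy as np

# Parity-check matrix of a toy single-error-correcting Hamming(7,4) code:
# column i (1-indexed) is the binary representation of i.
H = np.array([[int(b) for b in format(col, "03b")] for col in range(1, 8)]).T

def syndrome(received):
    return tuple(H @ received % 2)

def correct_single_error(received):
    s = syndrome(received)
    if any(s):
        # A nonzero syndrome equals the column of H at the error position.
        for pos in range(H.shape[1]):
            if tuple(H[:, pos]) == s:
                received = received.copy()
                received[pos] ^= 1
                break
    return received

codeword = np.zeros(7, dtype=int)          # the all-zero word is always a codeword
noisy = codeword.copy()
noisy[4] ^= 1                              # single bit error
print(syndrome(noisy), correct_single_error(noisy))
```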

  19. High performance and cost effective CO-OFDM system aided by polar code.

    PubMed

    Liu, Ling; Xiao, Shilin; Fang, Jiafei; Zhang, Lu; Zhang, Yunhao; Bi, Meihua; Hu, Weisheng

    2017-02-06

    A novel polar coded coherent optical orthogonal frequency division multiplexing (CO-OFDM) system is proposed and demonstrated through experiment for the first time. The principle of a polar coded CO-OFDM signal is illustrated theoretically and the suitable polar decoding method is discussed. Results show that the polar coded CO-OFDM signal achieves a net coding gain (NCG) of more than 10 dB at a bit error rate (BER) of 10^-3 over 25-Gb/s 480-km transmission in comparison with conventional CO-OFDM. Also, compared to the 25-Gb/s low-density parity-check (LDPC) coded CO-OFDM 160-km system, the polar code provides a NCG of 0.88 dB at BER = 10^-3. Moreover, the polar code can relieve the laser linewidth requirement massively to get a more cost-effective CO-OFDM system.
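
    As a hedged illustration of the encoder side only (the frozen-bit selection and the successive-cancellation decoding used in the experiment are not reproduced here, and the information-bit positions below are arbitrary), the Python sketch applies the basic polar transform, the n-fold Kronecker power of F = [[1, 0], [1, 1]], to an 8-bit input.

```python
import numpy as np

def polar_encode(u, n):
    """Encode a length-2**n bit vector with the polar transform u * kron(F, ..., F) mod 2."""
    F = np.array([[1, 0], [1, 1]])
    G = np.array([[1]])
    for _ in range(n):
        G = np.kron(G, F)
    return (u @ G) % 2

n = 3                                   # block length N = 2**3 = 8
u = np.zeros(8, dtype=int)              # frozen bits stay 0
u[[3, 5, 6, 7]] = [1, 0, 1, 1]          # illustrative information-bit positions
print(polar_encode(u, n))
```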

  20. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with the analysis of a Spectral Amplitude Coding Optical Code Division Multiple Access (SACOCDMA) system. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. It is perceived that the number of users and the type of codes used for the optical system directly decide the performance of the system. MAI can be restricted by efficient design of optical codes and by implementing them with a unique architecture to accommodate a larger number of users. Hence, it is necessary to design a technique such as spectral direct detection (SDD) with a modified double-weight code, which can provide better cardinality and good correlation properties.

  1. Adaptive variable-length coding for efficient compression of spacecraft television data.

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Plaunt, J. R.

    1971-01-01

    An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
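
    The two ingredients described above, previous-sample prediction and per-block selection among a small set of codes, can be sketched as follows; the Golomb-Rice code family, the zig-zag residual mapping and the block contents are illustrative choices, not the flight algorithm's exact three-code set or 21-pixel blocking.

```python
def rice_encode(value, k):
    """Golomb-Rice code: unary quotient followed by a k-bit remainder (value >= 0)."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def zigzag(d):
    """Map signed prediction residuals to non-negative integers."""
    return 2 * d if d >= 0 else -2 * d - 1

def encode_block(samples, ks=(0, 1, 2)):
    """Predict each sample from the previous one, then pick, per block,
    whichever Rice parameter gives the shortest bit string."""
    residuals = [zigzag(s - p) for p, s in zip([0] + samples[:-1], samples)]
    candidates = {k: "".join(rice_encode(r, k) for r in residuals) for k in ks}
    best_k = min(candidates, key=lambda k: len(candidates[k]))
    return best_k, candidates[best_k]

block = [12, 13, 13, 15, 14, 14, 16, 15]
print(encode_block(block))
```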

  2. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    Work on partial unit memory codes continued; it was shown that for a given virtual state complexity, the maximum free distance over the class of all convolutional codes is achieved within the class of unit memory codes. The effect of phase-lock loop (PLL) tracking error on coding system performance was studied by using the channel cut-off rate as the measure of quality of a modulation system. Optimum modulation signal sets for a non-white Gaussian channel were considered using a heuristic selection rule based on a water-filling argument. The use of error correcting codes to perform data compression by the technique of syndrome source coding was researched, and a weight-and-error-locations scheme was developed that is closely related to LDSC coding.

  3. [Data coding in the Israeli healthcare system - do choices provide the answers to our system's needs?].

    PubMed

    Zelingher, Julian; Ash, Nachman

    2013-05-01

    The Israeli healthcare system has undergone major processes for the adoption of health information technologies (HIT), and enjoys high levels of utilization in hospital and ambulatory care. Coding is an essential infrastructure component of HIT, and its purpose is to represent data in a simplified and common format, enhancing its manipulation by digital systems. Proper coding of data enables efficient identification, storage, retrieval and communication of data. Utilization of uniform coding systems by different organizations enables data interoperability between them, facilitating communication and integrating data elements originating in different information systems from various organizations. Current needs in Israel for health data coding include recording and reporting of diagnoses for hospitalized patients, outpatients and visitors of the Emergency Department, coding of procedures and operations, coding of pathology findings, reporting of discharge diagnoses and causes of death, billing codes, organizational data warehouses and national registries. New national projects for clinical data integration, obligatory reporting of quality indicators and new Ministry of Health (MOH) requirements for HIT necessitate a high level of interoperability that can be achieved only through the adoption of uniform coding. Additional pressures were introduced by the USA decision to stop the maintenance of the ICD-9-CM codes that are also used by Israeli healthcare, and the adoption of ICD-10-CM and ICD-10-PCS as the main coding system for billing purposes. The USA has also mandated utilization of SNOMED-CT as the coding terminology for the Electronic Health Record problem list, and for reporting quality indicators to the CMS. Hence, the Israeli MOH has recently decided that discharge diagnoses will be reported using ICD-10-CM codes, and SNOMED-CT will be used to code the clinical information in the EHR. We reviewed the characteristics, strengths and weaknesses of these two coding systems. In summary, the adoption of ICD-10-CM is in line with the USA decision to abandon ICD-9-CM, and the Israeli healthcare system could benefit from USA healthcare efforts in this direction. The large content of SNOMED-CT and its sophisticated hierarchical data structure will enable advanced clinical decision support and quality improvement applications.

  4. Low-density parity-check codes for volume holographic memory systems.

    PubMed

    Pishro-Nik, Hossein; Rahnavard, Nazanin; Ha, Jeongseok; Fekri, Faramarz; Adibi, Ali

    2003-02-10

    We investigate the application of low-density parity-check (LDPC) codes in volume holographic memory (VHM) systems. We show that a carefully designed irregular LDPC code has a very good performance in VHM systems. We optimize high-rate LDPC codes for the nonuniform error pattern in holographic memories to reduce the bit error rate substantially. The prior knowledge of the noise distribution is used for designing as well as decoding the LDPC codes. We show that these codes have a superior performance to that of Reed-Solomon (RS) codes and regular LDPC counterparts. Our simulations show that we can increase the maximum storage capacity of holographic memories by more than 50 percent if we use irregular LDPC codes with soft-decision decoding instead of the conventionally employed RS codes with hard-decision decoding. The performance of these LDPC codes is close to the information-theoretic capacity.
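
    As a hard-decision stand-in for the soft-decision decoding discussed above, the Python sketch below runs Gallager-style bit-flipping against a tiny, regular parity-check matrix; real VHM codes are large, irregular and tuned to the memory's nonuniform noise, so this only illustrates what decoding against a parity-check matrix involves.

```python
import numpy as np

def bit_flip_decode(H, r, max_iters=20):
    """Flip the bits involved in the most unsatisfied parity checks until all checks pass."""
    r = r.copy()
    for _ in range(max_iters):
        s = H @ r % 2                      # syndrome: which checks fail
        if not s.any():
            break
        unsatisfied = s @ H                # per-bit count of failing checks
        r[unsatisfied == unsatisfied.max()] ^= 1
    return r

H = np.array([[1, 1, 0, 1, 0, 0],          # small illustrative parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])
received = np.zeros(6, dtype=int)
received[2] ^= 1                           # corrupt one bit of the all-zero codeword
print(bit_flip_decode(H, received))
```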

  5. Identification and Classification of Orthogonal Frequency Division Multiple Access (OFDMA) Signals Used in Next Generation Wireless Systems

    DTIC Science & Technology

    2012-03-01

    Record excerpt (acronym list, partially truncated): advanced antenna systems; AMC, adaptive modulation and coding; AWGN, additive white Gaussian noise; BPSK, binary phase shift keying; BS, base station; BTC, block turbo coding. Modulation types include QAM-16 and QAM-64, and coding types include convolutional coding (CC), convolutional turbo coding (CTC), block turbo coding (BTC), and zero-terminating coding.

  6. Monitoring the evolutionary aspect of the Gene Ontology to enhance predictability and usability.

    PubMed

    Park, Jong C; Kim, Tak-eun; Park, Jinah

    2008-04-11

    Much effort is currently made to develop the Gene Ontology (GO). Due to the dynamic nature of information it addresses, GO undergoes constant updates whose results are released at regular intervals as separate versions. Although there are a large number of computational tools to aid the development of GO, they are operating on a particular version of GO, making it difficult for GO curators to anticipate the full impact of particular changes along the time axis on a larger scale. We present a method for tapping into such an evolutionary aspect of GO, by making it possible to keep track of important temporal changes to any of the terms and relations of GO and by consequently making it possible to recognize associated trends. We have developed visualization methods for viewing the changes between two different versions of GO by constructing a colour-coded layered graph. The graph shows both versions of GO with highlights to those GO terms that are added, removed and modified between the two versions. Focusing on a specific GO term or terms of interest over a period, we demonstrate the utility of our system that can be used to make useful hypotheses about the cause of the evolution and to provide new insights into more complex changes. GO undergoes fast evolutionary changes. A snapshot of GO, as presented by each version of GO alone, overlooks such evolutionary aspects, and consequently limits the utilities of GO. The method that highlights the differences of consecutive versions or two different versions of an evolving ontology with colour-coding enhances the utility of GO for users as well as for developers. To the best of our knowledge, this is the first proposal to visualize the evolutionary aspect of GO.
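
    The comparison step behind the colour-coded layered graph can be sketched in a few lines; the term IDs and definitions below are hypothetical, and the actual tool additionally lays both versions out as a graph rather than returning a flat colour map.

```python
def diff_ontology_versions(old, new):
    """Classify terms between two ontology versions for colour-coded display.

    `old` and `new` map term IDs to their definitions (plain strings here)."""
    added    = {t for t in new if t not in old}
    removed  = {t for t in old if t not in new}
    modified = {t for t in new if t in old and new[t] != old[t]}
    colours = {t: "green" for t in added}
    colours.update({t: "red" for t in removed})
    colours.update({t: "orange" for t in modified})
    return colours

v1 = {"GO:0000001": "cell growth", "GO:0000002": "cell death"}
v2 = {"GO:0000001": "cell growth", "GO:0000002": "programmed cell death",
      "GO:0000003": "autophagy"}
print(diff_ontology_versions(v1, v2))
```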

  7. Program MAMO: Models for avian management optimization-user guide

    USGS Publications Warehouse

    Guillaumet, Alban; Paxton, Eben H.

    2017-01-01

    The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).

  8. A Benchmarking Initiative for Reactive Transport Modeling Applied to Subsurface Environmental Applications

    NASA Astrophysics Data System (ADS)

    Steefel, C. I.

    2015-12-01

    Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed form solutions are necessary and useful, but limited to situations that are far simpler than typical applications that combine many physical and chemical processes, in many cases in coupled form. In the absence of closed form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable to qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code--the reactive transport codes play a supporting role in this regard—but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally-relevant benchmark problem that tests the conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially-mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining affected systems, and 6) waste repositories and related aspects.

  9. Development of an object-oriented finite element program: application to metal-forming and impact simulations

    NASA Astrophysics Data System (ADS)

    Pantale, O.; Caperaa, S.; Rakotomalala, R.

    2004-07-01

    During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. In the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, high strain and high-strain rate problems. In this field, an accurate analysis of large deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the large amount of plastic flow. This paper presents the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates the development, the maintainability and the expandability of such codes. The most significant advantage of OOP is in the modeling of complex physical systems such as deformation processing where the overall complex problem is partitioned in individual sub-problems based on physical, mathematical or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, the operators overload procedure or the use of template classes are detailed. Then we present the approach used for the development of our finite element code through the presentation of the kinematics, conservative and constitutive laws and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.

  10. When Homoplasy Is Not Homoplasy: Dissecting Trait Evolution by Contrasting Composite and Reductive Coding.

    PubMed

    Torres-Montúfar, Alejandro; Borsch, Thomas; Ochoterena, Helga

    2018-05-01

    The conceptualization and coding of characters is a difficult issue in phylogenetic systematics, no matter which inference method is used when reconstructing phylogenetic trees or if the characters are just mapped onto a specific tree. Complex characters are groups of features that can be divided into simpler hierarchical characters (reductive coding), although the implied hierarchical relational information may change depending on the type of coding (composite vs. reductive). To date, there is no common agreement on whether to code characters as complex or simple. Phylogeneticists have discussed which coding method is best but have not incorporated the heuristic process of reciprocal illumination to evaluate the coding. Composite coding allows to test whether 1) several characters were linked resulting in a structure described as a complex character or trait or 2) independently evolving characters resulted in the configuration incorrectly interpreted as a complex character. We propose that complex characters or character states should be decomposed iteratively into simpler characters when the original homology hypothesis is not corroborated by a phylogenetic analysis, and the character or character state is retrieved as homoplastic. We tested this approach using the case of fruit types within subfamily Cinchonoideae (Rubiaceae). The iterative reductive coding of characters associated with drupes allowed us to unthread fruit evolution within Cinchonoideae. Our results show that drupes and berries are not homologous. As a consequence, a more precise ontology for the Cinchonoideae drupes is required.

  11. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  12. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER- and LWR-reactors is presented. After describing the basic features of the 3D neutronic codes BIPR-8 from Kurchatov-Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of coupled codes for different transient and accident scenarios are presented. The need of further investigations is discussed.

  13. Visual feedback system to reduce errors while operating roof bolting machines

    PubMed Central

    Steiner, Lisa J.; Burgess-Limerick, Robin; Eiter, Brianna; Porter, William; Matty, Tim

    2015-01-01

    Problem Operators of roof bolting machines in underground coal mines do so in confined spaces and in very close proximity to the moving equipment. Errors in the operation of these machines can have serious consequences, and the design of the equipment interface has a critical role in reducing the probability of such errors. Methods An experiment was conducted to explore coding and directional compatibility on actual roof bolting equipment and to determine the feasibility of a visual feedback system to alert operators of critical movements and to also alert other workers in close proximity to the equipment to the pending movement of the machine. The quantitative results of the study confirmed the potential for both selection errors and direction errors to be made, particularly during training. Results Subjective data confirmed a potential benefit of providing visual feedback of the intended operations and movements of the equipment. Impact This research may influence the design of these and other similar control systems to provide evidence for the use of warning systems to improve operator situational awareness. PMID:23398703

  14. System Dynamics to Climate-Driven Water Budget Analysis in the Eastern Snake Plains Aquifer

    NASA Astrophysics Data System (ADS)

    Ryu, J.; Contor, B.; Wylie, A.; Johnson, G.; Allen, R. G.

    2010-12-01

    Climate variability, weather extremes and climate change continue to threaten the sustainability of water resources in the western United States. Given current climate change projections, increasing temperature is likely to modify the timing, form, and intensity of precipitation events, which consequently affect regional and local hydrologic cycles. As a result, drought, water shortage, and subsequent water conflicts may become an increasing threat in monotone hydrologic systems in arid lands, such as the Eastern Snake Plain Aquifer (ESPA). The ESPA, in particular, is a critical asset in the state of Idaho. It is known as the economic lifeblood for more than half of Idaho’s population so that water resources availability and aquifer management due to climate change is of great interest, especially over the next few decades. In this study, we apply system dynamics as a methodology with which to address dynamically complex problems in ESPA’s water resources management. Aquifer recharge and discharge dynamics are coded in STELLA modeling system as input and output, respectively to identify long-term behavior of aquifer responses to climate-driven hydrological changes.

  15. Communication Civility Codes: Positive Communication through the Students' Eyes

    ERIC Educational Resources Information Center

    Pawlowski, Donna R.

    2017-01-01

    Courses: Presentational courses such as Public Speaking, Interviewing, Business and Professional, Persuasion, Interpersonal; any course where civility may be promoted in the classroom. Objectives: At the end of this single-class activity, students will have an understanding of civility in order to: (1) identify civility and consequences of…

  16. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  17. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.

    1986-01-01

    High rate concatenated coding systems with trellis inner codes and Reed-Solomon (RS) outer codes for application in satellite communication systems are considered. Two types of inner codes are studied: high rate punctured binary convolutional codes, which result in overall effective information rates between 1/2 and 1 bit per channel use; and bandwidth efficient signal space trellis codes, which can achieve overall effective information rates greater than 1 bit per channel use. Channel capacity calculations with and without side information were performed for the concatenated coding system. Two concatenated coding schemes are investigated. In Scheme 1, the inner code is decoded with the Viterbi algorithm and the outer RS code performs error-correction only (decoding without side information). In Scheme 2, the inner code is decoded with a modified Viterbi algorithm which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, while branch metrics are used to provide the reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. These two schemes are proposed for use on NASA satellite channels. Results indicate that high system reliability can be achieved with little or no bandwidth expansion.

  18. High dynamic range coding imaging system

    NASA Astrophysics Data System (ADS)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme can help us obtain HDR images which have an extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. Then we utilize the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images and display an HDR image. We build an optical simulation model and obtain simulation images to verify the novel system.
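
    The fusion step mentioned above, combining differently exposed LDR reconstructions into one HDR estimate, can be illustrated with a standard exposure-weighted average; the weighting function and the toy 2x2 frames below are arbitrary, and the coded-aperture reconstruction that would produce the LDR frames is not modelled.

```python
import numpy as np

def fuse_exposures(ldr_images, exposure_times):
    """Merge 8-bit LDR frames into a radiance estimate by exposure-weighted averaging."""
    eps = 1e-6
    num = np.zeros_like(ldr_images[0], dtype=float)
    den = np.zeros_like(num)
    for img, t in zip(ldr_images, exposure_times):
        w = 1.0 - 2.0 * np.abs(img / 255.0 - 0.5)   # trust mid-range pixels most
        num += w * img / (t + eps)
        den += w
    return num / (den + eps)

times = [0.5, 1.0, 2.0]
frames = [np.clip(np.array([[40.0, 200.0], [90.0, 255.0]]) * t, 0, 255) for t in times]
print(fuse_exposures(frames, times))
```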

  19. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  20. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.

  1. Performance analysis of optical wireless communication system based on two-fold turbo code

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Huang, Dexiu; Yuan, Xiuhua

    2005-11-01

    Optical wireless communication (OWC) is beginning to emerge in the telecommunications market as a strategy to meet last-mile demand, owing to its unique combination of features. Turbo codes have an impressive near Shannon-limit error correcting performance. Twofold turbo codes have been recently introduced as the least complex member of the multifold turbo code family. In this paper, we first present the mathematical model of the signal and of the optical wireless channel with fading, together with a bit error rate model with scintillation; we then provide a new turbo code method for use in the OWC system. We obtain a better BER curve for the OWC system with the twofold turbo code than with a conventional turbo code.

  2. Integration of QR codes into an anesthesia information management system for resident case log management.

    PubMed

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
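
    For readers unfamiliar with generating such codes, the snippet below uses the open-source Python `qrcode` package to pack a case summary into a QR image; the field names are hypothetical and are not the syllabus schema used by the study's anesthesia information management system.

```python
import json
import qrcode  # open-source package: pip install qrcode[pil]

# Hypothetical end-of-case payload to be scanned with a smartphone or tablet.
case = {"date": "2015-01-20", "procedure": "laparoscopic cholecystectomy",
        "anesthesia": "general", "airway": "ETT", "asa": 2}

img = qrcode.make(json.dumps(case))
img.save("case_log_qr.png")
```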

  3. The evolution of transcriptional regulation in eukaryotes

    NASA Technical Reports Server (NTRS)

    Wray, Gregory A.; Hahn, Matthew W.; Abouheif, Ehab; Balhoff, James P.; Pizer, Margaret; Rockman, Matthew V.; Romano, Laura A.

    2003-01-01

    Gene expression is central to the genotype-phenotype relationship in all organisms, and it is an important component of the genetic basis for evolutionary change in diverse aspects of phenotype. However, the evolution of transcriptional regulation remains understudied and poorly understood. Here we review the evolutionary dynamics of promoter, or cis-regulatory, sequences and the evolutionary mechanisms that shape them. Existing evidence indicates that populations harbor extensive genetic variation in promoter sequences, that a substantial fraction of this variation has consequences for both biochemical and organismal phenotype, and that some of this functional variation is sorted by selection. As with protein-coding sequences, rates and patterns of promoter sequence evolution differ considerably among loci and among clades for reasons that are not well understood. Studying the evolution of transcriptional regulation poses empirical and conceptual challenges beyond those typically encountered in analyses of coding sequence evolution: promoter organization is much less regular than that of coding sequences, and sequences required for the transcription of each locus reside at multiple other loci in the genome. Because of the strong context-dependence of transcriptional regulation, sequence inspection alone provides limited information about promoter function. Understanding the functional consequences of sequence differences among promoters generally requires biochemical and in vivo functional assays. Despite these challenges, important insights have already been gained into the evolution of transcriptional regulation, and the pace of discovery is accelerating.

  4. Critical evaluation of reverse engineering tool Imagix 4D!

    PubMed

    Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay

    2016-01-01

    Legacy code is difficult to comprehend. Various commercial reengineering tools are available that have unique working styles and are equipped with their inherent capabilities and shortcomings. The focus of the available tools is on visualizing static behavior, not dynamic behavior. This makes the work difficult for people engaged in software product maintenance, code understanding, and reengineering/reverse engineering. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful, as it generates the most extensive pictorial representations in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and on a large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. The ability of Imagix 4D to prepare decision density and complexity metrics for a large code was found to be useful in determining how much reengineering is required. At the same time, Imagix 4D showed limitations in dynamic visualization, flow chart separation for large code, and the parsing of loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D and highlights the need for full-featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those who are interested in the realm of software reengineering tool building.

  5. More than a score: a qualitative study of ancillary benefits of performance measurement.

    PubMed

    Powell, Adam A; White, Katie M; Partin, Melissa R; Halek, Krysten; Hysong, Sylvia J; Zarling, Edwin; Kirsh, Susan R; Bloomfield, Hanna E

    2014-08-01

    Prior research has examined clinical effects of performance measurement systems. To the extent that non-clinical effects have been researched, the focus has been on negative unintended consequences. Yet, these same systems may also have ancillary benefits for patients and providers--that is, benefits that extend beyond improvements on clinical measures. The purpose of this study is to identify and describe potential ancillary benefits of performance measures as perceived by primary care staff and facility leaders in a large US healthcare system. In-person individual semistructured interviews were conducted with 59 primary care staff and facility leaders at four Veterans Health Administration facilities. Transcribed interviews were coded and organised into thematic categories. Interviewed staff observed that local performance measurement implementation practices can result in increased patient knowledge and motivation. These effects on patients can lead to improved performance scores and additional ancillary benefits. Performance measurement implementation can also directly result in ancillary benefits for the patients and providers. Patients may experience greater satisfaction with care and psychosocial benefits associated with increased provider-patient communication. Ancillary benefits of performance measurement for providers include increased pride in individual or organisational performance and greater confidence that one's practice is grounded in evidence-based medicine. A comprehensive understanding of the effects of performance measurement systems needs to incorporate ancillary benefits as well as effects on clinical performance scores and negative unintended consequences. Although clinical performance has been the focus of most evaluations of performance measurement to date, both patient care and provider satisfaction may improve more rapidly if all three categories of effects are considered when designing and evaluating performance measurement systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Fixed-point Design of the Lattice-reduction-aided Iterative Detection and Decoding Receiver for Coded MIMO Systems

    DTIC Science & Technology

    2011-01-01

    Record excerpt (standard report documentation fields omitted): … reliability, e.g., Turbo codes [2] and low-density parity-check (LDPC) codes [3]. The challenge to apply both MIMO and ECC into wireless systems is … The report illustrates the performance of coded lattice-reduction (LR) aided detectors.

  7. Expert system for maintenance management of a boiling water reactor power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong Shen; Liou, L.W.; Levine, S.

    1992-01-01

    An expert system code has been developed for the maintenance of two boiling water reactor units in Berwick, Pennsylvania, that are operated by the Pennsylvania Power and Light Company (PP and L). The objective of this expert system code, where the knowledge of experienced operators and engineers is captured and implemented, is to support the decisions regarding which components can be safely and reliably removed from service for maintenance. It can also serve as a query-answering facility for checking the plant system status and for training purposes. The operating and maintenance information of a large number of support systems, which must be available for emergencies and/or in the event of an accident, is stored in the data base of the code. It identifies the relevant technical specifications and management rules for shutting down any one of the systems or removing a component from service to support maintenance. Because of the complexity and time needed to incorporate a large number of systems and their components, the first phase of the expert system develops a prototype code, which includes only the reactor core isolation coolant system, the high-pressure core injection system, the instrument air system, the service water system, and the plant electrical system. The next phase is scheduled to expand the code to include all other systems. This paper summarizes the prototype code and the design concept of the complete expert system code for maintenance management of all plant systems and components.

  8. FORTRAN Automated Code Evaluation System (faces) system documentation, version 2, mod 0. [error detection codes/user manuals (computer programs)

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system is presented which processes FORTRAN based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. Also, it emphasizes frequent sources of FORTRAN problems which require inordinate manual effort to identify. The principal value of the system is extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely corrective action of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.

  9. Architecture and implementation considerations of a high-speed Viterbi decoder for a Reed-Muller subcode

    NASA Technical Reports Server (NTRS)

    Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.

    1996-01-01

    The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the integrated circuit prototype sub-trellis IC, particularly focusing on the design methodology, is addressed.

  10. Recurrent and functional regulatory mutations in breast cancer.

    PubMed

    Rheinbay, Esther; Parasuraman, Prasanna; Grimsby, Jonna; Tiao, Grace; Engreitz, Jesse M; Kim, Jaegil; Lawrence, Michael S; Taylor-Weiner, Amaro; Rodriguez-Cuevas, Sergio; Rosenberg, Mara; Hess, Julian; Stewart, Chip; Maruvka, Yosef E; Stojanov, Petar; Cortes, Maria L; Seepo, Sara; Cibulskis, Carrie; Tracy, Adam; Pugh, Trevor J; Lee, Jesse; Zheng, Zongli; Ellisen, Leif W; Iafrate, A John; Boehm, Jesse S; Gabriel, Stacey B; Meyerson, Matthew; Golub, Todd R; Baselga, Jose; Hidalgo-Miranda, Alfredo; Shioda, Toshi; Bernards, Andre; Lander, Eric S; Getz, Gad

    2017-07-06

    Genomic analysis of tumours has led to the identification of hundreds of cancer genes on the basis of the presence of mutations in protein-coding regions. By contrast, much less is known about cancer-causing mutations in non-coding regions. Here we perform deep sequencing in 360 primary breast cancers and develop computational methods to identify significantly mutated promoters. Clear signals are found in the promoters of three genes. FOXA1, a known driver of hormone-receptor positive breast cancer, harbours a mutational hotspot in its promoter leading to overexpression through increased E2F binding. RMRP and NEAT1, two non-coding RNA genes, carry mutations that affect protein binding to their promoters and alter expression levels. Our study shows that promoter regions harbour recurrent mutations in cancer with functional consequences and that the mutations occur at similar frequencies as in coding regions. Power analyses indicate that more such regions remain to be discovered through deep sequencing of adequately sized cohorts of patients.

  11. A simplified procedure for correcting both errors and erasures of a Reed-Solomon code using the Euclidean algorithm

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Hsu, I. S.; Eastman, W. L.; Reed, I. S.

    1987-01-01

    It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial and the error evaluator polynomial in Berlekamp's key equation needed to decode a Reed-Solomon (RS) code. A simplified procedure is developed and proved to correct erasures as well as errors by replacing the initial condition of the Euclidean algorithm by the erasure locator polynomial and the Forney syndrome polynomial. By this means, the errata locator polynomial and the errata evaluator polynomial can be obtained, simultaneously and simply, by the Euclidean algorithm only. With this improved technique the complexity of time domain RS decoders for correcting both errors and erasures is reduced substantially from previous approaches. As a consequence, decoders for correcting both errors and erasures of RS codes can be made more modular, regular, simple, and naturally suitable for both VLSI and software implementation. An example illustrating this modified decoding procedure is given for a (15, 9) RS code.

  12. Nurses' attitudes toward the use of the bar-coding medication administration system.

    PubMed

    Marini, Sana Daya; Hasman, Arie; Huijer, Huda Abu-Saad; Dimassi, Hani

    2010-01-01

    This study determines nurses' attitudes toward bar-coding medication administration system use. Some of the factors underlying the successful use of bar-coding medication administration systems, viewed as a connotative indicator of users' attitudes, were used to gather data that describe the attitudinal basis for system adoption and use decisions in terms of subjective satisfaction. Only 67 nurses in the United States had the chance to respond to the e-questionnaire posted on the CARING list server for the months of June and July 2007. Participants rated their satisfaction with bar-coding medication administration system use based on system functionality, usability, and its positive/negative impact on nursing practice. Results showed, to some extent, a positive attitude, but the image profile draws attention to nurses' concerns about improving certain system characteristics. Higher bar-coding medication administration system skills were associated with a more negative perception of the system among the nursing staff. The reasons underlying dissatisfaction with bar-coding medication administration use by skillful users are an important source of knowledge that can be helpful for system development as well as system deployment. As a result, strengthening bar-coding medication administration system usability by magnifying its ability to eliminate medication errors and the contributing factors, maximizing system functionality by ascertaining its power as an extra eye in the medication administration process, and impacting clinical nursing practice positively by being helpful to nurses, speeding up the medication administration process, and being user-friendly can offer a congenial setting for establishing a positive attitude toward system use, which in turn leads to successful bar-coding medication administration system use.

  13. Proteomic validation of protease drug targets: pharmacoproteomics of matrix metalloproteinase inhibitor drugs using isotope-coded affinity tag labelling and tandem mass spectrometry.

    PubMed

    Butler, G S; Overall, C M

    2007-01-01

    We illustrate the use of quantitative proteomics, namely isotope-coded affinity tag labelling and tandem mass spectrometry, to assess the targets and effects of the blockade of matrix metalloproteinases by an inhibitor drug in a breast cancer cell culture system. Treatment of MT1-MMP-transfected MDA-MB-231 cells with AG3340 (Prinomastat) directly affected the processing of a multitude of matrix metalloproteinase substrates, and indirectly altered the expression of an array of other proteins with diverse functions. Therefore, broad spectrum blockade of MMPs has wide-ranging biological consequences. In this human breast cancer cell line, secreted substrates accumulated uncleaved in the conditioned medium and plasma membrane protein substrates were retained on the cell surface, due to reduced processing and shedding of these proteins (cell surface receptors, growth factors and bioactive molecules) to the medium in the presence of the matrix metalloproteinase inhibitor. Hence, proteomic investigation of drug-perturbed cellular proteomes can identify new protease substrates and at the same time provides valuable information for target validation, drug efficacy and potential side effects prior to commitment to clinical trials.

  14. ExaSAT: An exascale co-design tool for performance modeling

    DOE PAGES

    Unat, Didem; Chan, Cy; Zhang, Weiqun; ...

    2015-02-09

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework’s ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
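
    A minimal sketch of the kind of bound-based analytic model such a framework evaluates (with made-up machine and kernel parameters, not ExaSAT's actual model or output) treats runtime as the larger of a compute-limited and a data-movement-limited estimate.

```python
def predicted_time(flops, bytes_moved, peak_flops=1e15, bandwidth=4e14):
    """Bound-based model: runtime limited by either compute throughput or data movement."""
    return max(flops / peak_flops, bytes_moved / bandwidth)

# Hypothetical stencil sweep: 50 flops and 80 bytes moved per grid point on a 1024^3 grid.
n = 1024 ** 3
print(predicted_time(50 * n, 80 * n))
```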

  15. Classification and simulation of stereoscopic artifacts in mobile 3DTV content

    NASA Astrophysics Data System (ADS)

    Boev, Atanas; Hollosi, Danilo; Gotchev, Atanas; Egiazarian, Karen

    2009-02-01

    We identify, categorize and simulate artifacts which might occur during delivery of stereoscopic video to mobile devices. We consider the stages of the 3D video delivery dataflow: content creation, conversion to the desired format (multiview or source-plus-depth), coding/decoding, transmission, and visualization on a 3D display. Human 3D vision works by assessing various depth cues - accommodation, binocular depth cues, pictorial cues and motion parallax. As a consequence, any artifact which modifies these cues impairs the quality of a 3D scene. The perceptibility of each artifact can be estimated through subjective tests. The material for such tests needs to contain various artifacts with different amounts of impairment. We present a system for the simulation of these artifacts. The artifacts are organized in groups with similar origins, and each group is simulated by a block in a simulation channel. The channel introduces the following groups of artifacts: sensor limitations, geometric distortions caused by camera optics, spatial and temporal misalignments between video channels, spatial and temporal artifacts caused by coding, transmission losses, and visualization artifacts. For the case of source-plus-depth representation, artifacts caused by format conversion are added as well.

  16. Variation in clinical coding lists in UK general practice: a barrier to consistent data entry?

    PubMed

    Tai, Tracy Waize; Anandarajah, Sobanna; Dhoul, Neil; de Lusignan, Simon

    2007-01-01

    Routinely collected general practice computer data are used for quality improvement; poor data quality including inconsistent coding can reduce their usefulness. To document the diversity of data entry systems currently in use in UK general practice and highlight possible implications for data quality. General practice volunteers provided screen shots of the clinical coding screen they would use to code a diagnosis or problem title in the clinical consultation. The six clinical conditions examined were: depression, cystitis, type 2 diabetes mellitus, sore throat, tired all the time, and myocardial infarction. We looked at the picking lists generated for these problem titles in EMIS, IPS, GPASS and iSOFT general practice clinical computer systems, using the Triset browser as a gold standard for comparison. A mean of 19.3 codes is offered in the picking list after entering a diagnosis or problem title. EMIS produced the longest picking lists and GPASS the shortest, with a mean number of choices of 35.2 and 12.7, respectively. Approximately three-quarters (73.5%) of codes are diagnoses, one-eighth (12.5%) symptom codes, and the remainder come from a range of Read chapters. There was no readily detectable consistent order in which codes were displayed. Velocity coding, whereby commonly-used codes are placed higher in the picking list, results in variation between practices even where they have the same brand of computer system. Current systems for clinical coding promote diversity rather than consistency of clinical coding. As the UK moves towards an integrated health IT system consistency of coding will become more important. A standardised, limited list of codes for primary care might help address this need.

  17. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 219.303 Section 219.303 Federal... Programs 219.303 Determining North American Industry Classification System (NAICS) codes and size standards...

  18. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 219.303 Section 219.303 Federal... Programs 219.303 Determining North American Industry Classification System (NAICS) codes and size standards...

  19. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 219.303 Section 219.303 Federal... Determining North American Industry Classification System (NAICS) codes and size standards. Contracting...

  20. 48 CFR 219.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 219.303 Section 219.303 Federal... Determining North American Industry Classification System (NAICS) codes and size standards. Contracting...

  1. 75 FR 78707 - Medicare Program; First Semi-Annual Meeting of the Advisory Panel on Ambulatory Payment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    ... hospital payment systems; hospital medical care delivery systems; provider billing and accounting systems; APC groups; Current Procedural Terminology codes; Health Care Common Procedure Coding System (HCPCS) codes; the use of, and payment for, drugs, medical devices, and other services in the outpatient setting...

  2. Predictive hypotheses are ineffectual in resolving complex biochemical systems.

    PubMed

    Fry, Michael

    2018-03-20

    Scientific hypotheses may either predict particular unknown facts or accommodate previously-known data. Although affirmed predictions are intuitively more rewarding than accommodations of established facts, opinions divide on whether predictive hypotheses are also epistemically superior to accommodation hypotheses. This paper examines the contribution of predictive hypotheses to discoveries of several bio-molecular systems. Having all the necessary elements of the system known beforehand, an abstract predictive hypothesis of the semiconservative mode of DNA replication was successfully affirmed. However, in defining the genetic code whose biochemical basis was unclear, hypotheses were only partially effective and supplementary experimentation was required for its conclusive definition. Markedly, hypotheses were entirely inept in predicting workings of complex systems that included unknown elements. Thus, hypotheses did not predict the existence and function of mRNA, the multiple unidentified components of the protein biosynthesis machinery, or the manifold unknown constituents of the ubiquitin-proteasome system of protein breakdown. Consequently, because of their inability to envision unknown entities, predictive hypotheses did not contribute to the elucidation of these complex systems. As accommodation theories remained the sole instrument to explain complex bio-molecular systems, the philosophical question of the alleged advantage of predictive over accommodative hypotheses became inconsequential.

  3. A Theoretical Mechanism of Szilard Engine Function in Nucleic Acids and the Implications for Quantum Coherence in Biological Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Mihelic, F.

    2010-12-22

    Nucleic acids theoretically possess a Szilard engine function that can convert the energy associated with the Shannon entropy of molecules for which they have coded recognition, into the useful work of geometric reconfiguration of the nucleic acid molecule. This function is logically reversible because its mechanism is literally and physically constructed out of the information necessary to reduce the Shannon entropy of such molecules, which means that this information exists on both sides of the theoretical engine, and because information is retained in the geometric degrees of freedom of the nucleic acid molecule, a quantum gate is formed through which multi-state nucleic acid qubits can interact. Entangled biophotons emitted as a consequence of symmetry breaking nucleic acid Szilard engine (NASE) function can be used to coordinate relative positioning of different nucleic acid locations, both within and between cells, thus providing the potential for quantum coherence of an entire biological system. Theoretical implications of understanding biological systems as such 'quantum adaptive systems' include the potential for multi-agent based quantum computing, and a better understanding of systemic pathologies such as cancer, as being related to a loss of systemic quantum coherence.

  4. A Theoretical Mechanism of Szilard Engine Function in Nucleic Acids and the Implications for Quantum Coherence in Biological Systems

    NASA Astrophysics Data System (ADS)

    Matthew Mihelic, F.

    2010-12-01

    Nucleic acids theoretically possess a Szilard engine function that can convert the energy associated with the Shannon entropy of molecules for which they have coded recognition, into the useful work of geometric reconfiguration of the nucleic acid molecule. This function is logically reversible because its mechanism is literally and physically constructed out of the information necessary to reduce the Shannon entropy of such molecules, which means that this information exists on both sides of the theoretical engine, and because information is retained in the geometric degrees of freedom of the nucleic acid molecule, a quantum gate is formed through which multi-state nucleic acid qubits can interact. Entangled biophotons emitted as a consequence of symmetry breaking nucleic acid Szilard engine (NASE) function can be used to coordinate relative positioning of different nucleic acid locations, both within and between cells, thus providing the potential for quantum coherence of an entire biological system. Theoretical implications of understanding biological systems as such "quantum adaptive systems" include the potential for multi-agent based quantum computing, and a better understanding of systemic pathologies such as cancer, as being related to a loss of systemic quantum coherence.

  5. Survey of adaptive image coding techniques

    NASA Technical Reports Server (NTRS)

    Habibi, A.

    1977-01-01

    The general problem of image data compression is discussed briefly with attention given to the use of Karhunen-Loeve transforms, suboptimal systems, and block quantization. A survey is then conducted encompassing the four categories of adaptive systems: (1) adaptive transform coding (adaptive sampling, adaptive quantization, etc.), (2) adaptive predictive coding (adaptive delta modulation, adaptive DPCM encoding, etc.), (3) adaptive cluster coding (blob algorithms and the multispectral cluster coding technique), and (4) adaptive entropy coding.

  6. The analysis of convolutional codes via the extended Smith algorithm

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Onyszchuk, I.

    1993-01-01

    Convolutional codes have been the central part of most error-control systems in deep-space communication for many years. Almost all such applications, however, have used the restricted class of (n,1), also known as 'rate 1/n,' convolutional codes. The more general class of (n,k) convolutional codes contains many potentially useful codes, but their algebraic theory is difficult and has proved to be a stumbling block in the evolution of convolutional coding systems. In this article, the situation is improved by describing a set of practical algorithms for computing certain basic things about a convolutional code (among them the degree, the Forney indices, a minimal generator matrix, and a parity-check matrix), which are usually needed before a system using the code can be built. The approach is based on the classic Forney theory for convolutional codes, together with the extended Smith algorithm for polynomial matrices, which is introduced in this article.

  7. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  8. The design of the CMOS wireless bar code scanner applying optical system based on ZigBee

    NASA Astrophysics Data System (ADS)

    Chen, Yuelin; Peng, Jian

    2008-03-01

    The range of a traditional bar code scanner is limited by the length of its data cable, while commercially available wireless bar code scanners typically reach only 30 m to 100 m. To meet market demands, a CMOS bar code scanner based on ZigBee is designed by rebuilding the traditional CCD optical bar code scanner. The scanning system consists of a CMOS image sensor and the embedded S3C2401X chip. When a two-dimensional bar code is read, inaccurate or erroneous codes can result from image contamination, interference, poor imaging conditions, signal interference, and unstable supply voltage; matrix evaluation and Reed-Solomon arithmetic are therefore applied to correct them. To construct the complete wireless optical bar code system and ensure that bar code image signals can be transmitted digitally over long distances, ZigBee is used to transmit data to the base station: the module is designed around the image acquisition system, and the circuit diagram of the wireless transmitting/receiving CC2430 module is established. By porting the embedded Linux operating system to the MCU, a practical wireless CMOS optical bar code scanner with multi-task support is constructed. Finally, communication performance is tested with the Smart RF evaluation software. In open space, each ZigBee node achieves reliable transmission over 50 m, and adding more ZigBee nodes extends the transmission distance to several thousand meters.

  9. Using read codes to identify patients with irritable bowel syndrome in general practice: a database study

    PubMed Central

    2013-01-01

    Background Estimates of the prevalence of irritable bowel syndrome (IBS) vary widely, and a large proportion of patients report having consulted their general practitioner (GP). In patients with new onset gastrointestinal symptoms in primary care it might be possible to predict those at risk of persistent symptoms. However, one of the difficulties is identifying patients within primary care. GPs use a variety of Read Codes to describe patients presenting with IBS. Furthermore, in a qualitative study, exploring GPs’ attitudes and approaches to defining patients with IBS, GPs appeared reluctant to add the IBS Read Code to the patient record until more serious conditions were ruled out. Consequently, symptom codes such as 'abdominal pain’, 'diarrhoea’ or 'constipation’ are used. The aim of the current study was to investigate the prevalence of recorded consultations for IBS and to explore the symptom profile of patients with IBS using data from the Salford Integrated Record (SIR). Methods This was a database study using the SIR, a local patient sharing record system integrating primary, community and secondary care information. Records were obtained for a cohort of patients with gastrointestinal disorders from January 2002 to December 2011. Prevalence rates, symptom recording, medication prescribing and referral patterns were compared for three patient groups (IBS, abdominal pain (AP) and Inflammatory Bowel Disease (IBD)). Results The prevalence of IBS (age standardised rate: 616 per year per 100,000 population) was much lower than expected compared with that reported in the literature. The majority of patients (69%) had no gastrointestinal symptoms recorded in the year prior to their IBS. However a proportion of these (22%) were likely to have been prescribed NICE guideline recommended medications for IBS in that year. The findings for AP and IBD were similar. Conclusions Using Read Codes to identify patients with IBS may lead to a large underestimate of the community prevalence. The IBS diagnostic Read Code was rarely applied in practice. There are similarities with many other medically unexplained symptoms which are typically difficult to diagnose in clinical practice. PMID:24295337

  10. Using read codes to identify patients with irritable bowel syndrome in general practice: a database study.

    PubMed

    Harkness, Elaine F; Grant, Laura; O'Brien, Sarah J; Chew-Graham, Carolyn A; Thompson, David G

    2013-12-02

    Estimates of the prevalence of irritable bowel syndrome (IBS) vary widely, and a large proportion of patients report having consulted their general practitioner (GP). In patients with new onset gastrointestinal symptoms in primary care it might be possible to predict those at risk of persistent symptoms. However, one of the difficulties is identifying patients within primary care. GPs use a variety of Read Codes to describe patients presenting with IBS. Furthermore, in a qualitative study, exploring GPs' attitudes and approaches to defining patients with IBS, GPs appeared reluctant to add the IBS Read Code to the patient record until more serious conditions were ruled out. Consequently, symptom codes such as 'abdominal pain', 'diarrhoea' or 'constipation' are used. The aim of the current study was to investigate the prevalence of recorded consultations for IBS and to explore the symptom profile of patients with IBS using data from the Salford Integrated Record (SIR). This was a database study using the SIR, a local patient sharing record system integrating primary, community and secondary care information. Records were obtained for a cohort of patients with gastrointestinal disorders from January 2002 to December 2011. Prevalence rates, symptom recording, medication prescribing and referral patterns were compared for three patient groups (IBS, abdominal pain (AP) and Inflammatory Bowel Disease (IBD)). The prevalence of IBS (age standardised rate: 616 per year per 100,000 population) was much lower than expected compared with that reported in the literature. The majority of patients (69%) had no gastrointestinal symptoms recorded in the year prior to their IBS. However a proportion of these (22%) were likely to have been prescribed NICE guideline recommended medications for IBS in that year. The findings for AP and IBD were similar. Using Read Codes to identify patients with IBS may lead to a large underestimate of the community prevalence. The IBS diagnostic Read Code was rarely applied in practice. There are similarities with many other medically unexplained symptoms which are typically difficult to diagnose in clinical practice.

  11. A survey to identify the clinical coding and classification systems currently in use across Europe.

    PubMed

    de Lusignan, S; Minmagh, C; Kennedy, J; Zeimet, M; Bommezijn, H; Bryant, J

    2001-01-01

    This is a survey to identify the clinical coding systems currently in use across the European Union and the states seeking membership of it. We sought to identify what systems are currently used and to what extent they are subject to local adaptation. Clinical coding should facilitate identifying key medical events in a computerised medical record and aggregating information across groups of records. The emerging new driver is its role as the enabler of the life-long computerised medical record. A prerequisite for this level of functionality is the transfer of information between different computer systems. This transfer can be facilitated either by working on the interoperability problems between disparate systems or by harmonising the underlying data. This paper examines the extent to which the latter has occurred across Europe. Literature and Internet search. Requests for information via electronic mail to pan-European mailing lists of health informatics professionals. Coding systems are now a de facto part of health information systems across Europe. There are relatively few coding systems in existence across Europe. ICD-9 and ICD-10, ICPC and Read were the most established. However, the local adaptation of these classification systems, whether on a per-country or per-software-manufacturer basis, significantly reduces the ability for the meaning coded within patients' computer records to be easily transferred from one medical record system to another. There is no longer any debate as to whether a coding or classification system should be used. Convergence of different classification systems should be encouraged. Countries and computer manufacturers within the EU should be encouraged to stop making local modifications to coding and classification systems, as this practice risks significantly slowing progress towards easy transfer of records between computer systems.

  12. Practical guide to bar coding for patient medication safety.

    PubMed

    Neuenschwander, Mark; Cohen, Michael R; Vaida, Allen J; Patchett, Jeffrey A; Kelly, Jamie; Trohimovich, Barbara

    2003-04-15

    Bar coding for the medication administration step of the drug-use process is discussed. FDA will propose a rule in 2003 that would require bar-code labels on all human drugs and biologicals. Even with an FDA mandate, manufacturer procrastination and possible shifts in product availability are likely to slow progress. Such delays should not preclude health systems from adopting bar-code-enabled point-of-care (BPOC) systems to achieve gains in patient safety. Bar-code technology is a replacement for traditional keyboard data entry. The elements of bar coding are content, which determines the meaning; data format, which refers to the embedded data; and symbology, which describes the "font" in which the machine-readable code is written. For a BPOC system to deliver an acceptable level of patient protection, the hospital must first establish reliable processes for a patient identification band, caregiver badge, and medication bar coding. Medications can have either drug-specific or patient-specific bar codes. Both varieties result in the desired code that supports the patient's five rights of drug administration. When medications are not available from the manufacturer in immediate-container bar-coded packaging, other means of applying the bar code must be devised, including the use of repackaging equipment, overwrapping, manual bar coding, and outsourcing. Virtually all medications should be bar coded, the bar code on the label should be easily readable, and appropriate policies, procedures, and checks should be in place. Bar coding has the potential not only to be cost-effective but also to produce a return on investment. By bar coding patient identification tags, caregiver badges, and immediate-container medications, health systems can substantially increase patient safety during medication administration.

  13. Cell cycle, oncogenic and tumor suppressor pathways regulate numerous long and macro non-protein-coding RNAs

    PubMed Central

    2014-01-01

    Background The genome is pervasively transcribed but most transcripts do not code for proteins, constituting non-protein-coding RNAs. Despite increasing numbers of functional reports of individual long non-coding RNAs (lncRNAs), assessing the extent of functionality among the non-coding transcriptional output of mammalian cells remains intricate. In the protein-coding world, transcripts differentially expressed in the context of processes essential for the survival of multicellular organisms have been instrumental in the discovery of functionally relevant proteins and their deregulation is frequently associated with diseases. We therefore systematically identified lncRNAs expressed differentially in response to oncologically relevant processes and cell-cycle, p53 and STAT3 pathways, using tiling arrays. Results We found that up to 80% of the pathway-triggered transcriptional responses are non-coding. Among these we identified very large macroRNAs with pathway-specific expression patterns and demonstrated that these are likely continuous transcripts. MacroRNAs contain elements conserved in mammals and sauropsids, which in part exhibit conserved RNA secondary structure. Comparing evolutionary rates of a macroRNA to adjacent protein-coding genes suggests a local action of the transcript. Finally, in different grades of astrocytoma, a tumor disease unrelated to the initially used cell lines, macroRNAs are differentially expressed. Conclusions It has been shown previously that the majority of expressed non-ribosomal transcripts are non-coding. We now conclude that differential expression triggered by signaling pathways gives rise to a similar abundance of non-coding content. It is thus unlikely that the prevalence of non-coding transcripts in the cell is a trivial consequence of leaky or random transcription events. PMID:24594072

  14. System Design for FEC in Aeronautical Telemetry

    DTIC Science & Technology

    2012-03-12

    rate punctured convolutional codes for soft decision Viterbi...below follows that given in [8]. The final coding rate of exactly 2/3 is achieved by puncturing the rate-1/2 code as follows. We begin with the buffer c1...concatenated convolutional code (SCCC). The contributions of this paper are on the system-design level. One major contribution is to design an SCCC code
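    The snippet above refers to raising a rate-1/2 mother code to an overall rate of 2/3 by puncturing. As a hedged illustration of puncturing in general (the exact pattern and buffer handling used in the paper are not visible in the snippet), deleting one of every four mother-code bits with a fixed two-column pattern yields three transmitted bits per two information bits:

      import numpy as np

      # Puncturing pattern over two information-bit periods: rows are the two
      # encoder output streams c1 and c2, columns are time steps; 1 = transmit,
      # 0 = delete. Three of every four coded bits are kept, so rate 1/2 -> 2/3.
      PATTERN = np.array([[1, 1],
                          [1, 0]])

      def puncture(c1, c2, pattern=PATTERN):
          # Interleave the two coded streams and drop the positions marked 0.
          out, period = [], pattern.shape[1]
          for t in range(len(c1)):
              if pattern[0, t % period]:
                  out.append(c1[t])
              if pattern[1, t % period]:
                  out.append(c2[t])
          return out

      # 4 information bits -> 8 mother-code bits -> 6 transmitted bits (rate 2/3)
      c1 = [0, 1, 1, 0]   # example first output stream
      c2 = [1, 1, 0, 1]   # example second output stream
      sent = puncture(c1, c2)
      print(sent, "rate =", 4 / len(sent))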

  15. Biomass Economy

    DTIC Science & Technology

    1985-11-01

    Boiler and Pressure Vessel Code HEI Heat Exchanger Institute Heat and Material Balance c. System Description (1) Condenser... Boiler and Pressure Vessel Code ANSI B31.1 Power Piping d. System Description (1) Deaerator The deaerator will be a direct contact feedwater heater, and...vent, and drain piping. b. Applicable Codes ASME Boiler and Pressure Vessel Code ANSI B31.1 - Power Piping Code

  16. Augmented burst-error correction for UNICON laser memory. [digital memory

    NASA Technical Reports Server (NTRS)

    Lim, R. S.

    1974-01-01

    A single-burst-error correction system is described for data stored in the UNICON laser memory. In the proposed system, a long fire code with code length n greater than 16,768 bits was used as an outer code to augment an existing inner shorter fire code for burst error corrections. The inner fire code is a (80,64) code shortened from the (630,614) code, and it is used to correct a single-burst-error on a per-word basis with burst length b less than or equal to 6. The outer code, with b less than or equal to 12, would be used to correct a single-burst-error on a per-page basis, where a page consists of 512 32-bit words. In the proposed system, the encoding and error detection processes are implemented by hardware. A minicomputer, currently used as a UNICON memory management processor, is used on a time-demanding basis for error correction. Based upon existing error statistics, this combination of an inner code and an outer code would enable the UNICON system to obtain a very low error rate in spite of flaws affecting the recorded data.

  17. Effect of magnetic island geometry on ECRH/ECCD and consequences to the NTM stabilization dynamics

    NASA Astrophysics Data System (ADS)

    Chatziantonaki, I.; Tsironis, C.; Isliker, H.; Vlahos, L.

    2012-09-01

    In the majority of codes that model ECCD-based NTM stabilization, the analysis of the EC propagation and absorption is performed in terms of the axisymmetric magnetic field, ignoring effects due to the island topology. In this paper, we analyze the wave propagation, absorption and current drive in the presence of NTMs, as well as the ECCD-driven island growth, focusing on the effect of the island geometry on the wave deposition. A primary evaluation of the consequences of these effects on the NTM evolution is also made in terms of the modified Rutherford equation.
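    For orientation, one commonly quoted schematic form of the modified Rutherford equation is sketched below; this is a generic textbook shape given as an assumption, not the specific expression or coefficients used by the authors.

      % Island width w at the rational surface r_s evolving on the resistive
      % time scale tau_R: classical tearing term, destabilizing bootstrap term
      % with small-island cutoff w_d, and a stabilizing ECCD term whose form
      % depends on how the driven current overlaps the island geometry.
      \frac{\tau_R}{r_s}\,\frac{\mathrm{d}w}{\mathrm{d}t}
        \;=\; r_s\,\Delta'(w) \;+\; \Delta'_{\mathrm{bs}}(w) \;-\; \Delta'_{\mathrm{CD}}(w),
      \qquad
      \Delta'_{\mathrm{bs}}(w) \;\propto\; \beta_p\,\frac{w}{w^{2}+w_d^{2}}

    The island-geometry study above enters through the stabilizing term, whose magnitude depends on the alignment and width of the ECCD deposition relative to the island, which is exactly what changes when the island topology is included in the wave propagation and deposition calculation.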

  18. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  19. A family of chaotic pure analog coding schemes based on baker's map function

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Jing; Lu, Xuanxuan; Yuen, Chau; Wu, Jun

    2015-12-01

    This paper considers a family of pure analog coding schemes constructed from dynamic systems which are governed by chaotic functions—baker's map function and its variants. Various decoding methods, including maximum likelihood (ML), minimum mean square error (MMSE), and mixed ML-MMSE decoding algorithms, have been developed for these novel encoding schemes. The proposed mirrored baker's and single-input baker's analog codes perform a balanced protection against the fold error (large distortion) and weak distortion and outperform the classical chaotic analog coding and analog joint source-channel coding schemes in literature. Compared to the conventional digital communication system, where quantization and digital error correction codes are used, the proposed analog coding system has graceful performance evolution, low decoding latency, and no quantization noise. Numerical results show that under the same bandwidth expansion, the proposed analog system outperforms the digital ones over a wide signal-to-noise (SNR) range.
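    For reference, the underlying two-dimensional baker's map can be written in a few lines; the sketch below shows only the chaotic map itself, as an illustration, and not the mirrored or single-input encoder constructions or the ML/MMSE decoders studied in the paper.

      def bakers_map(x, y):
          # One iteration of the standard baker's map on the unit square:
          # stretch by two along x, compress by two along y, and stack.
          if x < 0.5:
              return 2.0 * x, 0.5 * y
          return 2.0 * x - 1.0, 0.5 * y + 0.5

      def iterate(x, y, n):
          # Repeated iteration mixes the initial point chaotically.
          for _ in range(n):
              x, y = bakers_map(x, y)
          return x, y

      print(iterate(0.3141, 0.2718, 5))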

  20. Error Control Coding Techniques for Space and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    2000-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomom outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.

  1. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at the National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users to choose either the comparator method, the k0 method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple Bare Monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal to epithermal neutron flux ratio (f). The quantities involved in determining the concentration were the net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal to epithermal neutron flux ratio (f), the epithermal neutron flux distribution parameter (α) and the detection efficiency (ɛp). For the Com-INAA code-system, certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRM and SRM were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing the sample concentrations between the code-system and the experiment. The concentration values from the ECC-UKM database code-system agreed with the experimental values with good accuracy.
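    As background for the quantities listed above, the sketch below implements only the textbook relative (comparator) calculation with made-up placeholder numbers; it is an illustrative assumption, not the ECC-UKM code-system, which also covers the k0 and absolute methods.

      import math

      def specific_count_rate(Np, w, t_irr, t_decay, t_count, half_life):
          # Decay-corrected count rate per unit sample mass for one gamma line.
          lam = math.log(2) / half_life
          S = 1.0 - math.exp(-lam * t_irr)                          # saturation factor
          D = math.exp(-lam * t_decay)                              # decay before counting
          C = (1.0 - math.exp(-lam * t_count)) / (lam * t_count)    # decay during counting
          return Np / (w * t_count * S * D * C)

      def comparator_concentration(sample, standard, c_standard):
          # Concentration in the sample relative to a co-irradiated standard.
          return c_standard * specific_count_rate(**sample) / specific_count_rate(**standard)

      # Placeholder example: identical timing for sample and standard, arbitrary peak areas
      timing = dict(t_irr=3600.0, t_decay=86400.0, t_count=1800.0, half_life=8.02 * 86400)
      sample = dict(Np=15400.0, w=0.100, **timing)
      standard = dict(Np=52300.0, w=0.050, **timing)
      print(comparator_concentration(sample, standard, c_standard=12.5), "mg/kg")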

  2. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

    A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4288, 4020) code with a high code rate of 0.937 is constructed by this construction scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4288, 4020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code at a bit error rate (BER) of 10⁻⁶. The irregular QC-LDPC(4288, 4020) code also has lower encoding/decoding complexity than the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code. The proposed QC-LDPC(4288, 4020) code is therefore better suited to the increasing development requirements of high-speed optical transmission systems.

  3. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
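    For reference, the length-N Walsh-Hadamard spreading codes referred to above can be generated by the Sylvester recursion; the sketch below shows only this standard construction (the ZCZ-seeded variant is not reproduced here).

      import numpy as np

      def walsh_hadamard(n):
          # n x n Hadamard matrix for n a power of two; each row is one WH code.
          H = np.array([[1]])
          while H.shape[0] < n:
              H = np.block([[H, H], [H, -H]])
          return H

      H8 = walsh_hadamard(8)
      # Rows are mutually orthogonal: the Gram matrix is 8 times the identity.
      print(H8 @ H8.T)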

  4. Understanding the detector behavior through Montecarlo and calibration studies in view of the SOX measurement

    NASA Astrophysics Data System (ADS)

    Caminata, A.; Agostini, M.; Altenmüller, K.; Appel, S.; Bellini, G.; Benziger, J.; Berton, N.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Empl, A.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, T.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jedrzejczak, K.; Kaiser, M.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiere, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.

    2016-02-01

    Borexino is an unsegmented neutrino detector operating at LNGS in central Italy. The experiment has shown its performance through its unprecedented accomplishments in solar and geoneutrino detection. This performance makes it an ideal tool to accomplish a state-of-the-art experiment able to test the existence of sterile neutrinos (SOX experiment). For both the solar and the SOX analysis, a good understanding of the detector response is fundamental. Consequently, calibration campaigns with radioactive sources have been performed over the years. The calibration data are of extreme importance to develop an accurate Monte Carlo code. This code is used in all the neutrino analyses. The Borexino-SOX calibration techniques and program, and the advances in the detector simulation code in view of the start of the SOX data taking, are presented.

  5. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  6. Reprogramming neurodegeneration in the big data era.

    PubMed

    Zhou, Lujia; Verstreken, Patrik

    2018-02-01

    Recent genome-wide association studies (GWAS) have identified numerous genetic risk variants for late-onset Alzheimer's disease (AD) and Parkinson's disease (PD). However, deciphering the functional consequences of GWAS data is challenging due to a lack of reliable model systems to study the genetic variants that are often of low penetrance and non-coding identities. Pluripotent stem cell (PSC) technologies offer unprecedented opportunities for molecular phenotyping of GWAS variants in human neurons and microglia. Moreover, rapid technological advances in whole-genome RNA-sequencing and epigenome mapping fuel comprehensive and unbiased investigations of molecular alterations in PSC-derived disease models. Here, we review and discuss how integrated studies that utilize PSC technologies and genome-wide approaches may bring new mechanistic insight into the pathogenesis of AD and PD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Field Programmable Gate Array Failure Rate Estimation Guidelines for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    The complex electronic and avionics systems of today's launch vehicles heavily utilize Field Programmable Gate Array (FPGA) integrated circuits (ICs) for their superb speed and reconfiguration capabilities. Consequently, FPGAs are prevalent ICs in communication protocols such as MIL-STD-1553B and in control signal commands such as in solenoid valve actuations. This paper will identify reliability concerns and high level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  8. Quantitative Kα line spectroscopy for energy transport in ultra-intense laser plasma interaction

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Nishimura, H.; Namimoto, T.; Fujioka, S.; Arikawa, Y.; Nakai, M.; Koga, M.; Shiraga, H.; Kojima, S.; Azechi, H.; Ozaki, T.; Chen, H.; Pakr, J.; Williams, G. J.; Nishikino, M.; Kawachi, T.; Sagisaka, A.; Orimo, S.; Ogura, K.; Pirozhkov, A.; Yogo, A.; Kiriyama, H.; Kondo, K.; Okano, Y.

    2012-10-01

    X-ray line spectra ranging from 17 to 77 keV were quantitatively measured with a Laue spectrometer, composed of a cylindrically curved crystal and a detector. The absolute sensitivity of the spectrometer system was calibrated using pre-characterized laser-produced x-ray sources and radioisotopes, for the detectors and crystal respectively. The integrated reflectivity for the crystal is in good agreement with predictions by an open code for x-ray diffraction. The energy transfer efficiency from incident laser beams to hot electrons, as the energy transfer agency for Au Kα x-ray line emissions, is derived as a consequence of this work. By considering the hot electron temperature, the transfer efficiency from LFEX laser to Au plate target is about 8% to 10%.

  9. Three-Dimensional Mechanical Model of the Human Spine and the Versatility of its Use

    NASA Astrophysics Data System (ADS)

    Sokol, Milan; Velísková, Petra; Rehák, Ľuboš; Žabka, Martin

    2014-03-01

    The aim of the work is oriented towards the simulation or modeling of the lumbar and thoracic human spine as a load-bearing 3D system in a computer program (ANSYS). The human spine model includes a determination of the geometry based on X-ray pictures of frontal and lateral projections. For this reason, another computer code, BMPCOORDINATES, was developed as an aid to obtain the most precise and realistic model of the spine. Various positions, deformations, scoliosis, rotation and torsion can be modelled. Once the geometry is done, external loading on different spinal segments is entered; consequently, the response could be analysed. This can contribute a lot to medical practice as a tool for diagnoses, and developing implants or other artificial instruments for fixing the spine.

  10. 48 CFR 19.303 - Determining North American Industry Classification System (NAICS) codes and size standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Determining North American Industry Classification System (NAICS) codes and size standards. 19.303 Section 19.303 Federal Acquisition... Classification System (NAICS) codes and size standards. (a) The contracting officer shall determine the...

  11. 75 FR 51465 - Medicare Program; Announcement of Five New Members to the Advisory Panel on Ambulatory Payment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    ... Panel. This expertise encompasses hospital payment systems; hospital medical-care delivery systems; provider billing systems; APC groups, Current Procedural Terminology codes, and alpha-numeric Healthcare Common Procedure Coding System codes; and the use of, and payment for, drugs and medical devices in the...

  12. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be dynamically chosen, block to block, to optimize throughput.
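    A hedged sketch of the block-by-block adaptation idea follows; the mode list and SNR thresholds are invented placeholders for illustration, not the CCSDS-recommended operating points or the design table of the paper.

      # (name, information bits per symbol, minimum Es/N0 in dB) -- placeholders only
      MODES = [
          ("rate-1/2 turbo + BPSK",   0.50,  0.0),
          ("rate-2/3 LDPC + QPSK",    1.33,  4.0),
          ("rate-4/5 LDPC + 8-PSK",   2.40,  9.0),
          ("rate-7/8 LDPC + 16-APSK", 3.50, 13.0),
      ]

      def pick_mode(snr_db):
          # Choose the most efficient mode whose threshold the estimated SNR meets.
          feasible = [m for m in MODES if snr_db >= m[2]]
          return max(feasible, key=lambda m: m[1]) if feasible else MODES[0]

      print(pick_mode(10.2))   # -> ('rate-4/5 LDPC + 8-PSK', 2.4, 9.0)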

  13. The Maximal C3 Self-Complementary Trinucleotide Circular Code X in Genes of Bacteria, Archaea, Eukaryotes, Plasmids and Viruses

    PubMed Central

    Michel, Christian J.

    2017-01-01

    In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition at the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated to each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes. PMID:28420220
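    The self-complementarity property invoked above has a simple operational statement: a trinucleotide set is self-complementary when it contains the reverse complement of every one of its elements. The sketch below checks that property on a small illustrative placeholder set, not on the 20-trinucleotide code X itself.

      COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

      def reverse_complement(codon):
          return "".join(COMPLEMENT[b] for b in reversed(codon))

      def is_self_complementary(code):
          return all(reverse_complement(c) in code for c in code)

      toy_set = {"AAC", "GTT", "GAT", "ATC"}   # placeholder example only
      print(is_self_complementary(toy_set))    # True: AAC<->GTT and GAT<->ATC pair up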

  14. Deforestation and Carbon Loss in Southwest Amazonia: Impact of Brazil's Revised Forest Code

    NASA Astrophysics Data System (ADS)

    Roriz, Pedro Augusto Costa; Yanai, Aurora Miho; Fearnside, Philip Martin

    2017-09-01

    In 2012 Brazil's National Congress altered the country's Forest Code, decreasing various environmental protections in the set of regulations governing forests. This suggests consequences in increased deforestation and emissions of greenhouse gases and in decreased protection of fragile ecosystems. To ascertain the effects, a simulation was run to the year 2025 for the municipality (county) of Boca do Acre, Amazonas state, Brazil. A baseline scenario considered historical behavior (which did not respect the Forest Code), while two scenarios considered full compliance with the old Forest Code (Law 4771/1965) and the current Code (Law 12,651/2012) regarding the protection of "areas of permanent preservation" (APPs) along the edges of watercourses. The models were parameterized from satellite imagery and simulated using Dinamica-EGO software. Deforestation actors and processes in the municipality were observed in loco in 2012. Carbon emissions and loss of forest by 2025 were computed in the three simulation scenarios. There was a 10% difference in the loss of carbon stock and of forest between the scenarios with the two versions of the Forest Code. The baseline scenario showed the highest loss of carbon stocks and the highest increase in annual emissions. The greatest damage was caused by not protecting wetlands and riparian zones.

  15. Validation of a multi-layer Green's function code for ion beam transport

    NASA Astrophysics Data System (ADS)

    Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence

    To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.

  16. Deforestation and Carbon Loss in Southwest Amazonia: Impact of Brazil's Revised Forest Code.

    PubMed

    Roriz, Pedro Augusto Costa; Yanai, Aurora Miho; Fearnside, Philip Martin

    2017-09-01

    In 2012 Brazil's National Congress altered the country's Forest Code, decreasing various environmental protections in the set of regulations governing forests. This suggests consequences in increased deforestation and emissions of greenhouse gases and in decreased protection of fragile ecosystems. To ascertain the effects, a simulation was run to the year 2025 for the municipality (county) of Boca do Acre, Amazonas state, Brazil. A baseline scenario considered historical behavior (which did not respect the Forest Code), while two scenarios considered full compliance with the old Forest Code (Law 4771/1965) and the current Code (Law 12,651/2012) regarding the protection of "areas of permanent preservation" (APPs) along the edges of watercourses. The models were parameterized from satellite imagery and simulated using Dinamica-EGO software. Deforestation actors and processes in the municipality were observed in loco in 2012. Carbon emissions and loss of forest by 2025 were computed in the three simulation scenarios. There was a 10% difference in the loss of carbon stock and of forest between the scenarios with the two versions of the Forest Code. The baseline scenario showed the highest loss of carbon stocks and the highest increase in annual emissions. The greatest damage was caused by not protecting wetlands and riparian zones.

  17. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, Charles W.; Bartel, Timothy James

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The Initial Atmospheric Transport (IAT) code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  18. Changes in mitochondrial genetic codes as phylogenetic characters: Two examples from the flatworms

    PubMed Central

    Telford, Maximilian J.; Herniou, Elisabeth A.; Russell, Robert B.; Littlewood, D. Timothy J.

    2000-01-01

    Shared molecular genetic characteristics other than DNA and protein sequences can provide excellent sources of phylogenetic information, particularly if they are complex and rare and are consequently unlikely to have arisen by chance convergence. We have used two such characters, arising from changes in mitochondrial genetic code, to define a clade within the Platyhelminthes (flatworms), the Rhabditophora. We have sampled 10 distinct classes within the Rhabditophora and find that all have the codon AAA coding for the amino acid Asn rather than the usual Lys and AUA for Ile rather than the usual Met. We find no evidence to support claims that the codon UAA codes for Tyr in the Platyhelminthes rather than the standard stop codon. The Rhabditophora are a very diverse group comprising the majority of the free-living turbellarian taxa and the parasitic Neodermata. In contrast, three other classes of turbellarian flatworm, the Acoela, Nemertodermatida, and Catenulida, have the standard invertebrate assignments for these codons and so are convincingly excluded from the rhabditophoran clade. We have developed a rapid computerized method for analyzing genetic codes and demonstrate the wide phylogenetic distribution of the standard invertebrate code as well as confirming already known metazoan deviations from it (ascidian, vertebrate, echinoderm/hemichordate). PMID:11027335
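    Restated as a lookup-table difference, the two code changes described above look as follows; the codon assignments are taken directly from the abstract, while the comparison function is merely illustrative.

      # Usual invertebrate mitochondrial assignments versus the rhabditophoran ones
      USUAL_INVERTEBRATE_MITO = {"AAA": "Lys", "AUA": "Met"}
      RHABDITOPHORAN_MITO     = {"AAA": "Asn", "AUA": "Ile"}

      def reassigned_codons(code_a, code_b):
          # Codons translated differently by the two codes: the rare, complex
          # shared character used as phylogenetic evidence above.
          return {c: (code_a[c], code_b[c]) for c in code_a if code_a[c] != code_b[c]}

      print(reassigned_codons(USUAL_INVERTEBRATE_MITO, RHABDITOPHORAN_MITO))
      # {'AAA': ('Lys', 'Asn'), 'AUA': ('Met', 'Ile')}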

  19. Rocketdyne/Westinghouse nuclear thermal rocket engine modeling

    NASA Technical Reports Server (NTRS)

    Glass, James F.

    1993-01-01

    The topics are presented in viewgraph form and include the following: systems approach needed for nuclear thermal rocket (NTR) design optimization; generic NTR engine power balance codes; rocketdyne nuclear thermal system code; software capabilities; steady state model; NTR engine optimizer code-logic; reactor power calculation logic; sample multi-component configuration; NTR design code output; generic NTR code at Rocketdyne; Rocketdyne NTR model; and nuclear thermal rocket modeling directions.

  20. Properties of a certain stochastic dynamical system, channel polarization, and polar codes

    NASA Astrophysics Data System (ADS)

    Tanaka, Toshiyuki

    2010-06-01

    A new family of codes, called polar codes, has recently been proposed by Arikan. Polar codes are of theoretical importance because they are provably capacity achieving with low-complexity encoding and decoding. We first discuss basic properties of a certain stochastic dynamical system, on the basis of which properties of channel polarization and polar codes are reviewed, with emphasis on our recent results.
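    For the binary erasure channel the polarization recursion can be followed exactly, which makes it a convenient illustration of the stochastic dynamical system mentioned above: a BEC with erasure probability z splits into a degraded channel with erasure probability 2z - z^2 and an upgraded one with z^2. A minimal sketch:

      def polarize_bec(z0, levels):
          # Erasure probabilities of the 2**levels synthesized channels.
          zs = [z0]
          for _ in range(levels):
              nxt = []
              for z in zs:
                  nxt.extend((2*z - z*z, z*z))   # "minus" and "plus" channel transforms
              zs = nxt
          return zs

      zs = polarize_bec(0.5, 10)
      good = sum(z < 1e-3 for z in zs) / len(zs)
      # As the recursion deepens, channels polarize and the fraction of
      # near-perfect channels tends to the capacity 1 - z0 = 0.5.
      print(f"{good:.3f} of {len(zs)} channels have erasure probability < 1e-3")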

  1. Ultra-narrow bandwidth voice coding

    DOEpatents

    Holzrichter, John F [Berkeley, CA; Ng, Lawrence C [Danville, CA

    2007-01-09

    A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.

  2. Parallelization Issues and Particle-In-Cell Codes.

    NASA Astrophysics Data System (ADS)

    Elster, Anne Cathrine

    1994-01-01

    "Everything should be made as simple as possible, but not simpler." Albert Einstein. The field of parallel scientific computing has concentrated on parallelization of individual modules such as matrix solvers and factorizers. However, many applications involve several interacting modules. Our analyses of a particle-in-cell code modeling charged particles in an electric field, show that these accompanying dependencies affect data partitioning and lead to new parallelization strategies concerning processor, memory and cache utilization. Our test-bed, a KSR1, is a distributed memory machine with a globally shared addressing space. However, most of the new methods presented hold generally for hierarchical and/or distributed memory systems. We introduce a novel approach that uses dual pointers on the local particle arrays to keep the particle locations automatically partially sorted. Complexity and performance analyses with accompanying KSR benchmarks, have been included for both this scheme and for the traditional replicated grids approach. The latter approach maintains load-balance with respect to particles. However, our results demonstrate it fails to scale properly for problems with large grids (say, greater than 128-by-128) running on as few as 15 KSR nodes, since the extra storage and computation time associated with adding the grid copies, becomes significant. Our grid partitioning scheme, although harder to implement, does not need to replicate the whole grid. Consequently, it scales well for large problems on highly parallel systems. It may, however, require load balancing schemes for non-uniform particle distributions. Our dual pointer approach may facilitate this through dynamically partitioned grids. We also introduce hierarchical data structures that store neighboring grid-points within the same cache -line by reordering the grid indexing. This alignment produces a 25% savings in cache-hits for a 4-by-4 cache. A consideration of the input data's effect on the simulation may lead to further improvements. For example, in the case of mean particle drift, it is often advantageous to partition the grid primarily along the direction of the drift. The particle-in-cell codes for this study were tested using physical parameters, which lead to predictable phenomena including plasma oscillations and two-stream instabilities. An overview of the most central references related to parallel particle codes is also given.

  3. Health problems and disability in long-term sickness absence: ICF coding of medical certificates.

    PubMed

    Morgell, Roland; Backlund, Lars G; Arrelöv, Britt; Strender, Lars-Erik; Nilsson, Gunnar H

    2011-11-11

    The purpose of this study was to test the feasibility of the International Classification of Functioning, Disability and Health (ICF) and to explore the distribution, including gender differences, of health problems and disabilities as reflected in long-term sickness absence certificates. A total of 433 patients with long sick-listing periods, 267 women and 166 men, were included in the study. All certificates exceeding 28 days of sick-listing sent to the local office of the Swedish Social Insurance Administration of a municipality in the Stockholm area were collected during four weeks in 2004-2005. ICD-10 medical diagnosis codes in the certificates were retrieved and free text information on disabilities in body function, body structure or activity and participation was coded according to the ICF short version. In 89.8% of the certificates there were descriptions of disabilities that readily could be classified according to ICF. In a reliability test 123/131 (94%) items of randomly chosen free text information were identically classified by two of the authors. On average 2.4 disability categories (range 0-9) were found per patient; the most frequent were 'Sensation of pain' (35.1% of the patients), 'Emotional functions' (34.1%), 'Energy and drive functions' (22.4%), and 'Sleep functions' (16.9%). The dominating ICD-10 diagnostic groups were 'Mental and behavioural disorders' (34.4%) and 'Diseases of the musculoskeletal system and connective tissue' (32.8%). 'Reaction to severe stress and adjustment disorders' (14.7%), and 'Depressive episode' (11.5%) were the most frequent diagnostic codes. Disabilities in mental functions and activity/participation were more commonly described among women, while disabilities related to the musculoskeletal system were more frequent among men. Both ICD-10 diagnoses and ICF categories were dominated by mental and musculoskeletal health problems, but there seem to be gender differences, and ICF classification as a complement to ICD-10 could provide a better understanding of the consequences of diseases and how individual patients can cope with their health problems. ICF is feasible for secondary classifying of free text descriptions of disabilities stated in sick-leave certificates and seems to be useful as a complement to ICD-10 for sick-listing management and research.

  4. Spatial transform coding of color images.

    NASA Technical Reports Server (NTRS)

    Pratt, W. K.

    1971-01-01

    The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
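
    A hedged sketch of the block transform-coding idea applied to a single chrominance plane: an 8-by-8 orthonormal DCT is applied per block and only a handful of low-frequency coefficients are kept. The transform, block size, and coefficient selection here are illustrative and are not the specific coder analyzed in the paper.

        # Illustrative block transform coder for one chrominance plane: 8x8
        # orthonormal DCT, keeping only the 6 lowest-frequency coefficients (u+v < 3).
        import numpy as np

        N = 8
        k = np.arange(N)
        C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
        C[0, :] /= np.sqrt(2.0)                      # orthonormal DCT-II matrix

        def code_plane(plane):
            """Reconstruct a plane from the retained low-frequency DCT coefficients."""
            keep = np.add.outer(k, k) < 3            # low-frequency mask (6 of 64)
            out = np.empty_like(plane, dtype=float)
            for i in range(0, plane.shape[0], N):
                for j in range(0, plane.shape[1], N):
                    coeffs = C @ plane[i:i+N, j:j+N] @ C.T         # forward 2-D DCT
                    out[i:i+N, j:j+N] = C.T @ (coeffs * keep) @ C  # inverse DCT
            return out

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            chroma = rng.normal(size=(64, 64))        # stand-in chrominance plane
            rec = code_plane(chroma)
            print("mse:", float(np.mean((chroma - rec) ** 2)))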

  5. Bar-Code System for a Microbiological Laboratory

    NASA Technical Reports Server (NTRS)

    Law, Jennifer; Kirschner, Larry

    2007-01-01

    A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.

  6. Geographic Information Systems using CODES linked data (Crash outcome data evaluation system)

    DOT National Transportation Integrated Search

    2001-04-01

    This report presents information about geographic information systems (GIS) and CODES linked data. Section one provides an overview of a GIS and the benefits of linking to CODES. Section two outlines the basic issues relative to the types of map data...

  7. Women Faculty Distressed: Descriptions and Consequences of Academic Contrapower Harassment

    ERIC Educational Resources Information Center

    Lampman, Claudia; Crew, Earl C.; Lowery, Shea D.; Tompkins, Kelley

    2016-01-01

    Academic contrapower harassment (ACPH) occurs when someone with seemingly less power in an educational setting (e.g., a student) harasses someone more powerful (e.g., a professor). A representative sample of 289 professors from U.S. institutions of higher education described their worst incident with ACPH. Open-ended responses were coded using a…

  8. An Examination of Differences in Consequences of Punishment among PK-12 School Administrators

    ERIC Educational Resources Information Center

    Randle, Dawn DuBose

    2010-01-01

    The purpose of this study was to examine the differences in the administering of punishment procedures for violations of a school district's Code of Student Conduct among school-based administrators. Specifically, this study was concerned with the impact of the socio-demographic variables of: gender, years of administrative experience,…

  9. Whose Code Are You Teaching? A Popular Australian Coursebook Unravelled

    ERIC Educational Resources Information Center

    Ritchie, Annabelle

    2005-01-01

    The study of curriculum materials is of interest to social researchers seeking to understand the social constructions of reality. All texts embody a number of purposeful choices about how reality is to be represented, and these choices have consequences for what is "foregrounded, backgrounded, placed in the margins, distorted, short-cut,…

  10. Developmental Dyslexia and Explicit Long-Term Memory

    ERIC Educational Resources Information Center

    Menghini, Deny; Carlesimo, Giovanni Augusto; Marotta, Luigi; Finzi, Alessandra; Vicari, Stefano

    2010-01-01

    The reduced verbal long-term memory capacities often reported in dyslexics are generally interpreted as a consequence of their deficit in phonological coding. The present study was aimed at evaluating whether the learning deficit exhibited by dyslexics was restricted only to the verbal component of the long-term memory abilities or also involved…

  11. Preparing to "Not" Be a Footballer: Higher Education and Professional Sport

    ERIC Educational Resources Information Center

    Hickey, Christopher; Kelly, Peter

    2008-01-01

    In the commercialised and professionalised world of elite sport, issues associated with career pathways and post sporting career options have a particular resonance. In various football codes, an unexpected knock, twist, bend or break can profoundly impact a player's career. In this high risk and high consequence environment, a number of sports…

  12. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    PubMed

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

    The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72, P < .01). Child Facial Coding System scores were also significantly higher during the passive joint stretch than the baseline and recovery segments (P < .001). Facial activity was not significantly correlated with the developmental measures. These findings suggest that the Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy. © The Author(s) 2015.

  13. Palindromic Genes in the Linear Mitochondrial Genome of the Nonphotosynthetic Green Alga Polytomella magna

    PubMed Central

    Smith, David Roy; Hua, Jimeng; Archibald, John M.; Lee, Robert W.

    2013-01-01

    Organelle DNA is no stranger to palindromic repeats. But never has a mitochondrial or plastid genome been described in which every coding region is part of a distinct palindromic unit. While sequencing the mitochondrial DNA of the nonphotosynthetic green alga Polytomella magna, we uncovered precisely this type of genic arrangement. The P. magna mitochondrial genome is linear and made up entirely of palindromes, each containing 1–7 unique coding regions. Consequently, every gene in the genome is duplicated and in an inverted orientation relative to its partner. And when these palindromic genes are folded into putative stem-loops, their predicted translational start sites are often positioned in the apex of the loop. Gel electrophoresis results support the linear, 28-kb monomeric conformation of the P. magna mitochondrial genome. Analyses of other Polytomella taxa suggest that palindromic mitochondrial genes were present in the ancestor of the Polytomella lineage and lost or retained to various degrees in extant species. The possible origins and consequences of this bizarre genomic architecture are discussed. PMID:23940100

  14. LDPC coded OFDM over the atmospheric turbulence channel.

    PubMed

    Djordjevic, Ivan B; Vasic, Bane; Neifeld, Mark A

    2007-05-14

    Low-density parity-check (LDPC) coded optical orthogonal frequency division multiplexing (OFDM) is shown to significantly outperform LDPC coded on-off keying (OOK) over the atmospheric turbulence channel in terms of both coding gain and spectral efficiency. In the regime of strong turbulence at a bit-error rate of 10^-5, the coding gain improvement of the LDPC coded single-side band unclipped-OFDM system with 64 sub-carriers is larger than the coding gain of the LDPC coded OOK system by 20.2 dB for quadrature-phase-shift keying (QPSK) and by 23.4 dB for binary-phase-shift keying (BPSK).

  15. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.
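
    As a concrete illustration of convolutional coding fundamentals (not the specific codes studied in the report), the textbook rate-1/2 encoder with constraint length 3 and generators (7, 5) in octal can be written in a few lines:

        # Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal.
        # Two coded bits are emitted per input bit; illustrative example only.

        def conv_encode(bits, g1=0b111, g2=0b101):
            state = 0
            out = []
            for b in bits:
                state = ((state << 1) | b) & 0b111          # 3-bit shift register
                out.append(bin(state & g1).count("1") % 2)  # parity over generator 1 taps
                out.append(bin(state & g2).count("1") % 2)  # parity over generator 2 taps
            return out

        if __name__ == "__main__":
            print(conv_encode([1, 0, 1, 1, 0, 0]))          # 12 coded bits for 6 input bits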

  16. An all-digital receiver for satellite audio broadcasting signals using trellis coded quasi-orthogonal code-division multiplexing

    NASA Astrophysics Data System (ADS)

    Braun, Walter; Eglin, Peter; Abello, Ricard

    1993-02-01

    Spread Spectrum Code Division Multiplex is an attractive scheme for the transmission of multiple signals over a satellite transponder. By using orthogonal or quasi-orthogonal spreading codes, the interference between the users can be virtually eliminated. However, the acquisition and tracking of the spreading code phase cannot take advantage of the code orthogonality, since sequential acquisition and delay-locked loop tracking depend on correlation with code phases other than the optimal despreading phase. Hence, synchronization is a critical issue in such a system. Demonstration hardware for the verification of the orthogonal CDM synchronization and data transmission concept is being designed and implemented. The system concept, the synchronization scheme, and the implementation are described. The performance of the system is discussed based on computer simulations.
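
    A small sketch of why code orthogonality helps data transmission but not synchronization: Walsh-Hadamard spreading codes are exactly orthogonal when phase aligned, and a one-chip code-phase error destroys the despreading. The code length, user amplitudes, and shift are illustrative assumptions, not the demonstration hardware's parameters.

        # Orthogonal spreading sketch: Walsh-Hadamard codes of length 8.
        import numpy as np

        def hadamard(n):
            """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
            H = np.array([[1.0]])
            while H.shape[0] < n:
                H = np.block([[H, H], [H, -H]])
            return H

        if __name__ == "__main__":
            H = hadamard(8)                    # rows are orthogonal spreading codes
            tx = 0.7 * H[2] - 1.0 * H[5]       # two users sharing the transponder
            print(np.dot(tx, H[2]) / 8)        # despread user on code 2 -> 0.7
            print(np.dot(tx, H[5]) / 8)        # despread user on code 5 -> -1.0
            late = np.roll(H[5], 1)            # despreading replica one chip late
            print(np.dot(tx, late) / 8)        # off-phase: -1.0 is no longer recovered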

  17. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  18. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  19. 48 CFR 901.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...

  20. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  1. 48 CFR 901.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Publication and code arrangement. 901.105-1 Section 901.105-1 Federal Acquisition Regulations System DEPARTMENT OF ENERGY GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 901.105-1 Publication and code...

  2. 48 CFR 1.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Publication and code arrangement. 1.105-1 Section 1.105-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL FEDERAL ACQUISITION REGULATIONS SYSTEM Purpose, Authority, Issuance 1.105-1 Publication and code...

  3. How to Measure Motivational Interviewing Fidelity in Randomized Controlled Trials: Practical Recommendations.

    PubMed

    Jelsma, Judith G M; Mertens, Vera-Christina; Forsberg, Lisa; Forsberg, Lars

    2015-07-01

    Many randomized controlled trials in which motivational interviewing (MI) is a key intervention make no provision for the assessment of treatment fidelity. This methodological shortcoming makes it impossible to distinguish between high- and low-quality MI interventions, and, consequently, to know whether MI provision has contributed to any intervention effects. This article makes some practical recommendations for the collection, selection, coding and reporting of MI fidelity data, as measured using the Motivational Interviewing Treatment Integrity Code. We hope that researchers will consider these recommendations and include MI fidelity measures in future studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Probabilistic evaluation of fuselage-type composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
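
    The probabilistic idea can be sketched with a toy Monte Carlo propagation of scatter in primitive variables to a response distribution. The Euler buckling load of a thin column stands in for the composite response here; this is not the IPACS formulation, and all distributions and numbers are illustrative.

        # Toy Monte Carlo propagation of primitive-variable scatter to a response
        # distribution (a simple Euler column, not the IPACS composite model).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        E = rng.normal(70e9, 3.5e9, n)       # Young's modulus with 5% scatter [Pa]
        t = rng.normal(2.0e-3, 0.1e-3, n)    # wall thickness with 5% scatter [m]
        L, b = 0.5, 0.05                     # column length and width [m]
        I = b * t**3 / 12.0                  # second moment of area [m^4]
        P_cr = np.pi**2 * E * I / L**2       # Euler buckling load [N]

        print(f"mean {P_cr.mean():.1f} N, std {P_cr.std():.1f} N")
        print(f"1st percentile {np.percentile(P_cr, 1):.1f} N")   # a probabilistic allowable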

  5. Scaling features of noncoding DNA

    NASA Technical Reports Server (NTRS)

    Stanley, H. E.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.

    1999-01-01

    We review evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range--indeed, bases thousands of base pairs apart are correlated. We do not find such a long-range correlation in the coding regions of the gene, and utilize this fact to build a Coding Sequence Finder Algorithm, which uses statistical ideas to locate the coding regions of an unknown DNA sequence. Finally, we describe briefly some recent work adapting to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function, and reporting that noncoding regions in eukaryotes display a larger redundancy than coding regions. Specifically, we consider the possibility that this result is solely a consequence of nucleotide concentration differences as first noted by Bonhoeffer and his collaborators. We find that cytosine-guanine (CG) concentration does have a strong "background" effect on redundancy. However, we find that for the purine-pyrimidine binary mapping rule, which is not affected by the difference in CG concentration, the Shannon redundancy for the set of analyzed sequences is larger for noncoding regions compared to coding regions.
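
    A hedged sketch of the redundancy measurement described above: map a sequence to the purine/pyrimidine binary alphabet and estimate the n-block Shannon redundancy 1 - H_n/n. The block length and the random toy sequence are our choices, not the data sets analyzed in the paper.

        # Estimate the n-block Shannon redundancy 1 - H_n/n of a sequence mapped to
        # the purine/pyrimidine binary alphabet.  The toy sequence is random, so the
        # estimate should be close to zero; real noncoding DNA would sit higher.
        from collections import Counter
        from math import log2
        import random

        PU_PY = {"A": "R", "G": "R", "C": "Y", "T": "Y"}    # purine / pyrimidine mapping

        def block_redundancy(seq, n=4):
            mapped = "".join(PU_PY[b] for b in seq)
            blocks = [mapped[i:i + n] for i in range(len(mapped) - n + 1)]
            counts = Counter(blocks)
            total = sum(counts.values())
            h = -sum(c / total * log2(c / total) for c in counts.values())
            return 1.0 - h / n

        if __name__ == "__main__":
            random.seed(0)
            seq = "".join(random.choice("ACGT") for _ in range(20_000))
            print(round(block_redundancy(seq, n=4), 4))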

  6. Surviving "Payment by Results": a simple method of improving clinical coding in burn specialised services in the United Kingdom.

    PubMed

    Wallis, Katy L; Malic, Claudia C; Littlewood, Sonia L; Judkins, Keith; Phipps, Alan R

    2009-03-01

    Coding inpatient episodes plays an important role in determining the financial remuneration of a clinical service. Insufficient or incomplete data may have very significant consequences for its viability. We created a document that improves the coding process in our Burns Centre. At Yorkshire Regional Burns Centre an inpatient summary sheet was designed to prospectively record and present essential information on a daily basis, for use in the coding process. The level of care was also recorded. A 3-month audit was conducted to assess the efficacy of the new forms. Forty-nine patients were admitted to the Burns Centre with a mean age of 27.6 years and TBSA ranging from 0.5% to 65%. The total stay in the Burns Centre was 758 days, of which 22% were at level B3-B5 and 39% at level B2. The use of the new discharge document identified potential income of about £500,000 at our local daily tariffs for high dependency and intensive care. The new form is able to ensure a high quality of coding with a possible direct impact on the financial resources accrued for burn care.

  7. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS Simulator with the requirements to interface with DMS messages and data transfers relating to the BWAS operations.

  8. Coding of procedures documented by general practitioners in Swedish primary care-an explorative study using two procedure coding systems

    PubMed Central

    2012-01-01

    Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095

  9. Child Injury Deaths: Comparing Prevention Information from Two Coding Systems

    PubMed Central

    Schnitzer, Patricia G.; Ewigman, Bernard G.

    2006-01-01

    Objectives The International Classification of Diseases (ICD) external cause of injury E-codes do not sufficiently identify injury circumstances amenable to prevention. The researchers developed an alternative classification system (B-codes) that incorporates behavioral and environmental factors, for use in childhood injury research, and compare the two coding systems in this paper. Methods All fatal injuries among children less than age five that occurred between January 1, 1992, and December 31, 1994, were classified using both B-codes and E-codes. Results E-codes identified the most common causes of injury death: homicide (24%), fires (21%), motor vehicle incidents (21%), drowning (10%), and suffocation (9%). The B-codes further revealed that homicides (51%) resulted from the child being shaken or struck by another person; many fire deaths (42%) resulted from children playing with matches or lighters; drownings (46%) usually occurred in natural bodies of water; and most suffocation deaths (68%) occurred in unsafe sleeping arrangements. Conclusions B-codes identify additional information with specific relevance for prevention of childhood injuries.

  10. The role of inheritance in structuring hyperextended rift systems

    NASA Astrophysics Data System (ADS)

    Manatschal, Gianreto; Lavier, Luc; Chenin, Pauline

    2015-04-01

    A long-standing question in Earth Sciences is related to the importance of inheritance in controlling tectonic processes. In contrast to physical processes that are generally applicable, assessing the role of inheritance suffers from two major problems: firstly, it is difficult to appraise without having insights into the history of a geological system; and secondly, not all inherited features are reactivated during subsequent deformation phases. Therefore, the aim of our presentation is to give some conceptual framework about how inheritance may control the architecture and evolution of hyperextended rift systems. We use the term inheritance to refer to the difference between an "ideal" layer-cake type lithosphere and a "real" lithosphere containing heterogeneities, and we define 3 types of inheritance, namely structural, compositional and thermal inheritance. Moreover, we assume that the evolution of hyperextended rift systems reflects the interplay between their inheritance (innate/"genetic code") and the physical processes at play (acquired/external factors). Thus, by observing the architecture and evolution of hyperextended rift systems and integrating the physical processes, one may get hints on what may have been the original inheritance of a system. Using this approach, we focus on 3 well-studied rift systems: the Alpine Tethys, Pyrenean-Bay of Biscay and Iberia-Newfoundland rift systems. For the studied examples we can show that: 1) strain localization on a local scale and during early stages of rifting is controlled by inherited structures and weaknesses; 2) the architecture of the necking zone seems to be influenced by the distribution and importance of ductile layers during decoupled deformation and is consequently controlled by the thermal structure and/or the inherited composition of the crust; 3) the location of breakup in the 3 examples is not significantly controlled by the inherited structures; 4) inherited mantle composition and rift-related mantle processes may control the rheology of the mantle, the magmatic budget, the thermal structure and the localization of final rifting. Conversely, the deformation in hyperextended domains is strongly controlled by weak hydrated minerals (e.g. clay, serpentinite) that result from the breakdown of feldspar and olivine due to fluid and reaction assisted deformation and is consequently not inherited but the result of rift-induced processes. These key observations show that both inheritance and rift-induced processes play a significant role in the development of magma-poor rift systems and that the role of inheritance may change as the physical conditions vary during the evolving rifting and as rift-induced processes (serpentinization; magma) become more important. Thus, it is not only important to determine the "genetic code" of a rift system, but also to understand how it interacts and evolves during rifting. Understanding how far these new ideas and concepts derived from the southern North Atlantic and Alpine Tethys can be translated to other less explored hyperextended rift systems will be one of the challenges of future research in rifted margins.

  11. The Social Interactive Coding System (SICS): An On-Line, Clinically Relevant Descriptive Tool.

    ERIC Educational Resources Information Center

    Rice, Mabel L.; And Others

    1990-01-01

    The Social Interactive Coding System (SICS) assesses the continuous verbal interactions of preschool children as a function of play areas, addressees, script codes, and play levels. This paper describes the 26 subjects and the setting involved in SICS development, coding definitions and procedures, training procedures, reliability, sample…

  12. The design of wavefront coded imaging system

    NASA Astrophysics Data System (ADS)

    Lan, Shun; Cen, Zhaofeng; Li, Xiaotong

    2016-10-01

    Wavefront coding is a new method to extend the depth of field, which combines optical design and signal processing. Using the optical design software ZEMAX, we designed a practical wavefront coded imaging system based on a conventional Cooke triplet system. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. Therefore, a series of similarly blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront coded imaging system is independent of focus: it is nearly constant with misfocus and has no regions of zeros. All object information can therefore be completely recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as optimization goals. Compared to the conventional optical system, the wavefront coded imaging system obtains better quality images over a range of object distances. Some deficiencies appear in the restored images due to the influence of the digital filtering algorithm; these are also analyzed in this paper. The depth of field of the designed wavefront coded imaging system is about 28 times larger than that of the initial optical system, while keeping higher optical power and resolution at the image plane.
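
    The defocus insensitivity can be illustrated numerically with a 1-D pupil and a cubic phase mask (a sketch under assumed parameters, not the ZEMAX design of the paper): the conventional mid-frequency MTF collapses with defocus while the coded MTF stays nearly constant.

        # 1-D sketch: |OTF| of a pupil with cubic phase alpha*x**3 and defocus psi*x**2.
        # The sampling, alpha and psi values are illustrative assumptions.
        import numpy as np

        N = 1024
        x = np.linspace(-1.0, 1.0, N)          # normalized pupil coordinate

        def mtf(alpha, psi):
            pupil = np.exp(1j * (alpha * x**3 + psi * x**2))
            psf = np.abs(np.fft.fft(pupil, 4 * N)) ** 2
            otf = np.abs(np.fft.fft(psf))
            return otf / otf[0]

        for psi in (0.0, 10.0):                # in focus vs. roughly 1.6 waves of defocus
            plain = mtf(alpha=0.0, psi=psi)[205]    # a mid spatial frequency
            coded = mtf(alpha=60.0, psi=psi)[205]
            print(f"psi={psi:4.1f}  plain MTF={plain:.3f}  cubic-coded MTF={coded:.3f}")
        # The plain mid-frequency MTF collapses with defocus; the coded MTF is lower
        # but nearly unchanged, so one fixed digital filter restores the image.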

  13. Enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding for four-level holographic data storage systems

    NASA Astrophysics Data System (ADS)

    Kong, Gyuyeol; Choi, Sooyong

    2017-09-01

    An enhanced 2/3 four-ary modulation code using soft-decision Viterbi decoding is proposed for four-level holographic data storage systems. While the previous four-ary modulation codes focus on preventing maximum two-dimensional intersymbol interference patterns, the proposed four-ary modulation code aims at maximizing the coding gains for better bit error rate performance. To achieve significant coding gains from the four-ary modulation codes, we design a new 2/3 four-ary modulation code that enlarges the free distance on the trellis, found through extensive simulation. The free distance of the proposed four-ary modulation code is extended from 1.21 to 2.04 compared with that of the conventional four-ary modulation code. The simulation result shows that the proposed four-ary modulation code achieves a gain of more than 1 dB over the conventional four-ary modulation code.

  14. Automated encoding of clinical documents based on natural language processing.

    PubMed

    Friedman, Carol; Shagina, Lyudmila; Lussier, Yves; Hripcsak, George

    2004-01-01

    The aim of this study was to develop a method based on natural language processing (NLP) that automatically maps an entire clinical document to codes with modifiers and to quantitatively evaluate the method. An existing NLP system, MedLEE, was adapted to automatically generate codes. The method involves matching of structured output generated by MedLEE consisting of findings and modifiers to obtain the most specific code. Recall and precision applied to Unified Medical Language System (UMLS) coding were evaluated in two separate studies. Recall was measured using a test set of 150 randomly selected sentences, which were processed using MedLEE. Results were compared with a reference standard determined manually by seven experts. Precision was measured using a second test set of 150 randomly selected sentences from which UMLS codes were automatically generated by the method and then validated by experts. Recall of the system for UMLS coding of all terms was .77 (95% CI.72-.81), and for coding terms that had corresponding UMLS codes recall was .83 (.79-.87). Recall of the system for extracting all terms was .84 (.81-.88). Recall of the experts ranged from .69 to .91 for extracting terms. The precision of the system was .89 (.87-.91), and precision of the experts ranged from .61 to .91. Extraction of relevant clinical information and UMLS coding were accomplished using a method based on NLP. The method appeared to be comparable to or better than six experts. The advantage of the method is that it maps text to codes along with other related information, rendering the coded output suitable for effective retrieval.

  15. A novel construction method of QC-LDPC codes based on the subgroup of the finite field multiplicative group for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Zhou, Guang-xiang; Gao, Wen-chun; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-01-01

    According to the requirements of the increasing development of optical transmission systems, a novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the subgroup of the finite field multiplicative group is proposed. Furthermore, this construction method can effectively avoid the girth-4 phenomena and has advantages such as simpler construction, easier implementation, lower encoding/decoding complexity, better girth properties and more flexible adjustment of the code length and code rate. The simulation results show that the error correction performance of the QC-LDPC(3 780,3 540) code with the code rate of 93.7% constructed by this proposed method is excellent; its net coding gain is 0.3 dB, 0.55 dB, 1.4 dB and 1.98 dB higher, respectively, than those of the QC-LDPC(5 334,4 962) code constructed by the method based on the inverse element characteristics in the finite field multiplicative group, the SCG-LDPC(3 969,3 720) code constructed by the systematically constructed Gallager (SCG) random construction method, the LDPC(32 640,30 592) code in ITU-T G.975.1 and the classic RS(255,239) code which is widely used in optical transmission systems in ITU-T G.975, at a bit error rate (BER) of 10^-7. Therefore, the constructed QC-LDPC(3 780,3 540) code is more suitable for optical transmission systems.
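
    The quasi-cyclic structure itself can be sketched as follows: each entry of an exponent matrix is expanded into a p-by-p circulant permutation matrix (a cyclic shift of the identity), with -1 marking an all-zero block. The small exponent matrix below is an arbitrary illustration, not the subgroup-based construction proposed in the paper.

        # Expand an exponent matrix into a binary QC-LDPC parity-check matrix:
        # entry s -> p x p identity cyclically shifted by s, entry -1 -> zero block.
        import numpy as np

        def expand(exponents, p):
            I = np.eye(p, dtype=int)
            blocks = [[np.zeros((p, p), dtype=int) if s < 0 else np.roll(I, s, axis=1)
                       for s in row] for row in exponents]
            return np.block(blocks)

        if __name__ == "__main__":
            E = [[0, 1, 3, -1],
                 [2, -1, 0, 5],
                 [-1, 4, 1, 0]]
            H = expand(E, p=7)                          # 21 x 28 parity-check matrix
            print(H.shape, "column weights:", np.unique(H.sum(axis=0)))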

  16. The EPQ Code System for Simulating the Thermal Response of Plasma-Facing Components to High-Energy Electron Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Robert Cameron; Steiner, Don

    2004-06-15

    The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Council of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the Quickest third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system is validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code with analytical solutions shows that the code with the Quickest method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN versus experiments showed that QTTN's erosion tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are verified and validated as able to calculate the temperature distribution, phase change, and surface erosion successfully.

  17. 48 CFR 19.303 - Determining North American Industry Classification System codes and size standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Industry Classification System codes and size standards. 19.303 Section 19.303 Federal Acquisition... of Small Business Status for Small Business Programs 19.303 Determining North American Industry... North American Industry Classification System (NAICS) code and related small business size standard and...

  18. 48 CFR 1601.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1601.104-1 Section 1601.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... SYSTEM Purpose, Authority, Issuance 1601.104-1 Publication and code arrangement. (a) The FEHBAR and its...

  19. 48 CFR 2101.104-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 2101.104-1 Section 2101.104-1 Federal Acquisition Regulations System OFFICE OF PERSONNEL... REGULATIONS SYSTEM Purpose, Authority, Issuance 2101.104-1 Publication and code arrangement. (a) The LIFAR and...

  20. Decomposition of the optical transfer function: wavefront coding imaging systems

    NASA Astrophysics Data System (ADS)

    Muyo, Gonzalo; Harvey, Andy R.

    2005-10-01

    We describe the mapping of the optical transfer function (OTF) of an incoherent imaging system into a geometrical representation. We show that for defocused traditional and wavefront-coded systems the OTF can be represented as a generalized Cornu spiral. This representation provides a physical insight into the way in which wavefront coding can increase the depth of field of an imaging system and permits analytical quantification of salient OTF parameters, such as the depth of focus, the location of nulls, and amplitude and phase modulation of the wavefront-coding OTF.

  1. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation well combines cooperation gain and channel coding gain, and outperforms the coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.

  2. VeryVote: A Voter Verifiable Code Voting System

    NASA Astrophysics Data System (ADS)

    Joaquim, Rui; Ribeiro, Carlos; Ferreira, Paulo

    Code voting is a technique used to address the secure platform problem of remote voting. A code voting system consists of secretly sending, e.g. by mail, code sheets to voters that map their choices to entry codes in their ballot. While voting, the voter uses the code sheet to know what code to enter in order to vote for a particular candidate. In effect, the voter does the vote encryption and, since no malicious software on the PC has access to the code sheet, it is not able to change the voter’s intention. However, without compromising the voter’s privacy, the vote codes are not enough to prove that the vote is recorded and counted as cast by the election server.

  3. Audit of Clinical Coding of Major Head and Neck Operations

    PubMed Central

    Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean

    2009-01-01

    INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period was assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to a £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration.

  4. Recent improvements of reactor physics codes in MHI

    NASA Astrophysics Data System (ADS)

    Kosaka, Shinya; Yamaji, Kazuya; Kirimura, Kazuki; Kamiyama, Yohei; Matsumoto, Hideki

    2015-12-01

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO's Fukushima Daiichi accident, it is required to consider design extended conditions which have not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made some improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representative model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident status, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system has very wide calculation applicability with good accuracy for any core conditions as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  5. Recent improvements of reactor physics codes in MHI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kosaka, Shinya, E-mail: shinya-kosaka@mhi.co.jp; Yamaji, Kazuya; Kirimura, Kazuki

    2015-12-31

    This paper introduces recent improvements to reactor physics codes at Mitsubishi Heavy Industries, Ltd. (MHI). MHI has developed a new neutronics design code system, Galaxy/Cosmo-S (GCS), for PWR core analysis. After TEPCO’s Fukushima Daiichi accident, it is required to consider design extended conditions which have not been covered explicitly by the former safety licensing analyses. Under these circumstances, MHI made some improvements to the GCS code system. A new resonance calculation model for the lattice physics code and a homogeneous cross section representative model for the core simulator have been developed to cover a wider range of core conditions corresponding to severe accident status, such as anticipated transient without scram (ATWS) analysis and criticality evaluation of a dried-up spent fuel pit. As a result of these improvements, the GCS code system has very wide calculation applicability with good accuracy for any core conditions as long as the fuel is not damaged. In this paper, the outline of the GCS code system is described briefly and recent relevant development activities are presented.

  6. Numerical predictions of EML (electromagnetic launcher) system performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnurr, N.M.; Kerrisk, J.F.; Davidson, R.F.

    1987-01-01

    The performance of an electromagnetic launcher (EML) depends on a large number of parameters, including the characteristics of the power supply, rail geometry, rail and insulator material properties, injection velocity, and projectile mass. EML system performance is frequently limited by structural or thermal effects in the launcher (railgun). A series of computer codes has been developed at the Los Alamos National Laboratory to predict EML system performance and to determine the structural and thermal constraints on barrel design. These codes include FLD, a two-dimensional electrostatic code used to calculate the high-frequency inductance gradient and surface current density distribution for the rails; TOPAZRG, a two-dimensional finite-element code that simultaneously analyzes thermal and electromagnetic diffusion in the rails; and LARGE, a code that predicts the performance of the entire EML system. The NIKE2D code, developed at the Lawrence Livermore National Laboratory, is used to perform structural analyses of the rails. These codes have been instrumental in the design of the Lethality Test System (LTS) at Los Alamos, which has an ultimate goal of accelerating a 30-g projectile to a velocity of 15 km/s. The capabilities of the individual codes and the coupling of these codes to perform a comprehensive analysis are discussed in relation to the LTS design. Numerical predictions are compared with experimental data and presented for the LTS prototype tests.

  7. Two-dimensional imaging via a narrowband MIMO radar system with two perpendicular linear arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Su, Yi

    2010-05-01

    This paper presents a system model and method for the 2-D imaging application via a narrowband multiple-input multiple-output (MIMO) radar system with two perpendicular linear arrays. Furthermore, the imaging formulation for our method is developed through a Fourier integral processing, and the parameters of antenna array including the cross-range resolution, required size, and sampling interval are also examined. Different from the spatial sequential procedure sampling the scattered echoes during multiple snapshot illuminations in inverse synthetic aperture radar (ISAR) imaging, the proposed method utilizes a spatial parallel procedure to sample the scattered echoes during a single snapshot illumination. Consequently, the complex motion compensation in ISAR imaging can be avoided. Moreover, in our array configuration, multiple narrowband spectrum-shared waveforms coded with orthogonal polyphase sequences are employed. The mainlobes of the compressed echoes from the different filter band could be located in the same range bin, and thus, the range alignment in classical ISAR imaging is not necessary. Numerical simulations based on synthetic data are provided for testing our proposed method.

  8. Modality and morphology: what we write may not be what we say.

    PubMed

    Rapp, Brenda; Fischer-Baum, Simon; Miozzo, Michele

    2015-06-01

    Written language is an evolutionarily recent human invention; consequently, its neural substrates cannot be determined by the genetic code. How, then, does the brain incorporate skills of this type? One possibility is that written language is dependent on evolutionarily older skills, such as spoken language; another is that dedicated substrates develop with expertise. If written language does depend on spoken language, then acquired deficits of spoken and written language should necessarily co-occur. Alternatively, if at least some substrates are dedicated to written language, such deficits may doubly dissociate. We report on 5 individuals with aphasia, documenting a double dissociation in which the production of affixes (e.g., the -ing in jumping) is disrupted in writing but not speaking or vice versa. The findings reveal that written- and spoken-language systems are considerably independent from the standpoint of morpho-orthographic operations. Understanding this independence of the orthographic system in adults has implications for the education and rehabilitation of people with written-language deficits. © The Author(s) 2015.

  9. The island of time: yélî dnye, the language of rossel island.

    PubMed

    Levinson, Stephen C; Majid, Asifa

    2013-01-01

    This paper describes the linguistic description of time, the accompanying gestural system, and the "mental time lines" found in the speakers of Yélî Dnye, an isolate language spoken offshore from Papua New Guinea. Like many indigenous languages, Yélî Dnye has no fixed anchoring of time and thus no calendrical time. Instead, time in Yélî Dnye linguistic description is primarily anchored to the time of speaking, with six diurnal tenses and special nominals for n days from coding time; this is supplemented with special constructions for overlapping events. Consequently there is relatively little cross-over or metaphor from space to time. The gesture system, on the other hand, uses pointing to sun position to indicate time of day and may make use of systematic time lines. Experimental evidence fails to show a single robust axis used for mapping time to space. This suggests that there may not be a strong, universal tendency for systematic space-time mappings.

  10. Thermal-hydraulic analysis of N Reactor graphite and shield cooling system performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Low, J.O.; Schmitt, B.E.

    1988-02-01

    A series of bounding (worst-case) calculations were performed using a detailed hydrodynamic RELAP5 model of the N Reactor graphite and shield cooling system (GSCS). These calculations were specifically aimed at answering issues raised by the Westinghouse Independent Safety Review (WISR) committee. These questions address the operability of the GSCS during a worst-case degraded-core accident that requires the GSCS to mitigate the consequences of the accident. An accident scenario previously developed was designated as the hydrogen-mitigation design-basis accident (HMDBA). Previous HMDBA heat transfer analysis, using the TRUMP-BD code, was used to define the thermal boundary conditions that the GSCS may be exposed to. These TRUMP/HMDBA analysis results were used to define the bounding operating conditions of the GSCS during the course of an HMDBA transient. Nominal and degraded GSCS scenarios were investigated using RELAP5 within or at the bounds of the HMDBA transient. 10 refs., 42 figs., 10 tabs.

  11. Early Warning Signals of Ecological Transitions: Methods for Spatial Patterns

    PubMed Central

    Brock, William A.; Carpenter, Stephen R.; Ellison, Aaron M.; Livina, Valerie N.; Seekell, David A.; Scheffer, Marten; van Nes, Egbert H.; Dakos, Vasilis

    2014-01-01

    A number of ecosystems can exhibit abrupt shifts between alternative stable states. Because of their important ecological and economic consequences, recent research has focused on devising early warning signals for anticipating such abrupt ecological transitions. In particular, theoretical studies show that changes in spatial characteristics of the system could provide early warnings of approaching transitions. However, the empirical validation of these indicators lags behind their theoretical development. Here, we summarize a range of currently available spatial early warning signals, suggest potential null models to interpret their trends, and apply them to three simulated spatial data sets of systems undergoing an abrupt transition. In addition to providing a step-by-step methodology for applying these signals to spatial data sets, we propose a statistical toolbox that may be used to help detect approaching transitions in a wide range of spatial data. We hope that our methodology together with the computer codes will stimulate the application and testing of spatial early warning signals on real spatial data.
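
    Two commonly used spatial indicators, spatial variance and lag-1 spatial autocorrelation (a simplified Moran's I), can be sketched as follows on synthetic snapshots; rising trends in such quantities across successive snapshots are the kind of signal the toolbox is meant to detect. The weighting scheme and test fields are illustrative assumptions, not the paper's code.

        # Spatial variance and a simplified lag-1 Moran's I on a 2-D snapshot.
        # Synthetic fields: uncorrelated noise vs. a locally averaged (correlated) field.
        import numpy as np

        def spatial_indicators(z):
            dz = z - z.mean()
            var = dz.var()
            # average covariance over rook-neighbour pairs, ignoring wrap-around edges
            num = (dz[:-1, :] * dz[1:, :]).sum() + (dz[:, :-1] * dz[:, 1:]).sum()
            n_pairs = dz[:-1, :].size + dz[:, :-1].size
            return var, (num / n_pairs) / var

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            noise = rng.normal(size=(100, 100))
            smooth = (noise + np.roll(noise, 1, 0) + np.roll(noise, -1, 0)
                      + np.roll(noise, 1, 1) + np.roll(noise, -1, 1)) / 5.0
            for name, field in (("white noise", noise), ("smoothed", smooth)):
                var, moran = spatial_indicators(field)
                print(f"{name:12s} variance={var:.3f}  Moran's I={moran:.3f}")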

  12. STBC AF relay for unmanned aircraft system

    NASA Astrophysics Data System (ADS)

    Adachi, Fumiyuki; Miyazaki, Hiroyuki; Endo, Chikara

    2015-01-01

    If a large-scale disaster similar to the 2011 Great East Japan Earthquake occurs, some areas may be isolated from the communications network. Recently, unmanned aircraft system (UAS) based wireless relay communication has been attracting much attention since it is able to quickly re-establish the connection between isolated areas and the network. However, the channel between the ground station (GS) and the unmanned aircraft (UA) is unreliable due to the UA's swing motion and, as a consequence, the relay communication quality degrades. In this paper, we introduce space-time block coded (STBC) amplify-and-forward (AF) relay for UAS based wireless relay communication to improve relay communication quality. A group of UAs forms a single frequency network (SFN) to perform STBC-AF cooperative relay. In STBC-AF relay, only a conjugate operation, block exchange and amplification are required at the UAs. Therefore, STBC-AF relay improves the relay communication quality while alleviating the complexity problem at the UAs. It is shown by computer simulation that STBC-AF relay can achieve better throughput performance than conventional AF relay.

  13. Scale invariance in chaotic time series: Classical and quantum examples

    NASA Astrophysics Data System (ADS)

    Landa, Emmanuel; Morales, Irving O.; Stránský, Pavel; Fossion, Rubén; Velázquez, Victor; López Vieyra, J. C.; Frank, Alejandro

    Important aspects of chaotic behavior appear in systems of low dimension, as illustrated by the map module 1. It is indeed a remarkable fact that all systems that make a transition from order to disorder display common properties, irrespective of their exact functional form. We discuss evidence for 1/f power spectra in the chaotic time series associated with classical and quantum examples, the one-dimensional map module 1 and the spectrum of 48Ca. A Detrended Fluctuation Analysis (DFA) method is applied to investigate the scaling properties of the energy fluctuations in the spectrum of 48Ca obtained with a large realistic shell model calculation (ANTOINE code) and with a random shell model (TBRE) calculation, as well as in the time series obtained with the map module 1. We compare the scale invariant properties of the 48Ca nuclear spectrum with similar analyses applied to the RMT ensembles GOE and GDE. A comparison with the corresponding power spectra is made in both cases. The possible consequences of the results are discussed.
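
    A minimal DFA sketch (not the code used for the 48Ca analysis) illustrates the method: integrate the series, detrend it in windows of increasing size, and read the scaling exponent from the log-log slope of the fluctuation function. White noise gives an exponent near 0.5, and its cumulative sum, a random walk, near 1.5.

        # Minimal first-order detrended fluctuation analysis (DFA-1).
        import numpy as np

        def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
            y = np.cumsum(x - np.mean(x))                  # integrated profile
            fluct = []
            for s in scales:
                f2 = []
                for i in range(len(y) // s):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                    f2.append(np.mean((seg - trend) ** 2))
                fluct.append(np.sqrt(np.mean(f2)))
            slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
            return slope                                    # DFA scaling exponent alpha

        if __name__ == "__main__":
            rng = np.random.default_rng(7)
            noise = rng.normal(size=4096)
            print("white noise alpha ~", round(dfa_exponent(noise), 2))
            print("random walk alpha ~", round(dfa_exponent(np.cumsum(noise)), 2))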

  14. How far could we make ourselves understood by the Andromedans? - an evolutionary cybernetic problem in hierarchical dynamics

    NASA Astrophysics Data System (ADS)

    Santoli, Salvatore

    1994-01-01

    The mechanistic interpretation of the communication process between cognitive hierarchical systems as an iterated pair of convolutions between the incoming discrete time series signals and the chaotic dynamics (CD) at the nm-scale of the perception (energy) wetware level, with the consequent feeding of the resulting collective properties to the CD software (symbolic) level, shows that the category of quality, largely present in Galilean quantitative-minded science, is to be increasingly made into quantity for finding optimum common codes for communication between different intelligent beings. The problem is similar to that solved by biological evolution, of communication between the conscious logic brain and the underlying unfelt ultimate extra-logical processes, as well as to the problem of the mind-body or the structure-function dichotomies. Perspective cybernated nanotechnological and/or nanobiological interfaces, and time evolution of the 'contact language' (the iterated dialogic process) as a self-organising system might improve human-alien understanding.

  15. Detection and Modeling of High-Dimensional Thresholds for Fault Detection and Diagnosis

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    Many Fault Detection and Diagnosis (FDD) systems use discrete models for detection and reasoning. To obtain categorical values like "oil pressure too high", analog sensor values need to be discretized using a suitable threshold. Time series of analog and discrete sensor readings are processed and discretized as they come in. This task is usually performed by the "wrapper code" of the FDD system, together with signal preprocessing and filtering. In practice, selecting the right threshold is very difficult, because it heavily influences the quality of diagnosis. If a threshold causes the alarm to trigger even in nominal situations, false alarms will be the consequence. On the other hand, if the threshold setting does not trigger in case of an off-nominal condition, important alarms might be missed, potentially causing hazardous situations. In this paper, we will describe in detail the underlying statistical modeling techniques and algorithm, as well as the Bayesian method for selecting the most likely shape and its parameters. Our approach will be illustrated by several examples from the aerospace domain.
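    As a hedged illustration of the thresholding step the abstract describes (the sensor name and limit value below are hypothetical, not from the paper), a wrapper might discretize analog readings like this:

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    name: str
    limit: float

def discretize(sensor_value: float, threshold: Threshold) -> str:
    """Map an analog reading to a categorical value, e.g. 'oil_pressure_too_high'."""
    if sensor_value > threshold.limit:
        return f"{threshold.name}_too_high"
    return f"{threshold.name}_nominal"

# A threshold set too low fires false alarms; one set too high misses off-nominal events.
oil = Threshold("oil_pressure", limit=72.0)   # hypothetical limit
for reading in [65.3, 70.1, 78.9]:
    print(reading, discretize(reading, oil))
```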

  16. Identification codes for organizations listed in computerized data systems of the U.S. Geological Survey

    USGS Publications Warehouse

    Blackwell, C.D.

    1988-01-01

    Codes for the unique identification of public and private organizations listed in computerized data systems are presented. These codes are used by the U.S. Geological Survey's National Water Data Exchange (NAWDEX), National Water Data Storage and Retrieval System (WATSTORE), National Cartographic Information Center (NCIC), and Office of Water Data Coordination (OWDC). The format structure of the codes is discussed and instructions are given for requesting new codes. (Author's abstract)

  17. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems

    DTIC Science & Technology

    2018-04-19

    AFRL-AFOSR-JP-TR-2018-0035. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems. Sandeep Shukla, Indian Institute of Technology Kanpur, India. Final report for AOARD Grant FA2386-16-1-4099.

  18. Combined trellis coding with asymmetric MPSK modulation: An MSAT-X report

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Divsalar, D.

    1985-01-01

    Traditionally, symmetric multiple phase-shift-keyed (MPSK) signal constellations, i.e., those with uniformly spaced signal points around the circle, have been used for both uncoded and coded systems. Although symmetric MPSK signal constellations are optimum for systems with no coding, the same is not necessarily true for coded systems. It is shown that by designing the signal constellations to be asymmetric, one can, in many instances, obtain a significant performance improvement over the traditional symmetric MPSK constellations combined with trellis coding. The joint design of rate-n/(n + 1) trellis codes and asymmetric 2^(n+1)-point MPSK is considered, which has a unity bandwidth expansion relative to uncoded 2^n-point symmetric MPSK. The asymptotic performance gains due to coding and asymmetry are evaluated in terms of the minimum free Euclidean distance, d_free, of the trellis code. A comparison of the maximum value of this performance measure with the minimum distance d_min of the uncoded system is an indication of the maximum reduction in required E_b/N_0 that can be achieved for arbitrarily small system bit-error rates. It is to be emphasized that the introduction of asymmetry into the signal set does not affect the bandwidth or power requirements of the system; hence, the above-mentioned improvements in performance come at little or no cost. The use of asymmetric MPSK signal sets in coded systems appears in the work of Divsalar.
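    The coding gain discussed above comes from the free Euclidean distance of the trellis-coded sequences rather than from the uncoded symbol spacing; the small sketch below (illustrative only, with an arbitrary asymmetry angle) merely shows how introducing asymmetry reshapes the pairwise distances of a 4-PSK constellation without changing its average power:

```python
import numpy as np

def mpsk(phases):
    """Unit-energy PSK constellation from a list of phase angles (radians)."""
    return np.exp(1j * np.asarray(phases))

def pairwise_distances(points):
    return sorted(abs(a - b) for i, a in enumerate(points) for b in points[i + 1:])

phi = np.pi / 8  # asymmetry angle (arbitrary illustrative value)
symmetric = mpsk([0, np.pi / 2, np.pi, 3 * np.pi / 2])
asymmetric = mpsk([0, np.pi / 2 - 2 * phi, np.pi, 3 * np.pi / 2 - 2 * phi])

print(pairwise_distances(list(symmetric)))   # uniform spacing
print(pairwise_distances(list(asymmetric)))  # some pairs closer, some farther apart
```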

  19. Barriers and facilitators to the implementation of a school-based physical activity policy in Canada: application of the theoretical domains framework.

    PubMed

    Weatherson, Katie A; McKay, Rhyann; Gainforth, Heather L; Jung, Mary E

    2017-10-23

    In British Columbia Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers' implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers' barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with double extraction occurring across a third of the interview transcripts by a second researcher. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains accounting for 75% of the total were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to Goals, Behavioural regulation, Social/professional role and identity, Beliefs about Consequences. Forty-one qualitative sub-themes were identified across the fourteen domains and exemplary quotes were highlighted. Teachers identified barriers and facilitators relating to all TDF domains, with ECR, Beliefs about consequences, Social influences, Knowledge and Intentions being the most often discussed influencers of DPA policy implementation. Use of the TDF to understand the implementation factors can assist with the systematic development of future interventions to improve implementation.

  20. Predictions of one-group interfacial area transport in TRACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worosz, T.; Talley, J. D.; Kim, S.

    In current nuclear reactor system analysis codes utilizing the two-fluid model, flow regime dependent correlations are used to specify the interfacial area concentration (a_i). This approach does not capture the continuous evolution of the interfacial structures, and thus, it can pose issues near the transition boundaries. Consequently, a pilot version of the system analysis code TRACE is being developed that employs the interfacial area transport equation (IATE). In this approach, dynamic estimation of a_i is provided through mechanistic models for bubble coalescence and breakup. The implementation of the adiabatic, one-group IATE into TRACE is assessed against experimental data from 50 air-water, two-phase flow conditions in pipes ranging in inner diameter from 2.54 to 20.32 cm for both vertical co-current upward and downward flows. Predictions of pressure, void fraction, bubble velocity, and a_i data are made. TRACE employing the conventional flow regime-based approach is found to underestimate a_i and can only predict linear trends since the calculation is governed by the pressure. Furthermore, trends opposite to that of the data are predicted for some conditions. In contrast, TRACE with the one-group IATE demonstrates a significant improvement in predicting the experimental data with an average disagreement of ±13%. Additionally, TRACE with the one-group IATE is capable of predicting nonlinear axial development of a_i by accounting for various bubble interaction mechanisms, such as coalescence and disintegration. (authors)

  1. Socioeconomic issues affecting the treatment of obesity in the new millennium.

    PubMed

    Martin, L F; Robinson, A; Moore, B J

    2000-10-01

    The prevalence of obesity among the populations of most developed countries has increased to such an extent that the healthcare and social security/disability system will accumulate direct and indirect costs related to obesity that will be more substantial than those for any other primary disease within this generation. For the past decade, the Healthcare Financing Agency, which oversees the Medicare and Medicaid programmes, has required all physicians and healthcare agencies serving beneficiaries of these programmes to include diagnoses using codes established by the ninth revision of the World Health Organization's International Classification of Diseases. This coding system actually distorts data collection and undermines appropriate medical insurance reimbursement for the treatment of obesity. Societal prejudices, inability of governmental agencies to address future concerns and the business community's attempts to control healthcare costs without addressing the underlying issues contributing to these costs have led to confusion on how to confront this emerging epidemic. How will we develop the scientific knowledge and the political willpower to confront this epidemic? First, we need more accurate methods for classifying obesity and for measuring the cost of treatment. We can then determine if it is more cost effective to prevent or treat obesity early in its evolution or pay for its consequences in the form of treatment costs associated with its multiple comorbid diseases, such as hypertension, other cardiovascular disorders, diabetes mellitus, osteoarthritis and cancers, plus the lost productivity from absenteeism, premature retirement and death.

  2. Public involvement in the priority setting activities of a wait time management initiative: a qualitative case study.

    PubMed

    Bruni, Rebecca A; Laupacis, Andreas; Levinson, Wendy; Martin, Douglas K

    2007-11-16

    As no health system can afford to provide all possible services and treatments for the people it serves, each system must set priorities. Priority setting decision makers are increasingly involving the public in policy making. This study focuses on public engagement in a key priority setting context that plagues every health system around the world: wait list management. The purpose of this study is to describe and evaluate priority setting for the Ontario Wait Time Strategy, with special attention to public engagement. This study was conducted at the Ontario Wait Time Strategy in Ontario, Canada which is part of a Federal-Territorial-Provincial initiative to improve access and reduce wait times in five areas: cancer, cardiac, sight restoration, joint replacements, and diagnostic imaging. There were two sources of data: (1) over 25 documents (e.g. strategic planning reports, public updates), and (2) 28 one-on-one interviews with informants (e.g. OWTS participants, MOHLTC representatives, clinicians, patient advocates). Analysis used a modified thematic technique in three phases: open coding, axial coding, and evaluation. The Ontario Wait Time Strategy partially meets the four conditions of 'accountability for reasonableness'. The public was not directly involved in the priority setting activities of the Ontario Wait Time Strategy. Study participants identified both benefits (supporting the initiative, experts of the lived experience, a publicly funded system and sustainability of the healthcare system) and concerns (personal biases, lack of interest to be involved, time constraints, and level of technicality) for public involvement in the Ontario Wait Time Strategy. Additionally, the participants identified concern for the consequences (sustainability, cannibalism, and a class system) resulting from the Ontario Wait Times Strategy. We described and evaluated a wait time management initiative (the Ontario Wait Time Strategy) with special attention to public engagement, and provided a concrete plan to operationalize a strategy for improving public involvement in this, and other, wait time initiatives.

  3. Public involvement in the priority setting activities of a wait time management initiative: a qualitative case study

    PubMed Central

    Bruni, Rebecca A; Laupacis, Andreas; Levinson, Wendy; Martin, Douglas K

    2007-01-01

    Background As no health system can afford to provide all possible services and treatments for the people it serves, each system must set priorities. Priority setting decision makers are increasingly involving the public in policy making. This study focuses on public engagement in a key priority setting context that plagues every health system around the world: wait list management. The purpose of this study is to describe and evaluate priority setting for the Ontario Wait Time Strategy, with special attention to public engagement. Methods This study was conducted at the Ontario Wait Time Strategy in Ontario, Canada which is part of a Federal-Territorial-Provincial initiative to improve access and reduce wait times in five areas: cancer, cardiac, sight restoration, joint replacements, and diagnostic imaging. There were two sources of data: (1) over 25 documents (e.g. strategic planning reports, public updates), and (2) 28 one-on-one interviews with informants (e.g. OWTS participants, MOHLTC representatives, clinicians, patient advocates). Analysis used a modified thematic technique in three phases: open coding, axial coding, and evaluation. Results The Ontario Wait Time Strategy partially meets the four conditions of 'accountability for reasonableness'. The public was not directly involved in the priority setting activities of the Ontario Wait Time Strategy. Study participants identified both benefits (supporting the initiative, experts of the lived experience, a publicly funded system and sustainability of the healthcare system) and concerns (personal biases, lack of interest to be involved, time constraints, and level of technicality) for public involvement in the Ontario Wait Time Strategy. Additionally, the participants identified concern for the consequences (sustainability, cannibalism, and a class system) resulting from the Ontario Wait Times Strategy. Conclusion We described and evaluated a wait time management initiative (the Ontario Wait Time Strategy) with special attention to public engagement, and provided a concrete plan to operationalize a strategy for improving public involvement in this, and other, wait time initiatives. PMID:18021393

  4. Advanced Imaging Optics Utilizing Wavefront Coding.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scrymgeour, David; Boye, Robert; Adelsberger, Kathleen

    2015-06-01

    Image processing offers a potential to simplify an optical system by shifting some of the imaging burden from lenses to the more cost effective electronics. Wavefront coding using a cubic phase plate combined with image processing can extend the system's depth of focus, reducing many of the focus-related aberrations as well as material related chromatic aberrations. However, the optimal design process and physical limitations of wavefront coding systems with respect to first-order optical parameters and noise are not well documented. We examined image quality of simulated and experimental wavefront coded images before and after reconstruction in the presence of noise. Challenges in the implementation of cubic phase in an optical system are discussed. In particular, we found that limitations must be placed on system noise, aperture, field of view and bandwidth to develop a robust wavefront coded system.
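    As a rough numerical sketch of the cubic-phase idea (a toy scalar-diffraction model, not the authors' simulation; the pupil sampling, cubic strength, and defocus values are arbitrary), one can compare how the PSF of a conventional pupil and of a cubic-phase pupil react to defocus:

```python
import numpy as np

N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2) <= 1.0          # circular pupil
alpha = 30.0                              # cubic phase strength (illustrative)

def psf(defocus, with_cubic):
    """Incoherent PSF of a pupil with quadratic defocus and optional cubic phase."""
    phase = defocus * (X**2 + Y**2)
    if with_cubic:
        phase = phase + alpha * (X**3 + Y**3)
    pupil = aperture * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2

# The conventional PSF changes strongly with defocus, while the cubic-phase PSF is
# nearly invariant, which is what allows restoration with a single digital filter.
for w20 in (0.0, 5.0):
    conventional, coded = psf(w20, False), psf(w20, True)
    print(w20, conventional.max(), coded.max())
```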

  5. Exploring Type and Amount of Parent Talk during Individualized Family Service Plan Meetings

    ERIC Educational Resources Information Center

    Ridgley, Robyn; Snyder, Patricia; McWilliam, R. A.

    2014-01-01

    We discuss the utility of a coding system designed to evaluate the amount and type of parent talk during individualized family service plan (IFSP) meetings. The iterative processes used to develop the "Parent Communication Coding System" (PCCS) and its associated codes are described. In addition, we explored whether PCCS codes could be…

  6. [Amendment of the Kodeks rodzinny i opiekuńczy (Family and Guardianship Code), Chapter I. "Origin of a child"--some remarks of an expert witness in forensic genetics].

    PubMed

    Raczek, Ewa

    2009-01-01

    On June 13, 2009, the new Family and Guardianship Code came into effect. Many important modifications were implemented in Chapter I, "Origin of a child", an issue of special importance in the work of a forensic geneticist. Those changes relate not only to the contesting of fatherhood of both types--the one judged in a lawsuit for denial of fatherhood and the one in which the ineffectiveness of paternity acknowledgement is recognized--but, for the first time, they also provide for maternity testing. The Code defines who--according to Polish law--is a mother to a child and on this basis establishes motherhood. In consequence, the main legal maxim Mater semper certa est, which has existed since ancient Roman times, is now annulled. The paper presents some remarks of an expert witness on the introduced changes.

  7. The Role of Hierarchy in Response Surface Modeling of Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2010-01-01

    This paper is intended as a tutorial introduction to certain aspects of response surface modeling, for the experimentalist who has started to explore these methods as a means of improving productivity and quality in wind tunnel testing and other aerospace applications. A brief review of the productivity advantages of response surface modeling in aerospace research is followed by a description of the advantages of a common coding scheme that scales and centers independent variables. The benefits of model term reduction are reviewed. A constraint on model term reduction with coded factors is described in some detail, which requires such models to be well-formulated, or hierarchical. Examples illustrate the consequences of ignoring this constraint. The implication for automated regression model reduction procedures is discussed, and some opinions formed from the author's experience are offered on coding, model reduction, and hierarchy.
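    To make the coding and hierarchy ideas concrete, here is a small illustrative sketch (the factor, its range, and the model terms are hypothetical, not taken from the paper):

```python
import numpy as np

def code_factor(x, lo, hi):
    """Scale and center a natural factor onto the coded interval [-1, +1]."""
    return 2.0 * (np.asarray(x, dtype=float) - lo) / (hi - lo) - 1.0

# Hypothetical wind-tunnel factor: angle of attack from 0 to 10 degrees.
alpha_deg = np.array([0.0, 2.5, 5.0, 7.5, 10.0])
a = code_factor(alpha_deg, 0.0, 10.0)        # coded levels: -1, -0.5, 0, 0.5, 1

# A well-formulated (hierarchical) quadratic model retains the parent term `a`
# whenever `a**2` is kept, even if `a` alone appears statistically insignificant.
X_hierarchical = np.column_stack([np.ones_like(a), a, a**2])
X_non_hierarchical = np.column_stack([np.ones_like(a), a**2])   # violates hierarchy
print(a, X_hierarchical.shape, X_non_hierarchical.shape)
```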

  8. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  9. GCKP84-general chemical kinetics code for gas-phase flow and batch processes including heat transfer effects

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.; Scullin, V. J.

    1984-01-01

    A general chemical kinetics code is described for complex, homogeneous ideal-gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the GCKP code published previously, solves numerically the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also solves numerically the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.
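    GCKP84 itself is a Fortran code, so the following is only a stand-in sketch of the kind of stiff batch-kinetics integration it performs, using Robertson's classic three-species problem and SciPy's implicit BDF solver:

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson's stiff chemical kinetics problem (three species, standard test case)."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
             0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2**2,
             3.0e7 * y2**2]

# An implicit, stiff-capable method (BDF) handles the widely separated time scales.
sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=1e-10)
print(sol.y[:, -1], "sum =", sol.y[:, -1].sum())  # species fractions, conserved near 1
```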

  10. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  11. Treatment-Seeking for Tuberculosis-Suggestive Symptoms: A Reflection on the Role of Human Agency in the Context of Universal Health Coverage in Malawi.

    PubMed

    Kumwenda, Moses; Desmond, Nicola; Hart, Graham; Choko, Augustine; Chipungu, Geoffrey A; Nyirenda, Deborah; Shand, Tim; Corbett, Elizabeth L; Chikovore, Jeremiah

    2016-01-01

    Tuberculosis (TB) is highly infectious and one of the leading killers globally. Several studies from sub-Saharan Africa highlight health systems challenges that affect ability to cope with existing disease burden, including TB, although most of these employ survey-type approaches. Consequently, few address community or patient perspectives and experiences. At the same time, understanding of the mechanisms by which the health systems challenges translate into seeking or avoidance of formal health care remains limited. This paper applies the notion of human agency to examine the ways people who have symptoms suggestive of TB respond to and deal with the symptoms vis-à-vis major challenges inherent within health delivery systems. Empirical data were drawn from a qualitative study exploring the ways in which notions of masculinity affect engagement with care, including men's well-documented tendency to delay in seeking care for TB symptoms. The study was carried out in three high-density locales of urban Blantyre, Malawi. Data were collected in March 2011 -March 2012 using focus group discussions, of which eight (mixed sex = two; female only = three; male only = three) were with 74 ordinary community members, and two (both mixed sex) were with 20 health workers; and in-depth interviews with 20 TB patients (female = 14) and 20 un-investigated chronic coughers (female = eight). The research process employed a modified version of grounded theory. Data were coded using a coding scheme that was initially generated from the study aims and subsequently progressively amended to incorporate concepts emerging during the analysis. Coded data were retrieved, re-read, and broken down and reconnected iteratively to generate themes. A myriad of problems were described for health systems at the primary health care level, centring largely on shortages of resources (human, equipment, and drugs) and unprofessional conduct by health care providers. Participants consistently pointed out how the problems could drive patients from promptly reporting symptoms at primary healthcare centres. The accounts suggest that in responding to illness symptoms including those suggestive of TB, patients navigate their options taking into cognisance past and current experiences with formal health systems. Understanding and factoring in the mediating role of such 'agency' is critical when implementing efforts to promote timely response to TB-suggestive symptoms.

  12. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes. Both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction. The fundamental knowledge about coding, block coding and convolutional coding is discussed. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such a good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, the performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors like the generator polynomial, interleaver and puncturing pattern are examined. The criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail. Different puncturing patterns are compared for each high rate. For most of the high rate codes, the puncturing pattern does not show any significant effect on the code performance if a pseudo-random interleaver is used in the system. For some special rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system and the calculation of extrinsic values are discussed.
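    As a hedged illustration of what a puncturing pattern does (a generic sketch, not the report's algorithm; the bit streams and the pattern below are arbitrary), puncturing periodically deletes parity bits to raise the code rate:

```python
import numpy as np

def puncture(parity_streams, pattern):
    """Drop parity bits according to a periodic pattern (1 = keep, 0 = delete)."""
    out = []
    for stream, row in zip(parity_streams, pattern):
        mask = np.resize(np.asarray(row, dtype=bool), stream.shape)
        out.append(stream[mask])
    return out

rng = np.random.default_rng(0)
systematic = rng.integers(0, 2, 12)          # rate-1/3 turbo output: systematic bits ...
parity1 = rng.integers(0, 2, 12)             # ... plus two parity streams
parity2 = rng.integers(0, 2, 12)

# Keeping every other bit of each parity stream turns the rate-1/3 code into rate 1/2.
p1, p2 = puncture([parity1, parity2], [[1, 0], [0, 1]])
rate = len(systematic) / (len(systematic) + len(p1) + len(p2))
print(rate)   # 0.5
```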

  13. The accuracy of real-time procedure coding by theatre nurses: a comparison with the central national system.

    PubMed

    Maclean, Donald; Younes, Hakim Ben; Forrest, Margaret; Towers, Hazel K

    2012-03-01

    Accurate and timely clinical data are required for clinical and organisational purposes and are especially important for patient management, audit of surgical performance and the electronic health record. The recent introduction of computerised theatre management systems has enabled real-time (point-of-care) operative procedure coding by clinical staff. However, the accuracy of these data is unknown. The aim of this Scottish study was to compare the accuracy of theatre nurses' real-time coding on the local theatre management system with the central Scottish Morbidity Record (SMR01). Paired procedural codes were recorded, qualitatively graded for precision and compared (n = 1038). In this study, real-time, point-of-care coding by theatre nurses resulted in significant coding errors compared with the central SMR01 database. Improved collaboration between full-time coders and clinical staff using computerised decision support systems is suggested.

  14. A procedure for linking psychosocial job characteristics data to health surveys.

    PubMed Central

    Schwartz, J E; Pieper, C F; Karasek, R A

    1988-01-01

    A system is presented for linking information about psychosocial characteristics of job situations to national health surveys. Job information can be imputed to individuals on surveys that contain three-digit US Census occupation codes. Occupational mean scores on psychosocial job characteristics-control over task situation (decision latitude), psychological work load, physical exertion, and other measures-for the linkage system are derived from US national surveys of working conditions (Quality of Employment Surveys 1969, 1972, and 1977). This paper discusses a new method for reducing the biases in multivariate analyses that are likely to arise when utilizing linkage systems based on mean scores. Such biases are reduced by modifying the linkage system to adjust imputed individual scores for demographic factors such as age, education, race, marital status and, implicitly, sex (since men and women have separate linkage data bases). Statistics on the linkage system's efficiency and reliability are reported. All dimensions have high inter-survey reproducibility. Despite their psychosocial nature, decision latitude and physical exertion can be more efficiently imputed with the linkage system than earnings (a non-psychosocial job characteristic). The linkage system presented here is a useful tool for initial epidemiological studies of the consequences of psychosocial job characteristics and constitutes the methodological basis for the subsequent paper. PMID:3389426

  15. Characteristics and Consequences of Adult Learning Methods and Strategies. Practical Evaluation Reports, Volume 2, Number 1

    ERIC Educational Resources Information Center

    Trivette, Carol M.; Dunst, Carl J.; Hamby, Deborah W.; O'Herin, Chainey E.

    2009-01-01

    The effectiveness of four adult learning methods (accelerated learning, coaching, guided design, and just-in-time training) constituted the focus of this research synthesis. Findings reported in "How People Learn" (Bransford et al., 2000) were used to operationally define six adult learning method characteristics, and to code and analyze…

  16. Search for the Missing lncs: Gene Regulatory Networks in Neural Crest Development and Long Non-coding RNA Biomarkers of Hirschsprung's Disease

    EPA Science Inventory

    Hirschsprung’s disease (HSCR), a birth defect characterized by variable aganglionosis of the gut, affects about 1 in 5000 births, and is a consequence of abnormal development of neural crest cells, from which enteric ganglia derive. In the companion article in this issue (Shen et...

  17. A Qualitative Study of Immigration Policy and Practice Dilemmas for Social Work Students

    ERIC Educational Resources Information Center

    Furman, Rich; Langer, Carol L.; Sanchez, Thomas Wayne; Negi, Nalini Junko

    2007-01-01

    Social policy shapes the infrastructure wherein social work is practiced. However, what happens when a particular social policy is seemingly incongruent with the social work code of ethics? How do social work students conceive and resolve potential practice dilemmas that may arise as a consequence? In this study, the authors explored potential…

  18. Students Behaving Badly: Policies on Weapons Violations in Florida Schools

    ERIC Educational Resources Information Center

    Dickinson, Wendy B.; Hall, Bruce W.

    2003-01-01

    This study looks at existing aspects of written school violence policies (Codes of Student Conduct) across large, mid-size, and small school districts in Florida. The aim was to provide a clearer picture of how weapons are defined, and the consequences of their possession, use, or display. Two research areas were addressed: (1) What constitutes a…

  19. Does cardiac catheterization laboratory activation by electrocardiography machine auto-interpretation reduce door-to-balloon time?

    PubMed

    Min, Mun Ki; Ryu, Ji Ho; Kim, Yong In; Park, Maeng Real; Park, Yong Myeon; Park, Sung Wook; Yeom, Seok Ran; Han, Sang Kyoon; Kim, Yang Weon

    2014-11-01

    In an attempt to begin ST-segment elevation myocardial infarction (STEMI) treatment more quickly (referred to as door-to-balloon [DTB] time) by minimizing preventable delays in electrocardiogram (ECG) interpretation, cardiac catheterization laboratory (CCL) activation was changed from activation by the emergency physician (code heart I) to activation by a single page if the ECG is interpreted as STEMI by the ECG machine (ECG machine auto-interpretation) (code heart II). We sought to determine the impact of ECG machine auto-interpretation on CCL activation. The study period was from June 2010 to May 2012 (from June to November 2011, code heart I; from December 2011 to May 2012, code heart II). All patients aged 18 years or older who were diagnosed with STEMI were evaluated for enrollment. Patients who experienced the code heart system were also included. Door-to-balloon time before and after code heart system were compared with a retrospective chart review. In addition, to determine the appropriateness of the activation, we compared coronary angiography performance rate and percentage of STEMI between code heart I and II. After the code heart system, the mean DTB time was significantly decreased (before, 96.51 ± 65.60 minutes; after, 65.40 ± 26.40 minutes; P = .043). The STEMI diagnosis and the coronary angiography performance rates were significantly lower in the code heart II group than in the code heart I group without difference in DTB time. Cardiac catheterization laboratory activation by ECG machine auto-interpretation does not reduce DTB time and often unnecessarily activates the code heart system compared with emergency physician-initiated activation. This system therefore decreases the appropriateness of CCL activation. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. A Novel Design of Reconfigurable Wavelength-Time Optical Codes to Enhance Security in Optical CDMA Networks

    NASA Astrophysics Data System (ADS)

    Nasaruddin; Tsujioka, Tetsuo

    An optical CDMA (OCDMA) system is a flexible technology for future broadband multiple access networks. A secure OCDMA network in broadband optical access technologies is also becoming an issue of great importance. In this paper, we propose novel reconfigurable wavelength-time (W-T) optical codes that lead to secure transmission in OCDMA networks. The proposed W-T optical codes are constructed by using quasigroups (QGs) for wavelength hopping and one-dimensional optical orthogonal codes (OOCs) for time spreading; we call them QGs/OOCs. Both QGs and OOCs are randomly generated by a computer search to ensure that an eavesdropper could not improve its interception performance by making use of the coding structure. Then, the proposed reconfigurable QGs/OOCs can provide more codewords, and many different code set patterns, which differ in both wavelength and time positions for given code parameters. Moreover, the bit error probability of the proposed codes is analyzed numerically. To realize the proposed codes, a secure system is proposed by employing reconfigurable encoders/decoders based on array waveguide gratings (AWGs), which allow the users to change their codeword patterns to protect against eavesdropping. Finally, the probability of breaking a certain codeword in the proposed system is evaluated analytically. The results show that the proposed codes and system can provide a large codeword pattern, and decrease the probability of breaking a certain codeword, to enhance OCDMA network security.
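    The paper's QG/OOC construction is not reproduced in the abstract; as a generic, illustrative sketch of how wavelength-time codewords can be represented and how their coincidence (cross-correlation) can be checked, consider the following (the chip positions are made up for the example):

```python
import numpy as np

def wt_matrix(chips, n_wavelengths, n_slots):
    """Build a wavelength-time codeword matrix from (wavelength, time) chip positions."""
    m = np.zeros((n_wavelengths, n_slots), dtype=int)
    for w, t in chips:
        m[w, t] = 1
    return m

def max_cross_correlation(a, b, n_slots):
    """Largest number of coincident chips over all cyclic time shifts (interception metric)."""
    return max(int(np.sum(a * np.roll(b, shift, axis=1))) for shift in range(n_slots))

# Two hypothetical codewords: 4 wavelengths x 8 time slots, one chip per wavelength row.
c1 = wt_matrix([(0, 0), (1, 3), (2, 5), (3, 6)], 4, 8)
c2 = wt_matrix([(0, 2), (1, 7), (2, 1), (3, 4)], 4, 8)
print(max_cross_correlation(c1, c2, 8))   # low values mean little mutual interference
```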

  1. Cardinality enhancement utilizing Sequential Algorithm (SeQ) code in OCDMA system

    NASA Astrophysics Data System (ADS)

    Fazlina, C. A. S.; Rashidi, C. B. M.; Rahman, A. K.; Aljunid, S. A.

    2017-11-01

    Optical Code Division Multiple Access (OCDMA) has become important with the increasing demand for high-capacity, high-speed communication in optical networks, because of the high efficiency that the OCDMA technique can achieve; hence, the fibre bandwidth is fully used. In this paper we will focus on the Sequential Algorithm (SeQ) code with the AND detection technique, using the Optisystem design tool. The results revealed that the SeQ code is capable of eliminating Multiple Access Interference (MAI) and improving the Bit Error Rate (BER), Phase Induced Intensity Noise (PIIN) and orthogonality between users in the system. From the results, SeQ shows good BER performance and is capable of accommodating 190 simultaneous users, in contrast with existing codes. Thus, the SeQ code enhances the system by about 36% and 111% relative to the FCC and DCS codes. In addition, SeQ shows a good BER performance of 10^-25 at 155 Mbps in comparison with the 622 Mbps, 1 Gbps and 2 Gbps bit rates. From the plotted graph, the 155 Mbps bit rate is a suitable speed for FTTH and LAN networks. These conclusions rest on the superior performance of the SeQ code; thus, these codes offer an opportunity for better quality of service in OCDMA-based optical access networks for future generations.

  2. New GOES satellite synchronized time code generation

    NASA Technical Reports Server (NTRS)

    Fossler, D. E.; Olson, R. K.

    1984-01-01

    The TRAK Systems' GOES Satellite Synchronized Time Code Generator is described. TRAK Systems has developed this timing instrument to supply improved accuracy over most existing GOES receiver clocks. A classical time code generator is integrated with a GOES receiver.

  3. Problems, solutions and recommendations for implementing CODES (Crash Outcome Data Evaluation System)

    DOT National Transportation Integrated Search

    2001-02-01

    Problems, solutions and recommendations for implementation have been contributed by 16 of the 27 CODES states and organized as appropriate under the administrative, linkage and application requirements for a Crash Outcome Data Evaluation System (CODE...

  4. Validation of the new diagnosis grouping system for pediatric emergency department visits using the International Classification of Diseases, 10th Revision.

    PubMed

    Lee, Jin Hee; Hong, Ki Jeong; Kim, Do Kyun; Kwak, Young Ho; Jang, Hye Young; Kim, Hahn Bom; Noh, Hyun; Park, Jungho; Song, Bongkyu; Jung, Jae Yun

    2013-12-01

    A clinically sensible diagnosis grouping system (DGS) is needed for describing pediatric emergency diagnoses for research, medical resource preparedness, and making national policy for pediatric emergency medical care. The Pediatric Emergency Care Applied Research Network (PECARN) developed the DGS successfully. We developed the modified PECARN DGS based on the different pediatric population of South Korea and validated the system to obtain the accurate and comparable epidemiologic data of pediatric emergent conditions of the selected population. The data source used to develop and validate the modified PECARN DGS was the National Emergency Department Information System of South Korea, which was coded by the International Classification of Diseases, 10th Revision (ICD-10) code system. To develop the modified DGS based on ICD-10 code, we matched the selected ICD-10 codes with those of the PECARN DGS by the General Equivalence Mappings (GEMs). After converting ICD-10 codes to ICD-9 codes by GEMs, we matched ICD-9 codes into PECARN DGS categories using the matrix developed by PECARN group. Lastly, we conducted the expert panel survey using Delphi method for the remaining diagnosis codes that were not matched. A total of 1879 ICD-10 codes were used in development of the modified DGS. After 1078 (57.4%) of 1879 ICD-10 codes were assigned to the modified DGS by GEM and PECARN conversion tools, investigators assigned each of the remaining 801 codes (42.6%) to DGS subgroups by 2 rounds of electronic Delphi surveys. And we assigned the remaining 29 codes (4%) into the modified DGS at the second expert consensus meeting. The modified DGS accounts for 98.7% and 95.2% of diagnoses of the 2008 and 2009 National Emergency Department Information System data set. This modified DGS also exhibited strong construct validity using the concepts of age, sex, site of care, and seasons. This also reflected the 2009 outbreak of H1N1 influenza in Korea. We developed and validated clinically feasible and sensible DGS system for describing pediatric emergent conditions in Korea. The modified PECARN DGS showed good comprehensiveness and demonstrated reliable construct validity. This modified DGS based on PECARN DGS framework may be effectively implemented for research, reporting, and resource planning in pediatric emergency system of South Korea.

  5. A Golay complementary TS-based symbol synchronization scheme in variable rate LDPC-coded MB-OFDM UWBoF system

    NASA Astrophysics Data System (ADS)

    He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin

    2015-09-01

    In this paper, a Golay complementary training sequence (TS)-based symbol synchronization scheme is proposed and experimentally demonstrated in multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband over fiber (UWBoF) system with a variable rate low-density parity-check (LDPC) code. Meanwhile, the coding gain and spectral efficiency in the variable rate LDPC-coded MB-OFDM UWBoF system are investigated. By utilizing the non-periodic auto-correlation property of the Golay complementary pair, the start point of LDPC-coded MB-OFDM UWB signal can be estimated accurately. After 100 km standard single-mode fiber (SSMF) transmission, at the bit error rate of 1×10-3, the experimental results show that the short block length 64QAM-LDPC coding provides a coding gain of 4.5 dB, 3.8 dB and 2.9 dB for a code rate of 62.5%, 75% and 87.5%, respectively.
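    As an illustrative sketch of the property the synchronization scheme exploits (a standard recursive Golay construction, not the authors' training-sequence design), the aperiodic autocorrelations of a Golay complementary pair sum to an ideal delta:

```python
import numpy as np

def golay_pair(m):
    """Recursively build a Golay complementary pair of length 2**m."""
    a, b = np.array([1]), np.array([1])
    for _ in range(m):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def aperiodic_autocorr(x):
    return np.correlate(x, x, mode="full")[len(x) - 1:]   # lags 0 .. N-1

a, b = golay_pair(5)                        # length-32 pair, usable as a training sequence
combined = aperiodic_autocorr(a) + aperiodic_autocorr(b)
print(combined)                             # [2N, 0, 0, ..., 0]: sharp peak, zero sidelobes
```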

  6. Coarse-coded higher-order neural networks for PSRI object recognition. [position, scale, and rotation invariant

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Reid, Max B.

    1993-01-01

    A higher-order neural network (HONN) can be designed to be invariant to changes in scale, translation, and in-plane rotation. Invariances are built directly into the architecture of a HONN and do not need to be learned. Consequently, fewer training passes and a smaller training set are required to learn to distinguish between objects. The size of the input field is limited, however, because of the memory required for the large number of interconnections in a fully connected HONN. By coarse coding the input image, the input field size can be increased to allow the larger input scenes required for practical object recognition problems. We describe a coarse coding technique and present simulation results illustrating its usefulness and its limitations. Our simulations show that a third-order neural network can be trained to distinguish between two objects in a 4096 x 4096 pixel input field independent of transformations in translation, in-plane rotation, and scale in less than ten passes through the training set. Furthermore, we empirically determine the limits of the coarse coding technique in the object recognition domain.
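    The following toy sketch (a 64 x 64 image instead of 4096 x 4096, with made-up offsets, and not the authors' scheme) illustrates the coarse coding idea of replacing one large input field by several small, mutually offset coarse fields:

```python
import numpy as np

def coarse_code(image, factor, offsets):
    """Represent a large binary image as several small, mutually offset coarse fields."""
    fields = []
    n = image.shape[0] // factor
    for dy, dx in offsets:
        shifted = np.roll(image, (dy, dx), axis=(0, 1))
        # Each coarse pixel is the OR over a factor x factor block of the shifted image.
        blocks = shifted.reshape(n, factor, n, factor)
        fields.append(blocks.max(axis=(1, 3)))
    return fields

# A 64 x 64 toy image reduced to four offset 8 x 8 coarse fields.
img = np.zeros((64, 64), dtype=int)
img[20:28, 30:38] = 1                      # a small "object"
fields = coarse_code(img, factor=8, offsets=[(0, 0), (0, 4), (4, 0), (4, 4)])
print([f.shape for f in fields])
```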

  7. Quantum computation with realistic magic-state factories

    NASA Astrophysics Data System (ADS)

    O'Gorman, Joe; Campbell, Earl T.

    2017-03-01

    Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10^-4 error gates and the availability of long-range interactions.

  8. Joint sparse coding based spatial pyramid matching for classification of color medical image.

    PubMed

    Shi, Jun; Li, Yi; Zhu, Jie; Sun, Haojie; Cai, Yin

    2015-04-01

    Although color medical images are important in clinical practice, they are usually converted to grayscale for further processing in pattern recognition, resulting in loss of rich color information. The sparse coding based linear spatial pyramid matching (ScSPM) and its variants are popular for grayscale image classification, but cannot extract color information. In this paper, we propose a joint sparse coding based SPM (JScSPM) method for the classification of color medical images. A joint dictionary can represent both the color information in each color channel and the correlation between channels. Consequently, the joint sparse codes calculated from a joint dictionary can carry color information, and therefore this method can easily transform a feature descriptor originally designed for grayscale images to a color descriptor. A color hepatocellular carcinoma histological image dataset was used to evaluate the performance of the proposed JScSPM algorithm. Experimental results show that JScSPM provides significant improvements as compared with the majority voting based ScSPM and the original ScSPM for color medical image classification. Copyright © 2014 Elsevier Ltd. All rights reserved.
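    The authors' JScSPM implementation is not given in the abstract; as a hedged sketch of the joint-dictionary idea (random data, arbitrary patch and dictionary sizes, and scikit-learn's generic dictionary learner rather than the paper's solver), stacking the three color channels lets one sparse code carry color information:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)

# Hypothetical data: 200 local patches, 6x6 pixels per color channel.
patches_rgb = rng.random((200, 3, 36))
joint_features = patches_rgb.reshape(200, 3 * 36)   # stack R, G, B into one joint vector

dico = DictionaryLearning(n_components=64, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5, max_iter=20, random_state=0)
codes = dico.fit_transform(joint_features)           # joint sparse codes per patch
print(codes.shape)                                    # (200, 64)
# In an ScSPM-style pipeline these codes would then be max-pooled over spatial pyramid cells.
```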

  9. CARES/LIFE Software Commercialization

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.

  10. It's time to make management a true profession.

    PubMed

    Khurana, Rakesh; Nohria, Nitin

    2008-10-01

    In the face of the recent institutional breakdown of trust in business, managers are losing legitimacy. To regain public trust, management needs to become a true profession in much the way medicine and law have, argue Khurana and Nohria of Harvard Business School. True professions have codes, and the meaning and consequences of those codes are taught as part of the formal education required of their members. Through these codes, professional institutions forge an implicit social contract with society: Trust us to control and exercise jurisdiction over an important occupational category, and, in return, we will ensure that the members of our profession are worthy of your trust--that they will not only be competent to perform the tasks entrusted to them, but that they will also conduct themselves with high standards and great integrity. The authors believe that enforcing educational standards and a code of ethics is unlikely to choke entrepreneurial creativity. Indeed, if the field of medicine is any indication, a code may even stimulate creativity. The main challenge in writing a code lies in reaching a broad consensus on the aims and social purpose of management. There are two deeply divided schools of thought. One school argues that management's aim should simply be to maximize shareholder wealth; the other argues that management's purpose is to balance the claims of all the firm's stakeholders. Any code will have to steer a middle course in order to accommodate both the value-creating impetus of the shareholder value concept and the accountability inherent in the stakeholder approach.

  11. Health information management: an introduction to disease classification and coding.

    PubMed

    Mony, Prem Kumar; Nagaraj, C

    2007-01-01

    Morbidity and mortality data constitute an important component of a health information system and their coding enables uniform data collation and analysis as well as meaningful comparisons between regions or countries. Strengthening the recording and reporting systems for health monitoring is a basic requirement for an efficient health information management system. Increased advocacy for and awareness of a uniform coding system together with adequate capacity building of physicians, coders and other allied health and information technology personnel would pave the way for a valid and reliable health information management system in India. The core requirements for the implementation of disease coding are: (i) support from national/institutional health administrators, (ii) widespread availability of the ICD-10 material for morbidity and mortality coding; (iii) enhanced human and financial resources; and (iv) optimal use of informatics. We describe the methodology of a disease classification and codification system as also its applications for developing and maintaining an effective health information management system for India.

  12. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  13. Study on the properties of infrared wavefront coding athermal system under several typical temperature gradient distributions

    NASA Astrophysics Data System (ADS)

    Cai, Huai-yu; Dong, Xiao-tong; Zhu, Meng; Huang, Zhan-hua

    2018-01-01

    The wavefront coding athermalization technique can effectively ensure stable imaging of an optical system over a large temperature range, with the additional advantages of compact structure and low cost. Using simulation methods to analyze properties such as the PSF and MTF of a wavefront coding athermal system under several typical temperature gradient distributions helps characterize the working state of the system in non-ideal temperature environments and supports achieving the system design targets. In this paper, we utilize the interoperability of data between SolidWorks and ZEMAX to simplify the traditional process of structural/thermal/optical integrated analysis. We design and build the optical model and the corresponding mechanical model of an infrared imaging wavefront coding athermal system. Axial and radial temperature gradients of different magnitudes are applied to the whole system in SolidWorks, and the resulting changes in curvature, refractive index and the distances between the lenses are obtained. We then import the deformed model into ZEMAX for ray tracing and obtain the changes in the PSF and MTF of the optical system. Finally, we discuss and evaluate the consistency of the PSF (MTF) of the wavefront coding athermal system and the restorability of the image, which provides a basis and reference for the optimal design of wavefront coding athermal systems. The results show that the adaptability of the single-material infrared wavefront coding athermal system to an axial temperature gradient can reach an upper limit of temperature fluctuation of 60°C, which is much higher than its adaptability to a radial temperature gradient.

  14. Viewing hybrid systems as products of control systems and automata

    NASA Technical Reports Server (NTRS)

    Grossman, R. L.; Larson, R. G.

    1992-01-01

    The purpose of this note is to show how hybrid systems may be modeled as products of nonlinear control systems and finite state automata. By a hybrid system, we mean a network consisting of continuous, nonlinear control systems connected to discrete, finite state automata. Our point of view is that the automaton switches between the control systems, and that this switching is a function of the discrete input symbols or letters that it receives. We show how a nonlinear control system may be viewed as a pair consisting of a bialgebra of operators coding the dynamics, and an algebra of observations coding the state space. We also show that a finite automaton has a similar representation. A hybrid system is then modeled by taking suitable products of the bialgebras coding the dynamics and the observation algebras coding the state spaces.
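    As a toy illustration of the "automaton switching between control systems" viewpoint (a thermostat-like example invented for this sketch, not the algebraic construction of the paper):

```python
# A two-state automaton switches between two scalar dynamics based on input symbols.
dynamics = {
    "cool": lambda x: -0.5 * (x - 18.0),   # drive temperature toward 18
    "heat": lambda x: -0.5 * (x - 22.0),   # drive temperature toward 22
}
transitions = {("cool", "too_cold"): "heat", ("heat", "too_hot"): "cool"}

def step(state, x, dt=0.1):
    """One Euler step of the active dynamics, then a discrete transition on the emitted symbol."""
    x = x + dt * dynamics[state](x)
    symbol = "too_cold" if x < 19.0 else ("too_hot" if x > 21.0 else "ok")
    return transitions.get((state, symbol), state), x

state, x = "cool", 25.0
for _ in range(200):
    state, x = step(state, x)
print(state, round(x, 2))   # the trajectory settles into switching around the 19-21 band
```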

  15. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building cadastral database. After analyzing the course of cadastral change, especially the parcel change with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationship corresponding to the cadastral change is put forward and a coding format composed of street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rule has been applied to the development of an urban cadastral information system called "ReGIS", which is not only able to figure out the cadastral code automatically according to both the type of parcel change and the coding rules, but also capable of checking out whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and got a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.
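    The abstract does not spell out the exact field widths, so the following is only a hypothetical sketch of how the five-part code (street, block, father parcel, child parcel, grandchild parcel) described above could be composed:

```python
# A minimal sketch of the composite cadastral code; the field widths are assumptions.
def cadastral_code(street, block, father, child, grandchild=0):
    """Compose street / block / father parcel / child parcel / grandchild parcel codes."""
    return f"{street:03d}-{block:03d}-{father:04d}-{child:03d}-{grandchild:03d}"

# A parcel split: father parcel 0007 in street 012, block 045 produces two child parcels,
# so the kinship to the father parcel stays visible in the code itself.
children = [cadastral_code(12, 45, 7, c) for c in (1, 2)]
print(children)
```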

  16. Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data (SCED). NFES 2011-801

    ERIC Educational Resources Information Center

    National Forum on Education Statistics, 2011

    2011-01-01

    In this handbook, "Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data" (SCED), the National Center for Education Statistics (NCES) and the National Forum on Education Statistics have extended the existing secondary course classification system with codes and descriptions for courses offered at…

  17. Proposing a Web-Based Tutorial System to Teach Malay Language Braille Code to the Sighted

    ERIC Educational Resources Information Center

    Wah, Lee Lay; Keong, Foo Kok

    2010-01-01

    The "e-KodBrailleBM Tutorial System" is a web-based tutorial system which is specially designed to teach, facilitate and support the learning of Malay Language Braille Code to individuals who are sighted. The targeted group includes special education teachers, pre-service teachers, and parents. Learning Braille code involves memorisation…

  18. Design and implementation of a scene-dependent dynamically selfadaptable wavefront coding imaging system

    NASA Astrophysics Data System (ADS)

    Carles, Guillem; Ferran, Carme; Carnicer, Artur; Bosch, Salvador

    2012-01-01

    A computational imaging system based on wavefront coding is presented. Wavefront coding provides an extension of the depth of field at the expense of a slight reduction of image quality. This trade-off is governed by the amount of coding used. By using spatial light modulators, a flexible coding is achieved which permits the coding strength to be increased or decreased as needed. In this paper a computational method is proposed for evaluating the output of a wavefront coding imaging system equipped with a spatial light modulator, with the aim of making it possible to apply the most suitable coding strength for a given scene. This is achieved in an unsupervised manner, so the whole system acts as a dynamically selfadaptable imaging system. The program presented here controls the spatial light modulator and the camera, and also processes the images in a synchronised way in order to implement the dynamic system in real time. A prototype of the system was implemented in the laboratory and illustrative examples of its performance are reported in this paper.
    Program summary
    Program title: DynWFC (Dynamic WaveFront Coding)
    Catalogue identifier: AEKC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 10 483
    No. of bytes in distributed program, including test data, etc.: 2 437 713
    Distribution format: tar.gz
    Programming language: LabVIEW 8.5 with NI Vision, and MinGW C compiler
    Computer: Tested on PC Intel® Pentium®
    Operating system: Tested on Windows XP
    Classification: 18
    Nature of problem: The program implements an enhanced wavefront coding imaging system able to adapt the degree of coding to the requirements of a specific scene. The program controls the acquisition by a camera, the display of a spatial light modulator and the image processing operations synchronously. The spatial light modulator is used to implement the phase mask with flexibility, given the trade-off between depth-of-field extension and image quality. The action of the program is to evaluate the depth-of-field requirements of the specific scene and subsequently control the coding established by the spatial light modulator, in real time.
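
    A highly simplified sketch of the adaptive loop described in this record: estimate the depth-of-field requirement of the current frame, map it to a coding strength, and update the phase mask accordingly. The camera and modulator interfaces and the cubic phase mapping below are hypothetical stand-ins, not the LabVIEW implementation of DynWFC.

        import numpy as np

        def acquire_frame():
            # Hypothetical camera read; returns a grey-level image.
            return np.random.rand(480, 640)

        def scene_depth_requirement(image):
            # Crude proxy: scenes mixing sharp and blurred regions need more depth of field.
            gy, gx = np.gradient(image.astype(float))
            sharpness = np.hypot(gx, gy)
            return float(sharpness.std() / (sharpness.mean() + 1e-9))

        def coding_strength(requirement, max_alpha=60.0):
            # Map the scene requirement to a cubic phase-mask strength alpha (clipped).
            return float(np.clip(requirement * max_alpha, 0.0, max_alpha))

        def apply_phase_mask(alpha, n=256):
            # Hypothetical SLM update: cubic profile alpha*(x^3 + y^3) in normalised pupil coords.
            x = np.linspace(-1, 1, n)
            xx, yy = np.meshgrid(x, x)
            return alpha * (xx**3 + yy**3)   # would be written to the modulator here

        for _ in range(3):                    # real-time loop (truncated for the sketch)
            frame = acquire_frame()
            alpha = coding_strength(scene_depth_requirement(frame))
            mask = apply_phase_mask(alpha)
            print(f"coding strength alpha = {alpha:.2f}")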

  19. Comparisons Between Stability Prediction and Measurements for the Reusable Solid Rocket Motor

    NASA Technical Reports Server (NTRS)

    Fischbach, Sean R.; Kenny, R. Jeremy

    2010-01-01

    The Space Transportation System has used the solid rocket boosters for lift-off and ascent propulsion over the history of the program. Part of the structural loads assessment of the assembled vehicle is the contribution due to solid rocket booster thrust oscillations. These thrust oscillations are a consequence of internal motor pressure oscillations active during operation. Understanding of these pressure oscillations is key to predicting the subsequent thrust oscillations and vehicle loading. The pressure oscillation characteristics of the Reusable Solid Rocket Motor (RSRM) design are reviewed in this work. Dynamic pressure data from the static test and flight history are shown, with emphasis on amplitude, frequency, and timing of the oscillations. Physical mechanisms that cause these oscillations are described by comparing data observations to predictions made by the Solid Stability Prediction (SSP) code.

  20. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and short design cycles. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.
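
    A back-of-the-envelope sketch (with assumed numbers, not values from the guidelines) of how hardware, hardware-description-language, and radiation-induced contributions might be rolled into a single FPGA failure rate for a reliability block diagram, treating the contributions as independent and exponentially distributed.

        import math

        # Assumed per-hour failure-rate contributions (illustrative only).
        lambda_hw  = 2.0e-7   # physical failures of the IC
        lambda_hdl = 5.0e-8   # latent HDL/firmware design faults
        lambda_rad = 1.2e-7   # radiation-induced functional failures

        lambda_total = lambda_hw + lambda_hdl + lambda_rad   # series combination of independent causes

        def reliability(t_hours, lam=lambda_total):
            # Probability the FPGA operates without failure for t_hours (exponential model).
            return math.exp(-lam * t_hours)

        mission = 8.0  # hours, e.g. a launch-and-ascent window
        print(f"total failure rate: {lambda_total:.2e} /h")
        print(f"mission reliability over {mission} h: {reliability(mission):.6f}")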

  1. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    PubMed

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

    There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  2. Modification of the band offset in boronitrene

    NASA Astrophysics Data System (ADS)

    Obodo, K. O.; Andrew, R. C.; Chetty, N.

    2011-10-01

    Using density functional methods within the generalized gradient approximation implemented in the Quantum Espresso codes, we modify the band offset in a single layer of boronitrene by substituting a double line of carbon atoms. This effectively introduces a line of dipoles at the interface. We considered various junctions of this system within the zigzag and armchair orientations. Our results show that the “zigzag-short” structure is energetically most stable, with a formation energy of 0.502 eV and with a band offset of 1.51 eV. The “zigzag-long” structure has a band offset of 1.99 eV. The armchair structures are nonpolar, while the zigzag-single structures show a charge accumulation for the C-substituted B and charge depletion for the C-substituted N at the junction. Consequently there is no shifting of the bands.

  3. The physiology of deglutition and the pathophysiology and complications of oropharyngeal dysphagia.

    PubMed

    Steele, Catriona M

    2012-01-01

    The opening session of the 2nd International Conference on Oropharyngeal Dysphagia featured a series of invited talks reviewing the definition of dysphagia, its prevalence and its pathophysiology. The discussion arising from these talks focused heavily on the current underrecognition of dysphagia as a significant concern for older adults, particularly those over 75. The burdens associated with dysphagia in this sector of the population were recognized to be substantial, both in social/psychological terms and in terms of economic consequences for the healthcare system. The importance of developing swallow screening protocols as a routine method for the early identification of dysphagia and aspiration was explored. The idea of launching political initiatives aimed at increasing awareness and the utilization of appropriate dysphagia healthcare codes was also discussed. Copyright © 2012 S. Karger AG, Basel.

  4. Reinventing radiology reimbursement.

    PubMed

    Marshall, John; Adema, Denise

    2005-01-01

    Lee Memorial Health System (LMHS), located in southwest Florida, consists of 5 hospitals, a home health agency, a skilled nursing facility, multiple outpatient centers, walk-in medical centers, and primary care physician offices. LMHS annually performs more than 300,000 imaging procedures with gross imaging revenues exceeding $350 million. In fall 2002, LMHS received the results of an independent audit of its IR coding. The overall IR coding error rate was determined to be 84.5%. The projected net financial impact of these errors was an annual reimbursement loss of $182,000. To address the issues of coding errors and reimbursement loss, LMHS implemented its clinical reimbursement specialist (CRS) system in October 2003, as an extension of financial services' reimbursement division. LMHS began with CRSs in 3 service lines: emergency department, cardiac catheterization, and radiology. These 3 CRSs coordinate all facets of their respective areas' chargemaster, patient charges, coding, and reimbursement functions while serving as a resident coding expert within their clinical areas. The radiology reimbursement specialist (RRS) combines an experienced radiologic technologist, interventional technologist, medical records coder, financial auditor, reimbursement specialist, and biller into a single position. The RRS's radiology experience and technologist knowledge are key assets to resolving coding conflicts and handling complex interventional coding. In addition, performing a daily charge audit and an active code review are essential if an organization is to eliminate coding errors. One of the inherent effects of eliminating coding errors is the capturing of additional RVUs and units of service. During its first year, based on account level detail, the RRS system increased radiology productivity through the additional capture of just more than 3,000 RVUs and 1,000 additional units of service. In addition, the physicians appreciate having someone who "keeps up with all the coding changes" and looks out for the charges. By assisting a few physicians' staff with coding questions, providing coding updates, and allowing them to sit in on educational sessions, at least 2 physicians have transferred some of their volume to LMHS from a competitor. The provision of a "clean account," without coding errors, allows the biller to avoid the rework and billing delays caused by coding issues. During the first quarter of the RRS system, the billers referred an average of 9 accounts per day for coding resolution. During the fourth quarter of the system, these referrals were reduced to less than one per day. Prior to the RRS system, resolving these issues took an average of 4 business days. Now the conflicts are resolved within 24 hours.

  5. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    NASA Astrophysics Data System (ADS)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread spectrum communication system is to protect communication signals against interference and against exploitation of the information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, increases the efficiency of communication systems based on these codes. Employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of multi-user synchronous and asynchronous DS-CDMA systems is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate an improvement in performance in comparison with conventional spreading codes such as Gold codes as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, the proposed method makes it possible to construct complex sequences with lower average cross-correlation.
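
    A small sketch (one-dimensional case with illustrative parameters, not the two-dimensional complex construction of the paper) of generating a Bernoulli-map orbit, thresholding it into +/-1 spreading chips, measuring the lag-1 autocorrelation that the abstract highlights, and despreading a single data bit by correlation.

        import numpy as np

        def bernoulli_sequence(x0, n, b=1.99):
            # Iterate the Bernoulli (shift) map x <- b*x mod 1.
            x = np.empty(n)
            x[0] = x0
            for i in range(1, n):
                x[i] = (b * x[i - 1]) % 1.0
            return x

        def chips(x):
            # Map the chaotic orbit to +/-1 spreading chips.
            return np.where(x >= 0.5, 1.0, -1.0)

        def autocorr(c, lag):
            c = c - c.mean()
            return float(np.dot(c[:-lag], c[lag:]) / np.dot(c, c))

        seq = chips(bernoulli_sequence(0.3141, 4096))
        # With proper parameter design the lag-1 value can be driven negative; here it is just reported.
        print("lag-1 autocorrelation:", autocorr(seq, 1))

        # Despreading example: spread one data bit, add noise, correlate with the same code.
        bit, noise_amp = 1.0, 0.5
        rx = bit * seq + noise_amp * np.random.randn(seq.size)
        print("correlator output sign:", np.sign(np.dot(rx, seq)))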

  6. Modulation transfer function of a fish-eye lens based on the sixth-order wave aberration theory.

    PubMed

    Jia, Han; Lu, Lijun; Cao, Yiqing

    2018-01-10

    A calculation program of the modulation transfer function (MTF) of a fish-eye lens is developed with the autocorrelation method, in which the sixth-order wave aberration theory of ultra-wide-angle optical systems is used to simulate the wave aberration distribution at the exit pupil of the optical systems. The autocorrelation integral is processed with the Gauss-Legendre integral, and the magnification chromatic aberration is discussed to calculate polychromatic MTF. The MTF calculation results of a given example are then compared with those previously obtained based on the fourth-order wave aberration theory of plane-symmetrical optical systems and with those from the Zemax program. The study shows that MTF based on the sixth-order wave aberration theory has satisfactory calculation accuracy even for a fish-eye lens with a large acceptance aperture. And the impacts of different types of aberrations on the MTF of a fish-eye lens are analyzed. Finally, we apply the self-adaptive and normalized real-coded genetic algorithm and the MTF developed in the paper to optimize the Nikon F/2.8 fish-eye lens; consequently, the optimized system shows better MTF performances than those of the original design.
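
    A simplified one-dimensional sketch of the autocorrelation method itself (a defocus-only pupil, not the sixth-order wave aberration theory of the paper): the MTF at a normalised frequency is the overlap integral of the shifted pupil functions, evaluated here with Gauss-Legendre quadrature.

        import numpy as np

        def pupil_phase(x, defocus=0.75):
            # Wavefront aberration in waves across the normalised pupil (defocus term only here).
            return defocus * x**2

        def pupil(x):
            return np.exp(2j * np.pi * pupil_phase(x))

        def mtf(nu, n_nodes=64):
            # 1-D MTF at normalised frequency nu (cutoff at nu = 1) via pupil autocorrelation.
            if nu >= 1.0:
                return 0.0
            half = 1.0 - nu                       # half-width of the overlap region
            nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
            x = half * nodes                      # map quadrature nodes onto the overlap interval
            integrand = pupil(x + nu) * np.conj(pupil(x - nu))
            value = half * np.sum(weights * integrand)
            return float(np.abs(value)) / 2.0     # normalise by the pupil "area" (length 2)

        for nu in (0.0, 0.25, 0.5, 0.75):
            print(f"MTF({nu:.2f}) = {mtf(nu):.4f}")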

  7. A code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check codes

    NASA Astrophysics Data System (ADS)

    Bai, Cheng-lin; Cheng, Zhi-hui

    2016-09-01

    In order to further improve the carrier synchronization estimation range and accuracy at low signal-to-noise ratio (SNR), this paper proposes a code-aided carrier synchronization algorithm based on improved nonbinary low-density parity-check (NB-LDPC) codes to study the polarization-division-multiplexing coherent optical orthogonal frequency division multiplexing (PDM-CO-OFDM) system performance in the cases of quadrature phase shift keying (QPSK) and 16 quadrature amplitude modulation (16-QAM) modes. The simulation results indicate that this algorithm can enlarge the frequency and phase offset estimation ranges and greatly enhance the accuracy of the system, and the bit error rate (BER) performance of the system is improved effectively compared with that of a system employing the traditional NB-LDPC code-aided carrier synchronization algorithm.

  8. Association between implementation of a code stroke system and poststroke epilepsy.

    PubMed

    Chen, Ziyi; Churilov, Leonid; Chen, Ziyuan; Naylor, Jillian; Koome, Miriam; Yan, Bernard; Kwan, Patrick

    2018-03-27

    We aimed to investigate the effect of a code stroke system on the development of poststroke epilepsy. We retrospectively analyzed consecutive patients treated with IV thrombolysis under or outside the code stroke system between 2003 and 2012. Patients were followed up for at least 2 years or until death. Factors with p < 0.1 in univariate comparisons were selected for multivariable logistic and Cox regression. A total of 409 patients met the eligibility criteria. Their median age at stroke onset was 75 years (interquartile range 64-83 years); 220 (53.8%) were male. The median follow-up duration was 1,074 days (interquartile range 119-1,671 days). Thirty-two patients (7.8%) had poststroke seizures during follow-up, comprising 7 (1.7%) with acute symptomatic seizures and 25 (6.1%) with late-onset seizures. Twenty-six patients (6.4%) fulfilled the definition of poststroke epilepsy. Three hundred eighteen patients (77.8%) were treated with the code stroke system while 91 (22.2%) were not. After adjustment for age and stroke etiology, use of the code stroke system was associated with decreased odds of poststroke epilepsy (odds ratio = 0.36, 95% confidence interval 0.14-0.87, p = 0.024). Cox regression showed lower adjusted hazard rates for poststroke epilepsy within 5 years for patients managed under the code stroke system (hazard ratio = 0.60, 95% confidence interval 0.47-0.79, p < 0.001). The code stroke system was associated with reduced odds and instantaneous risk of poststroke epilepsy. Further studies are required to identify the contribution of the individual components and mechanisms against epileptogenesis after stroke. This study provides Class III evidence that for people with acute ischemic stroke, implementation of a code stroke system reduces the risk of poststroke epilepsy. © 2018 American Academy of Neurology.

  9. An ultrasound transient elastography system with coded excitation.

    PubMed

    Diao, Xianfen; Zhu, Jing; He, Xiaonian; Chen, Xin; Zhang, Xinyu; Chen, Siping; Liu, Weixiang

    2017-06-28

    Ultrasound transient elastography technology has found its place in elastography because it is safe and easy to operate. However, its application in deep tissue is limited. The aim of this study is to design an ultrasound transient elastography system with coded excitation to obtain greater detection depth. The ultrasound transient elastography system requires tissue vibration to be strictly synchronous with ultrasound detection. Therefore, an ultrasound transient elastography system with coded excitation was designed. A central component of this transient elastography system was an arbitrary waveform generator with multi-channel signal output. This arbitrary waveform generator was used to produce the tissue vibration signal, the ultrasound detection signal and the synchronous triggering signal of the radio frequency data acquisition system. The arbitrary waveform generator can produce different forms of vibration waveform to induce different shear wave propagation in the tissue. Moreover, it can achieve either traditional pulse-echo detection or phase-modulated or frequency-modulated coded excitation. A 7-chip Barker code and traditional pulse-echo detection were programmed on the designed ultrasound transient elastography system to detect the shear wave in a phantom excited by the mechanical vibrator. Then an elasticity QA phantom and sixteen in vitro rat livers were used for performance evaluation of the two detection pulses. The elasticity QA phantom results show that our system is effective, and the rat liver results show that the detection depth can be increased by more than 1 cm. In addition, the SNR (signal-to-noise ratio) is increased by 15 dB using the 7-chip Barker coded excitation. Applying the 7-chip Barker coded excitation technique to ultrasound transient elastography can increase the detection depth and SNR. Using coded excitation technology to assess the human liver, especially in obese patients, may be a good choice.
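
    A minimal sketch of why a 7-chip Barker code raises SNR: the coded excitation spreads energy over a longer burst, and a matched filter compresses the echo back into a sharp peak whose amplitude grows with the code length. The sampling and noise parameters are illustrative, not those of the described system.

        import numpy as np

        barker7 = np.array([1, 1, 1, -1, -1, 1, -1], dtype=float)
        samples_per_chip = 8
        tx = np.repeat(barker7, samples_per_chip)         # coded excitation burst

        # Simulated echo: attenuated, delayed copy of the transmit burst buried in white noise.
        rng = np.random.default_rng(0)
        delay, amp, noise_std = 200, 0.05, 0.05
        rx = noise_std * rng.standard_normal(1024)
        rx[delay:delay + tx.size] += amp * tx

        # Matched filter (correlation with the transmitted code) compresses the echo.
        compressed = np.correlate(rx, tx, mode="same")

        peak = int(np.argmax(np.abs(compressed)))
        print("compressed peak index (near the echo centre):", peak)
        print("peak-to-background ratio:",
              np.abs(compressed[peak]) / np.abs(compressed).mean())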

  10. Analysis of airborne antenna systems using geometrical theory of diffraction and moment method computer codes

    NASA Technical Reports Server (NTRS)

    Hartenstein, Richard G., Jr.

    1985-01-01

    Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.

  11. ETF system code: composition and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, R.L.; Wu, K.F.

    1980-01-01

    A computer code has been developed for application to ETF tokamak system and conceptual design studies. The code determines cost, performance, configuration, and technology requirements as a function of tokamak parameters. The ETF code is structured in a modular fashion in order to allow independent modeling of each major tokamak component. The primary benefit of modularization is that it allows updating of a component module, such as the TF coil module, without disturbing the remainder of the system code as long as the input/output to the modules remains unchanged. The modules may be run independently to perform specific design studies, such as determining the effect of allowable strain on TF coil structural requirements, or the modules may be executed together as a system to determine global effects, such as defining the impact of aspect ratio on the entire tokamak system.
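
    A minimal sketch of the modular structure described above: each component model is an independent function with a fixed input/output contract, so one module (for example the TF-coil model) can be revised or run on its own without disturbing the rest of the system code. The parameter names and scalings are invented for illustration and are not the ETF code's.

        # Each module maps a shared parameter dictionary to its own outputs.
        def tf_coil_module(params):
            # Toy TF-coil sizing: field- and strain-driven mass estimate (illustrative scaling only).
            mass = 0.04 * params["major_radius"] * params["toroidal_field"]**2 / params["allowable_strain"]
            return {"tf_coil_mass_t": mass}

        def plasma_module(params):
            # Toy fusion-power estimate from size and field (illustrative scaling only).
            power = 0.002 * params["major_radius"]**3 * params["toroidal_field"]**4
            return {"fusion_power_mw": power}

        def cost_module(params, tf_out, plasma_out):
            return {"cost_musd": 0.05 * tf_out["tf_coil_mass_t"] + 0.8 * plasma_out["fusion_power_mw"]}

        def run_system(params):
            # Execute the modules together; any module can equally be run alone for a parametric study.
            tf = tf_coil_module(params)
            plasma = plasma_module(params)
            cost = cost_module(params, tf, plasma)
            return {**tf, **plasma, **cost}

        print(run_system({"major_radius": 5.0, "toroidal_field": 10.0, "allowable_strain": 0.002}))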

  12. A systematic literature review of automated clinical coding and classification systems

    PubMed Central

    Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome. PMID:20962126

  13. [ENT and head and neck surgery in the German DRG system 2007].

    PubMed

    Franz, D; Roeder, N; Hörmann, K; Alberty, J

    2007-07-01

    The German DRG system has been further developed into version 2007. For ENT and head and neck surgery, significant changes in the coding of diagnoses and medical operations as well as in the DRG structure have been made. New ICD codes for sleep apnoea and acquired tracheal stenosis have been implemented. Surgery on the acoustic meatus, removal of auricle hyaline cartilage for transplantation (e. g. rhinosurgery) and tonsillotomy have been given codes in the 2007 version. In addition, the DRG structure has been improved. Case allocation of more than one significant operation has been established. The G-DRG system has gained in complexity. High demands are made on the coding of complex cases, whereas standard cases mostly require only one specific diagnosis and one specific OPS code. The quality of case allocation for ENT patients within the G-DRG system has been improved. Nevertheless, further adjustments of the G-DRG system are necessary.

  14. Computer code for analyzing the performance of aquifer thermal energy storage systems

    NASA Astrophysics Data System (ADS)

    Vail, L. W.; Kincaid, C. T.; Kannberg, L. D.

    1985-05-01

    A code called Aquifer Thermal Energy Storage System Simulator (ATESSS) has been developed to analyze the operational performance of ATES systems. The ATESSS code provides an ability to examine the interrelationships among design specifications, general operational strategies, and unpredictable variations in the demand for energy. The user of the code can vary the well field layout, heat exchanger size, and pumping/injection schedule. Unpredictable aspects of supply and demand may also be examined through the use of a stochastic model of selected system parameters. While employing a relatively simple model of the aquifer, the ATESSS code plays an important role in the design and operation of ATES facilities by augmenting experience provided by the relatively few field experiments and demonstration projects. ATESSS has been used to characterize the effect of different pumping/injection schedules on a hypothetical ATES system and to estimate the recovery at the St. Paul, Minnesota, field experiment.

  15. A systematic literature review of automated clinical coding and classification systems.

    PubMed

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  16. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user-friendly interaction, context-sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first three codes to be completed, which are presently being incorporated into the KBS, are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  17. Interface requirements for coupling a containment code to reactor system thermal hydraulic codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baratta, A.J.

    1997-07-01

    To perform a complete analysis of a reactor transient, not only the primary system response but also the containment response must be accounted for. Such transients and accidents as a loss of coolant accident in both pressurized water and boiling water reactors and inadvertent operation of safety relief valves all challenge the containment and may influence flows because of containment feedback. More recently, the advanced reactor designs put forth by General Electric and Westinghouse in the US and by Framatome and Siemens in Europe rely on the containment to act as the ultimate heat sink. Techniques used by analysts and engineers to analyze the interaction of the containment and the primary system were usually iterative in nature. Codes such as RELAP or RETRAN were used to analyze the primary system response and CONTAIN or CONTEMPT the containment response. The analysis was performed by first running the system code and representing the containment as a fixed pressure boundary condition. The flows were usually from the primary system to the containment initially and generally under choked conditions. Once the mass flows and timing were determined from the system codes, these conditions were input into the containment code. The resulting pressures and temperatures were then calculated and the containment performance analyzed. The disadvantage of this approach becomes evident when one performs an analysis of a rapid depressurization or a long term accident sequence in which feedback from the containment can occur. For example, in a BWR main steam line break transient, the containment heats up and becomes a source of energy for the primary system. Recent advances in programming and computer technology are available to provide an alternative approach. The author and other researchers have developed linkage codes capable of transferring data between codes at each time step, allowing discrete codes to be coupled together.
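
    A schematic sketch of the coupled, per-time-step exchange the abstract describes (as opposed to the one-way iterative approach): at every step the primary-system model hands the containment model a break flow, and the containment model hands back a pressure that feeds the next step's flow calculation. The lumped models and constants below are placeholders, not RELAP/CONTAIN physics.

        # Lumped-parameter stand-ins for a primary-system code and a containment code.
        def break_flow(primary_pressure, containment_pressure, k=0.003):
            # Flow through the break, driven by the pressure difference (kg/s, illustrative).
            return max(k * (primary_pressure - containment_pressure), 0.0)

        def primary_step(pressure, flow, dt, depressurisation=0.08):
            return pressure - depressurisation * flow * dt

        def containment_step(pressure, flow, dt, pressurisation=0.002):
            return pressure + pressurisation * flow * dt

        p_primary, p_containment, dt = 7000.0, 101.0, 0.1   # kPa, kPa, s
        for step in range(5):
            flow = break_flow(p_primary, p_containment)      # data exchanged each time step
            p_primary = primary_step(p_primary, flow, dt)
            p_containment = containment_step(p_containment, flow, dt)
            print(f"t={step * dt:4.1f}s  flow={flow:7.2f} kg/s  "
                  f"primary={p_primary:8.1f} kPa  containment={p_containment:7.2f} kPa")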

  18. A new coding system for metabolic disorders demonstrates gaps in the international disease classifications ICD-10 and SNOMED-CT, which can be barriers to genotype-phenotype data sharing.

    PubMed

    Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke

    2013-07-01

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.

  19. The "Wow! signal" of the terrestrial genetic code

    NASA Astrophysics Data System (ADS)

    shCherbak, Vladimir I.; Makukov, Maxim A.

    2013-05-01

    It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological media. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, but stronger in noise immunity is the genetic code. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such a scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10^-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantical symmetries. Besides, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin. Plausible ways of embedding the signal into the code and possible interpretation of its content are discussed. Overall, while the code is nearly optimized biologically, its limited capacity is used extremely efficiently to pass non-biological information.

  20. Weather data dissemination to aircraft

    NASA Technical Reports Server (NTRS)

    Mcfarland, Richard H.; Parker, Craig B.

    1990-01-01

    Documentation exists that shows weather to be responsible for approximately 40 percent of all general aviation accidents with fatalities. Weather data products available on the ground are becoming more sophisticated and greater in number. Although many of these data are critical to aircraft safety, they currently must be transmitted verbally to the aircraft. This process is labor intensive and provides a low rate of information transfer. Consequently, the pilot is often forced to make life-critical decisions based on incomplete and outdated information. Automated transmission of weather data from the ground to the aircraft can provide the aircrew with accurate data in near-real time. The current National Airspace System Plan calls for such an uplink capability to be provided by the Mode S Beacon System data link. Although this system has a very advanced data link capability, it will not be capable of providing adequate weather data to all airspace users in its planned configuration. This paper delineates some of the important weather data uplink system requirements, and describes a system which is capable of meeting these requirements. The proposed system utilizes a run-length coding technique for image data compression and a hybrid phase and amplitude modulation technique for the transmission of both voice and weather data on existing aeronautical Very High Frequency (VHF) voice communication channels.
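
    A minimal sketch of the run-length coding idea mentioned above for weather-image uplink: rows of a quantised image (for example precipitation intensity levels) compress well when they contain long runs of identical values. This is a generic RLE routine, not the paper's uplink format.

        def rle_encode(values):
            # Encode a sequence as (value, run_length) pairs.
            runs = []
            for v in values:
                if runs and runs[-1][0] == v:
                    runs[-1][1] += 1
                else:
                    runs.append([v, 1])
            return [tuple(r) for r in runs]

        def rle_decode(runs):
            out = []
            for value, length in runs:
                out.extend([value] * length)
            return out

        # One scan line of a quantised radar image: 0 = clear, 2 = light rain, 5 = heavy cell.
        row = [0] * 40 + [2] * 12 + [5] * 3 + [2] * 6 + [0] * 19
        encoded = rle_encode(row)
        assert rle_decode(encoded) == row
        print(f"{len(row)} samples -> {len(encoded)} runs: {encoded}")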
